2018
DOI: 10.3390/e20070498
Assessing Information Transmission in Data Transformations with the Channel Multivariate Entropy Triangle

Abstract: Data transformation, e.g., feature transformation and selection, is an integral part of any machine learning procedure. In this paper, we introduce an information-theoretic model and tools to assess the quality of data transformations in machine learning tasks. In an unsupervised fashion, we analyze the transformation of a discrete, multivariate source of information X into a discrete, multivariate sink of information Y related by a distribution P XY . The first contribution is a decomposition of the maximal p…
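The entropy-triangle idea referenced in the abstract can be illustrated with a small sketch. The paper works with a multivariate decomposition; the version below is a simplified bivariate analogue, assuming the usual balance equation in which the uniform-bound entropy splits into a divergence-from-uniformity term, twice the mutual information, and the variation of information, normalized to coordinates on a 2-simplex. The function name and symbol choices here are illustrative, not the paper's.

```python
import numpy as np

def entropy_triangle_coords(pxy):
    """Normalized 2-simplex coordinates (dH, 2I, VI) for a joint pmf P_XY.

    Bivariate sketch of the entropy-balance decomposition: the entropy of
    the uniform product distribution is split into
      dH  -- divergence of the marginals from uniformity,
      2I  -- twice the mutual information I(X;Y),
      VI  -- variation of information H(X,Y) - I(X;Y),
    each divided by the uniform bound so the three coordinates sum to 1.
    """
    pxy = np.asarray(pxy, dtype=float)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)          # marginal of X
    py = pxy.sum(axis=0)          # marginal of Y

    def H(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    hx, hy = H(px), H(py)
    hxy = H(pxy.ravel())
    mi = hx + hy - hxy            # mutual information I(X;Y)
    vi = hxy - mi                 # variation of information
    h_uniform = np.log2(len(px)) + np.log2(len(py))   # uniform bound
    dh = h_uniform - hx - hy      # divergence from uniformity
    return dh / h_uniform, 2 * mi / h_uniform, vi / h_uniform

# A noiseless identity channel puts all the mass on the mutual-information
# coordinate: (dH, 2I, VI) = (0, 1, 0).
coords = entropy_triangle_coords(np.eye(4) / 4)
```

For a noisy or lossy transformation the point moves away from the 2I vertex, which is what makes the triangle useful as a quality diagnostic for data transformations.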

Cited by 3 publications (2 citation statements)
References 28 publications
“…Abdallah and Plumbley also proposed an interesting non-negative decomposition, named the binding information (definition 23 [ 34 ]). To quantify and represent the transfer of information from a multivariate source to a multivariate sink of information, Valverde-Albacete and Peláez-Moreno defined a 2-simplex in the multivariate entropic space to represent the information balance of the multivariate transformation [ 35 , 36 ].…”
Section: Introduction
confidence: 99%
“…Bukovsky et al [ 6 ] discuss and further develop the recently introduced concept of learning entropy (LE) as a learning-based information measure, which is targeted at real-time novelty detection based on unusual learning efforts. For assessing the quality of data transformations in machine learning, Valverde-Albacete et al [ 7 ] introduce an information-theoretic tool. They analyze the performance of the tool for different types of data transformation, among them principal component analysis and independent component analysis.…”
confidence: 99%