2021
DOI: 10.3390/e23111496

Entropy Measures for Data Analysis II: Theory, Algorithms and Applications

Abstract: Entropies and entropy-like quantities are playing an increasing role in modern non-linear data analysis [...]

Cited by 1 publication (1 citation statement). References 7 publications.
“…Relative entropy, also known as Kullback-Leibler divergence or KL divergence, is a concept used to measure the difference between two probability distributions. In the study of entropy, scholars such as Clausius, Boltzmann, Gibbs, and Shannon have provided their own interpretations of entropy, and these definitions of entropy are essentially consistent [41]. At present, various studies on entropy describe it as a measure of disorder [42][43][44], where the entropy is minimal when a sample possesses a specific characteristic and increases as the deviation from that characteristic grows.…”
Section: Feature Selection for Block Vectors
confidence: 99%
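To make the quoted notion of relative entropy concrete, here is a minimal sketch (not from the cited works) that computes the Kullback-Leibler divergence D_KL(P || Q) = sum_i p_i log(p_i / q_i) for two discrete distributions; the function name and example distributions are assumptions chosen purely for illustration.

import numpy as np

def kl_divergence(p, q):
    # Relative entropy (Kullback-Leibler divergence) D_KL(P || Q)
    # between two discrete probability distributions given as arrays.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize so both arrays sum to 1 (valid probability distributions).
    p /= p.sum()
    q /= q.sum()
    # Only terms with p_i > 0 contribute; assumes q_i > 0 wherever p_i > 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# D_KL is zero for identical distributions and grows as Q deviates from P,
# matching the "measure of difference between two distributions" reading above.
p = [0.5, 0.3, 0.2]
print(kl_divergence(p, p))                # 0.0
print(kl_divergence(p, [0.1, 0.1, 0.8]))  # > 0, distributions differ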