2019
DOI: 10.3390/e21080720
Empirical Estimation of Information Measures: A Literature Guide

Abstract: We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
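As a minimal illustration of what empirical entropy estimation involves in the simplest discrete setting, the sketch below computes the plug-in (maximum-likelihood) estimate from an i.i.d. sample, together with the classical Miller-Madow bias correction. The function names and the example alphabet are illustrative and are not drawn from the surveyed paper.

```python
# Minimal sketch (not from the surveyed paper): plug-in (maximum-likelihood)
# estimation of Shannon entropy from an i.i.d. sample over a finite alphabet,
# plus the classical Miller-Madow bias correction.
import numpy as np
from collections import Counter

def entropy_plugin(samples):
    """Plug-in estimate H_hat = -sum p_hat * log p_hat (in nats)."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p_hat = counts / counts.sum()
    return -np.sum(p_hat * np.log(p_hat))

def entropy_miller_madow(samples):
    """Plug-in estimate plus the Miller-Madow correction (K_hat - 1) / (2 n)."""
    n = len(samples)
    k_hat = len(set(samples))          # number of distinct symbols observed
    return entropy_plugin(samples) + (k_hat - 1) / (2 * n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 8, size=500)   # uniform over 8 symbols: true H = log 8
    print(entropy_plugin(x), entropy_miller_madow(x), np.log(8))
```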

Cited by 55 publications (47 citation statements), 2020–2025; References 111 publications.
“…To analyze the relationship between the prediction errors of classical forecasting methods such as ARIMA, we build a feature space [18] based on Shannon entropy (H) features that can presumably be used to identify the TS instances where ARIMA forecasting errors are expected to be higher or lower. These features are based on four entropy-based complexity measures: the frequentist binning approach [22], 2-Regimes entropy [19], and Permutation entropy [23], which build upon notions of symbolic dynamics, and Spectral entropy [24], which is based on the analysis of the spectrum of a time series.…”
Section: Methods
confidence: 99%
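For concreteness, here is a hedged sketch of one of the symbolic-dynamics features named in the excerpt above: Bandt-Pompe permutation entropy of a time series. The parameter choices (embedding dimension m, delay) and function names are illustrative and are not taken from the cited works [19, 22, 23, 24].

```python
# Hedged sketch of a symbolic-dynamics complexity feature: Bandt-Pompe
# permutation entropy of a time series. Names and defaults are illustrative.
import numpy as np
from math import factorial, log

def permutation_entropy(ts, m=3, delay=1, normalize=True):
    """Shannon entropy of the distribution of ordinal patterns of length m."""
    ts = np.asarray(ts, dtype=float)
    n_patterns = len(ts) - (m - 1) * delay
    if n_patterns <= 0:
        raise ValueError("time series too short for the chosen m and delay")
    counts = {}
    for i in range(n_patterns):
        window = ts[i : i + (m - 1) * delay + 1 : delay]
        pattern = tuple(np.argsort(window))      # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n_patterns
    h = -np.sum(probs * np.log(probs))
    return h / log(factorial(m)) if normalize else h

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    noise = rng.normal(size=1000)                # white noise: close to 1
    trend = np.arange(1000, dtype=float)         # monotone series: close to 0
    print(permutation_entropy(noise), permutation_entropy(trend))
```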
“…Multi-information as total dependence. Another important information measure that we will use in several ways is the multi-information for n variables, originally defined and called “total correlation” by Watanabe (1960) and discussed and used by many others (Ting, 1962; Han, 1980). It is defined as the difference between the sum of the entropies of each variable separately and the joint entropy of all the variables together:…”
Section: Genetic Dependence Relations
confidence: 99%
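The quantity described verbally in the excerpt is the standard multi-information (total correlation); in conventional notation (not quoted from the citing paper),
$$ C(X_1,\dots,X_n) \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1,\dots,X_n), $$
which is nonnegative and equals zero exactly when the variables are mutually independent.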
“…In general, joint entropy $H(X,Y)$ can be expressed in terms of conditional entropy $H(Y \mid X)$ as $H(X,Y) = H(X) + H(Y \mid X)$, where $H(Y \mid X) = -\sum_{x,y} p(x,y) \log p(y \mid x)$ and $p(y \mid x)$ denotes the conditional probability. When the probability distributions are not known, it is not possible to calculate the exact value of $H(X,Y)$, and it is necessary to compute an estimator from a sample [18, 19, 20].…”
Section: Preliminaries
confidence: 99%
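A minimal sketch of the estimation step this excerpt alludes to, assuming discrete paired samples and the plain plug-in estimator; the cited works [18, 19, 20] discuss more refined estimators, and the names below are illustrative.

```python
# Hedged sketch: plug-in estimates of joint entropy H(X, Y) and conditional
# entropy H(Y | X) = H(X, Y) - H(X) from paired discrete samples (in nats).
import numpy as np
from collections import Counter

def joint_entropy_plugin(x, y):
    """Plug-in estimate of H(X, Y) from paired discrete samples."""
    counts = np.array(list(Counter(zip(x, y)).values()), dtype=float)
    p_hat = counts / counts.sum()
    return -np.sum(p_hat * np.log(p_hat))

def conditional_entropy_plugin(x, y):
    """H(Y | X) = H(X, Y) - H(X), using plug-in estimates throughout."""
    counts_x = np.array(list(Counter(x).values()), dtype=float)
    p_x = counts_x / counts_x.sum()
    h_x = -np.sum(p_x * np.log(p_x))
    return joint_entropy_plugin(x, y) - h_x

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x = rng.integers(0, 4, size=2000)
    y = (x + rng.integers(0, 2, size=2000)) % 4   # y depends weakly on x
    print(joint_entropy_plugin(x, y), conditional_entropy_plugin(x, y))
```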