2019
DOI: 10.1109/tpami.2019.2932976
Multivariate Extension of Matrix-based Renyi's α-order Entropy Functional

Cited by 64 publications (89 citation statements)
References 21 publications
“…Moreover, it is worth noting that, the proposed methodologies can be simply applied to other DNN architectures, e.g., CNN [16]. The only difference is that we need to use the multivariate extension of matrix-based Renyi's entropy functional [78] to quantify the information flow of CNN.…”
Section: Discussion
“…In this section, we start with a brief introduction to the recently proposed matrix-based Rényi's α-entropy functional and its multivariate extension [13]. The novel definition yields two simple stopping criteria, as presented below.…”
Section: Simple Stopping Criteria for Information Theoretic Feature Selection
“…Unfortunately, the need for accurate PDF estimation impedes its more widespread adoption in data-driven science. To solve this problem, References [13, 23] suggest similar quantities that resemble quantum Rényi's entropy [27] in terms of the normalized eigenspectrum of the Hermitian matrix of the projected data in RKHS, thus estimating the entropy and the joint entropy among two or more variables directly from data, without PDF estimation. For brevity, we directly give the definition.…”
Section: Simple Stopping Criteria for Information Theoretic Feature Selection
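The estimator described in the citation statement above can be sketched directly: a normalized Gram matrix of the data plays the role of a density operator, Rényi's α-order entropy is computed from its eigenspectrum, and the joint (multivariate) extension uses the normalized Hadamard product of the individual Gram matrices. A minimal NumPy sketch, where the Gaussian kernel width `sigma` and the eigenvalue cutoff `1e-12` are illustrative choices rather than values from the cited papers:

```python
import numpy as np

def normalized_gram(X, sigma=1.0):
    # Gaussian (RBF) Gram matrix, normalized so that trace(A) = 1,
    # making its eigenvalues a probability-like spectrum.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-d2 / (2.0 * sigma**2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=2.0):
    # Matrix-based Renyi alpha-order entropy from the eigenspectrum of A:
    # S_alpha(A) = log2(sum_i lambda_i^alpha) / (1 - alpha)
    lam = np.linalg.eigvalsh(A)
    lam = lam[lam > 1e-12]  # drop numerical zeros / tiny negatives
    return np.log2(np.sum(lam**alpha)) / (1.0 - alpha)

def joint_entropy(A, B, alpha=2.0):
    # Joint entropy of two variables via the trace-normalized
    # Hadamard (elementwise) product of their Gram matrices.
    AB = A * B
    return renyi_entropy(AB / np.trace(AB), alpha)
```

No density estimation is involved: the entropy is read off the eigenvalues of the kernel matrix, which is why the approach scales to the intermediate representations of deep networks mentioned in the first citation statement.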