2008 42nd Asilomar Conference on Signals, Systems and Computers 2008
DOI: 10.1109/acssc.2008.5074715
An overview of Renyi Entropy and some potential applications

Cited by 21 publications (12 citation statements)
References 19 publications
“…The system diagram is shown in Figure 6.4. Figure 6.5 shows the training images (22, 21, 19 images per class, respectively) to illustrate the difficulty of the task.…”
Section: Case Study: Automatic Target Recognition (ATR) with ITL
confidence: 99%
“…(11.4)), but for bandwidths in the range given by Silverman's rule the CSD contains different information, because it is in fact using a different similarity metric. This may have applications in spectral estimation and detection theory; however, we and others [22] have only just started to explore this relation.…”
Section: Definition: The Generalized Covariance Function Called The…
confidence: 99%
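The bandwidth choice this excerpt refers to can be made concrete with a small sketch: the generalized covariance in question (correntropy) evaluated with a Gaussian kernel whose width is set by Silverman's rule of thumb. This is an illustrative reconstruction under our own assumptions, not the cited authors' code; the function names and the synthetic data are ours.

```python
import numpy as np

def silverman_bandwidth(x):
    # Silverman's rule of thumb for a Gaussian kernel (1-D data)
    n = x.size
    return 1.06 * x.std(ddof=1) * n ** (-0.2)

def correntropy(x, y, sigma):
    # Generalized correlation: average Gaussian kernel of the sample
    # differences. Equals 1 when x == y and decays as the signals diverge.
    d = x - y
    return np.mean(np.exp(-d ** 2 / (2.0 * sigma ** 2)))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + 0.1 * rng.normal(size=500)   # slightly perturbed copy of x
v = correntropy(x, y, silverman_bandwidth(x))
```

Because the kernel width enters the exponent, shrinking sigma below Silverman's value makes the measure increasingly sensitive to small sample differences, which is the sense in which a different bandwidth encodes a different similarity metric.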
“…It is well known that entropy can be used to measure the average amount of information conveyed by a random variable, and it has been applied in image processing, coding theory, financial decision-making, and various other fields [4]. With this idea in mind, our work tries to find the relation between entropy and similarity, and then to build an alternative similarity measure based on Rényi's quadratic entropy for audio signal processing.…”
Section: Introduction
confidence: 98%
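As a rough illustration of the quantity this excerpt builds on, Rényi's quadratic entropy can be estimated directly from samples with a Parzen window, since the pairwise kernel sum (the "information potential") estimates the integral of the squared density. This sketch assumes a Gaussian kernel and 1-D data; the estimator is standard in the information-theoretic learning literature, but the code itself is our own illustration.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma):
    # Parzen estimate of H2(X) = -log integral p(x)^2 dx.
    # The information potential (1/N^2) sum_ij G(x_i - x_j; sigma*sqrt(2))
    # estimates integral p^2, because the convolution of two Gaussian
    # kernels of width sigma is a Gaussian of width sigma*sqrt(2).
    s = sigma * np.sqrt(2.0)
    d = x[:, None] - x[None, :]
    ip = np.mean(np.exp(-d ** 2 / (2.0 * s ** 2))) / (s * np.sqrt(2.0 * np.pi))
    return -np.log(ip)

rng = np.random.default_rng(1)
x = rng.normal(size=1000)   # N(0,1): true H2 = log(2*sqrt(pi)) ~ 1.266
h = renyi_quadratic_entropy(x, sigma=0.3)
```

The estimate carries a small upward bias from the kernel smoothing, which shrinks as sigma decreases and the sample size grows.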
“…For instance, the more delocalized the electron distribution, the larger the Shannon entropy (S) [3–14]. Moreover, the Rényi entropy of order α (R_α) [15–17] and the Tsallis entropy of order α (T_α) [18–21] are one-parameter generalizations of the Shannon entropy and offer extended descriptions. In addition, the Onicescu information energy of order α (E^O_α) [22,23] is related to the frequency moments of the electron density [24–26].…”
Section: Introduction
confidence: 99%
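The relationship among the three entropies named in this excerpt is easy to verify numerically for a discrete distribution: as α approaches 1, both the Rényi and Tsallis entropies recover the Shannon entropy. A minimal sketch (our own, not drawn from the cited work):

```python
import numpy as np

def shannon(p):
    # Shannon entropy S = -sum p log p (natural log)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    # Renyi entropy R_alpha = log(sum p^alpha) / (1 - alpha), alpha != 1
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, alpha):
    # Tsallis entropy T_alpha = (1 - sum p^alpha) / (alpha - 1), alpha != 1
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

p = np.array([0.5, 0.25, 0.125, 0.125])
```

Evaluating at α slightly above 1 shows both generalizations converging to the Shannon value; R_α is also non-increasing in α, so R_2(p) < S(p) < R_0.5(p).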