2023
DOI: 10.3390/s23042133
Comparison of Information Criteria for Detection of Useful Signals in Noisy Environments

Abstract: This paper considers the appearance of indications of useful acoustic signals in the signal/noise mixture. Various information characteristics (information entropy, Jensen–Shannon divergence, spectral information divergence and statistical complexity) are investigated in the context of solving this problem. Both time and frequency domains are studied for the calculation of information entropy. The effectiveness of statistical complexity is shown in comparison with other information metrics for different signal…
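The abstract compares information metrics computed over signal/noise mixtures. A minimal sketch of two of them, Shannon entropy of a normalized power spectrum and the Jensen–Shannon divergence between two spectra, is shown below; the signal parameters (tone frequency, amplitude, sample count) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (nats) of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    return -np.sum(p * np.log(p + eps))

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(0)
t = np.arange(4096) / 4096.0
noise = rng.normal(size=t.size)                      # noise only
mix = noise + 2.0 * np.sin(2 * np.pi * 200 * t)      # noise + useful tone

# Normalized power spectra serve as the probability distributions.
spec = lambda x: np.abs(np.fft.rfft(x)) ** 2
p_noise, p_mix = spec(noise), spec(mix)

# A tone concentrates spectral power into one bin, lowering entropy,
# and the spectra of the two records diverge.
print(shannon_entropy(p_noise) > shannon_entropy(p_mix))  # True
print(js_divergence(p_noise, p_mix) > 0)                  # True
```

The drop in spectral entropy when a useful component appears is the kind of indication the paper evaluates across its different metrics.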

Cited by 10 publications (1 citation statement) | References 25 publications
“…Information entropy is used as a measure of uncertainty, that is, the probability of discrete random events occurring. Briefly, the more chaotic things are, the higher the information entropy, and conversely, the lower the information entropy [29,30,31,32,33,34].…”
Section: Information Entropy and SSIM (mentioning)
confidence: 99%
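The quoted statement, that more chaotic (uniform) outcomes mean higher entropy and nearly deterministic outcomes mean lower entropy, can be checked with a small sketch; the two example distributions are illustrative assumptions.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

uniform = np.full(8, 1 / 8)              # maximally "chaotic": all outcomes equally likely
peaked = np.array([0.93] + [0.01] * 7)   # nearly deterministic: one dominant outcome

print(shannon_entropy(uniform))  # 3.0 bits, the maximum for 8 outcomes
print(shannon_entropy(peaked))   # about 0.56 bits, far lower
```

The uniform distribution attains the maximum log2(8) = 3 bits, confirming the cited intuition.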