2020
DOI: 10.1016/j.physa.2019.123016
Tsallis conditional mutual information in investigating long range correlation in symbol sequences

Cited by 4 publications (4 citation statements) · References 46 publications
“…Recently, the Tsallis mutual information came into the focus for studying long range correlations in symbol sequences [70]. It is related to the Tsallis α-entropy…”
Section: Tsallis α-Entropy and Related Mutual Information Functions (mentioning; confidence: 99%)
“…These mutual information concepts can be used to generate information theoretic features for sequence analysis: Rényi entropic profiles were considered for DNA classification problems based on chaos game representation [67,68]. Molecular descriptors based on the Rényi entropy were investigated in [69], whereas long range correlation using Tsallis mutual information was considered in [70]. However, to our best knowledge, MIF for these variants are not known so far.…”
Section: Introduction (mentioning; confidence: 99%)
“…In information theory, the mutual information (MI) is a measure of the mutual dependence between the two time series [35,36]. Specifically, MI quantifies the random dependency between two stochastic variables without making any hypothesis about the nature of the relationship [37].…”
Section: Mutual Information (mentioning; confidence: 99%)
“…More specifically, Melnik and Usatenko (2014), using an additive Markov chain approach, analyzed DNA molecules of different organisms, and they estimated the differential entropy for the biological classification of these organisms. Similarly, Papapetrou and Kugiumtzis (2014, 2020) studied DNA sequences, through the estimation of the Markov chain orders and Tsallis conditional mutual information. The results showed a different long memory structure in their DNA samples (coding and non-coding).…”
Section: Introduction (mentioning; confidence: 99%)