2018
DOI: 10.1016/j.physa.2018.06.085

Information measure for financial time series: Quantifying short-term market heterogeneity

Abstract: A well-interpretable measure of information has recently been proposed, based on a partition obtained by intersecting a random sequence with its moving average. The partition yields disjoint sets of the sequence, which are then ranked according to their size to form a probability distribution function and finally fed into the expression of the Shannon entropy. In this work, such an entropy measure is implemented on the time series of prices and volatilities of six financial markets. The analysis has been performed, …
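The procedure described in the abstract lends itself to a compact sketch. The following is a minimal illustration, not the authors' implementation: it assumes a simple moving average of width `window`, splits the series at the points where it crosses that average, and feeds the relative segment sizes into the Shannon formula. All names are illustrative.

```python
import numpy as np

def partition_entropy(x, window):
    """Shannon entropy of the partition induced by moving-average crossings.

    Minimal sketch: the series is split into disjoint segments at the
    points where it crosses its moving average; the segment lengths,
    normalized to a probability distribution, are fed to the entropy.
    """
    x = np.asarray(x, dtype=float)
    # Simple moving average; the first `window - 1` points of the series
    # are dropped so the series and its average are aligned.
    ma = np.convolve(x, np.ones(window) / window, mode="valid")
    aligned = x[window - 1:]
    # Sign of the deviation from the moving average; crossings are the
    # indices where the sign changes.
    sign = np.sign(aligned - ma)
    change = np.nonzero(np.diff(sign))[0] + 1
    # Lengths of the disjoint segments between consecutive crossings.
    bounds = np.concatenate(([0], change, [len(aligned)]))
    lengths = np.diff(bounds)
    # Relative segment sizes form the probability distribution.
    p = lengths / lengths.sum()
    return -np.sum(p * np.log(p))

# Usage on a synthetic price series
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(size=1000)) + 100.0
print(partition_entropy(prices, window=20))
```

The moving-average window is the one free parameter of the sketch: it sets the time horizon over which the segment structure, and hence the short-term heterogeneity, is probed.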


Cited by 25 publications (23 citation statements)
References 50 publications
“…The convexity of the divergence measure J_n(A, B) is an additional attractive feature of the Shannon entropy H(G) as a measure of the diversity of a distribution. The application of measures of diversity is discussed in [19,20]. In this study, we consider the Jensen difference (Equation 3) arising from a generalized class of entropy functions, including the exponential entropy due to Pal and Pal [21], which is called the exponential J-divergence, and examine its convexity.…”
Section: Picture Fuzzy Divergence Measure
confidence: 99%
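Equation 3 of the citing paper is not reproduced in the snippet. For orientation only: the Jensen difference built from a generic entropy functional H, and the exponential entropy of Pal and Pal that the snippet refers to, are conventionally written as

$$
J(P, Q) \;=\; H\!\left(\frac{P+Q}{2}\right) \;-\; \frac{H(P) + H(Q)}{2},
\qquad
H_{\mathrm{exp}}(P) \;=\; \sum_{i} p_i \left(e^{\,1 - p_i} - 1\right),
$$

so the exponential J-divergence is the Jensen difference evaluated with H = H_exp; with the Shannon entropy in place of H_exp, J(P, Q) reduces to the familiar Jensen–Shannon divergence.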
“…The partitioning of a relevant sequence into disjoint sets was usually performed based on a uniform division of the block sizes. The entropy of the resulting partitions was then estimated for each of the block sizes accordingly [38,39].…”
Section: Feature Extraction Based on Shannon Entropy and Logarithmic …
confidence: 99%
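The uniform-block scheme this snippet describes can be made concrete with a short sketch (illustrative only, not the cited papers' code; it assumes a symbolic sequence cut into non-overlapping, equal-size blocks, with the entropy re-estimated for each block size):

```python
from collections import Counter
import numpy as np

def block_entropy(symbols, block_size):
    """Shannon entropy of non-overlapping blocks of a symbolic sequence.

    The sequence is divided into disjoint blocks of equal size and the
    empirical distribution of the observed blocks is plugged into the
    Shannon formula.
    """
    n = len(symbols) // block_size
    blocks = [tuple(symbols[i * block_size:(i + 1) * block_size]) for i in range(n)]
    counts = Counter(blocks)
    p = np.array(list(counts.values()), dtype=float) / n
    return -np.sum(p * np.log(p))

# Entropy as a function of block size for a binary random sequence
rng = np.random.default_rng(1)
steps = rng.integers(0, 2, size=4096)  # 0/1 symbols
for m in (1, 2, 4, 8):
    print(m, block_entropy(steps, m))
```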
“…The entropy of a random variable indicates the average amount of information required to describe the random variable [33], and it has been adopted in many studies [34,35]. The entropy of a discrete random variable X = (x_1, x_2, …)…”
Section: The Optimal Selection of Candidate Input Variables Using mRMR
confidence: 99%
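The snippet breaks off before the definition it introduces. For a discrete random variable taking values x_1, x_2, … with probabilities p(x_i), the standard Shannon entropy it is leading up to reads

$$
H(X) \;=\; -\sum_{i} p(x_i)\,\log p(x_i).
$$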