Proceedings of the 10th International Conference on Operations Research and Enterprise Systems 2021
DOI: 10.5220/0010206801530160
The Impact of Information Geometry on the Analysis of the Stable M/G/1 Queue Manifold

Cited by 17 publications (34 citation statements)
References 0 publications
“…(iv) deepening the differential geometric study of the previous new metrics, and finding new geometric invariants for the modeling and the control of statistical phenomena. For example, the study of distance-related topics (see [35]), of the geodesics (see [36,37,38]), and of the various types of curvature tensor fields (see [10,39,40,41,42] for samples of the needed geometric tools).…”
Section: Discussionmentioning
confidence: 99%
“…Shannon entropy is not the best descriptor [8] of a time series' statistical variations, which has motivated the use of other information-theoretic notions, including Fisher information [9][10][11], differential entropy [12], Kullback-Leibler divergence (KLD) [13], and information length (IL) [14][15][16]. Time-dependent PDFs make it possible to trace the evolution of a time series and measure its variability [8], which is the basis of the IL metric's attractiveness.…”
Section: Information Length Theorymentioning
confidence: 99%
“…Considering equation (5), it is clear that ε(t) depends on the changes in both mean and variance defined by the corresponding dynamics of equation (4), portraying the variation of p(x, t) in the three-dimensional space (t, x, p(x, t)) as in Figure 3 (cf. [8]), where the information velocity √ε(t) varies along the path from the initial probability density function p(x, t₀) to the final state p(x, t_f), acting as a descriptor of the speed limit derived from the statistical deviations of the observables [16]. Essentially, with differential entropy [12] we might not be able to see the temporal statistical fluctuations that are occurring. This follows directly from its lack of locality: differential entropy primarily measures the difference between any two given PDFs while ignoring any intermediate states [17].…”
Section: Il As a Conceptmentioning
confidence: 99%
“…[2][3][4][5][6][7][8][9][10]. KLD [11][12][13][14][15][16] is a method used to compare two probability distributions. In probability and statistics, when we need to simplify complex distributions or approximate observed data, KL divergence helps us quantify the amount of information lost in choosing an approximation.…”
Section: Introductionmentioning
confidence: 99%
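The KLD comparison described in the last excerpt can be illustrated with a minimal sketch of the discrete definition D_KL(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ); the distributions and names below are my own illustrative choices, not data from the cited works.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D_KL(P || Q) in nats.

    Terms with p_i = 0 contribute zero by the usual convention;
    q is assumed strictly positive wherever p is positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Approximating a skewed distribution p by a uniform q loses information:
p = np.array([0.5, 0.3, 0.2])
q = np.array([1/3, 1/3, 1/3])
d_pq = kl_divergence(p, q)  # information lost using q in place of p
d_qp = kl_divergence(q, p)  # note: KLD is not symmetric
```

The asymmetry (d_pq ≠ d_qp) is why KLD is a divergence rather than a distance, in contrast with the geodesic distances discussed in the information-geometry literature cited above.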