2011
DOI: 10.1103/physreva.84.034101
Effect of fluctuation measures on the uncertainty relations between two observables: Different measures lead to opposite conclusions

Abstract: We show, within a very simple framework, that different measures of fluctuation lead to uncertainty relations with contradictory conclusions. More specifically, we focus on Tsallis and Rényi entropic uncertainty relations and find that the minimum joint-uncertainty states for some fluctuation measures are the maximum joint-uncertainty states for other fluctuation measures, and vice versa.

Cited by 24 publications (29 citation statements)
References 41 publications
“…(2) does not depend on the state to be measured. Recently, several works improving the lower bound have been presented by authors such as Uffink [17], Coles and Piani [18], and Rudnicki [19]; further works based on different entropies, including smooth entropy [20], K-entropy [21], Rényi entropy [22,23,24,25,26,27], collision entropy [28,29], and Tsallis entropy [30,31,32,33,34,35], have been presented, and works treating different measurement settings have also been provided for the entropic uncertainty relations [36,37,38]. It is worth noting that some interesting results were presented for the uncertainty relation in two-dimensional Hilbert space [39,40,41,42].…”
Section: Introduction
confidence: 99%
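For reference, the entropy families named in this passage follow the usual single-parameter definitions; the sketch below gives the standard forms (not necessarily the exact conventions used by the cited works):

```latex
% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1)
H_\alpha(p) = \frac{1}{1-\alpha}\,\ln \sum_i p_i^\alpha
% Tsallis entropy of index q (q \neq 1)
S_q(p) = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^q\Bigr)
% Both recover the Shannon entropy -\sum_i p_i \ln p_i in the limit \alpha, q \to 1;
% the collision entropy is the \alpha = 2 Rényi case.
```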
“…As an example, there exist variables with infinite variance [4], so the second-order moment is not always a convenient description of the dispersion of a random variable. Moreover, in the case of discrete-spectrum observables there is no universal nontrivial lower bound, and thus Heisenberg-like inequalities do not quantify the UP [5,6,7]. To overcome this potential inadequacy of the variance-based expression of the UP, many formulations based on other measures of dispersion have been proposed, for instance those issued from information theory [8,9,10].…”
Section: Introduction
confidence: 99%
“…[11] is a quite interesting formulation particularly suited to phase-angle variables. We also show that this formulation encounters fundamental ambiguities when contrasting slightly different alternative implementations, as also holds for other approaches [12][13][14].…”
Section: Introduction
confidence: 58%
“…Turning our attention to the alternative product of characteristic functions in Eq. (12), we get that the minimum uncertainty states are those pure states with s_y = 0 and |s_x| = |s_z| = 1/√2. On the other hand, the states with s_y = 0 and |s_x| = 0 or |s_z| = 0 are of maximum uncertainty, contrary to the predictions of the sum relations (10) and (13).…”
Section: An Example: Qubit
confidence: 86%
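The kind of reversal described in this qubit example can be illustrated numerically. The sketch below does not reproduce the paper's Eq. (12); it only uses the standard fact that measuring a Pauli observable on a qubit with Bloch component s along that axis yields outcome probabilities (1 ± s)/2, and compares the summed entropies of the σ_x and σ_z outcome distributions for a σ_x eigenstate versus the balanced state s_x = s_z = 1/√2. The ordering of the two states reverses between the Shannon measure (Rényi α → 1) and the min-entropy measure (α → ∞), in the spirit of the paper's claim that different fluctuation measures lead to opposite conclusions.

```python
import math

def probs(s):
    # Outcome probabilities (1 ± s)/2 for a Pauli measurement
    # along an axis with Bloch component s.
    return [(1 + s) / 2, (1 - s) / 2]

def shannon(p):
    # Shannon entropy (Rényi order alpha -> 1), natural log.
    return -sum(q * math.log(q) for q in p if q > 0)

def min_entropy(p):
    # Min-entropy (Rényi order alpha -> infinity).
    return -math.log(max(p))

def joint(entropy, sx, sz):
    # Sum of entropies of the sigma_x and sigma_z outcome distributions.
    return entropy(probs(sx)) + entropy(probs(sz))

b = 1 / math.sqrt(2)
# Shannon sum: the sigma_x eigenstate (sx=1, sz=0) is LESS uncertain
print(joint(shannon, 1.0, 0.0), joint(shannon, b, b))       # ~0.693 vs ~0.833
# Min-entropy sum: the very same eigenstate is MORE uncertain
print(joint(min_entropy, 1.0, 0.0), joint(min_entropy, b, b))  # ~0.693 vs ~0.317
```

With Shannon entropy the eigenstate minimizes the joint uncertainty, while with min-entropy it maximizes it relative to the balanced state, so which state counts as "minimum uncertainty" depends on the chosen measure.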