The superposition principle lies at the heart of many non-classical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource-theoretic framework for quantifying the superposition of a finite number of linearly independent states. This theory generalizes the resource theories of coherence. We determine the general structure of operations that do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace-decreasing operations can be completed for free, which, when specialised to the theory of coherence, resolves an outstanding open question and is used to address free probabilistic transformations between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.

Introduction. During the last decades, there has been increasing interest in quantum technologies, mainly because of the operational advantages of protocols or devices working in the quantum regime over those relying on classical physics. Early examples include entanglement-based quantum cryptography [1], quantum dense coding [2] and quantum teleportation [3], where entanglement is a resource that is consumed and manipulated. The detection, manipulation and quantification of entanglement were therefore investigated, leading to the resource theory of entanglement [4]. Typical quantum resource theories (QRTs) are built by imposing an additional restriction on the laws of quantum mechanics [5][6][7]. In the case of entanglement theory, this is the restriction to local operations and classical communication (LOCC).
From such a restriction, the two main ingredients of a QRT emerge: the free operations and the free states (LOCC and separable states, respectively, in the case of entanglement theory). All states that are not free contain the resource under investigation and are considered costly. Free operations must therefore map free states to free states, allowing the resource to be manipulated but not freely created. Once these main ingredients are defined, a resource theory investigates the manipulation, detection, quantification and usage of the resource.

In principle, not only entanglement but every property of quantum mechanics absent from classical physics could lead to an operational advantage [8,9]. This motivates the considerable interest in the rigorous quantification of nonclassicality [10][11][12][13][14][15]. The superposition principle underlies many non-classical properties of quantum mechanics, including entanglement and coherence. Recently, resource theories of coherence [11,16,17] and their role in fields as diverse as quantum computation [8,18,19], quantum phase discrimination [20] and quantum thermodynamics [21] have attracted considerable attention.
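The free states of the resource theory of coherence, the special case of the superposition framework in which the chosen linearly independent states form an orthonormal basis, are simply the density matrices diagonal in that fixed basis. As a minimal numerical sketch of this notion of a free state (function names and the tolerance are illustrative choices, not taken from the text):

```python
import numpy as np

def is_incoherent(rho, tol=1e-9):
    """Check whether a density matrix is diagonal in the fixed
    (computational) basis, i.e. a free state of the resource
    theory of coherence."""
    off_diag = rho - np.diag(np.diag(rho))
    return bool(np.max(np.abs(off_diag)) < tol)

# |0><0| is diagonal and hence free; |+><+| carries maximal coherence.
zero = np.array([[1.0, 0.0], [0.0, 0.0]])
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
```

A free operation of the theory must then map every state passing this check to another state passing it.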
The use of the von Neumann entropy in formulating the laws of thermodynamics has recently been challenged. It is associated with the average work, whereas the work guaranteed to be extracted in any single run of an experiment is in general the more interesting quantity. We show that an expression quantifying majorization determines the optimal guaranteed work. We argue that it, rather than the von Neumann entropy, should therefore be the central quantity of statistical mechanics. In the limit of many identical and independent subsystems (asymptotic i.i.d.), the von Neumann entropy expressions are recovered, but in the non-equilibrium regime the optimal guaranteed work can differ radically from the optimal average. Moreover, our measure of majorization governs which evolutions can be realized via thermal interactions, whereas the non-decrease of the von Neumann entropy is not sufficiently restrictive. Our results are inspired by single-shot information theory.

Statistical mechanics is a cornerstone of modern physics. Many of its basic paradigms and mathematical methods were set in an era when experimental abilities were much more limited and modern information theory was not yet developed. Accordingly, there is currently significant momentum in investigating the theory's foundations in the quantum and nano regimes, see e.g. al (2012), to mention but a few recent contributions. We here derive an alternative type of statistical mechanics from scratch.
Our approach is inspired by recent results in information theory (Renner and Wolf 2004, Renner 2005) and builds on (Dahlsten et al 2011, Rio et al 2011, Aberg 2012, Horodecki and Oppenheim 2013). We argue that this approach is both significantly more general than the standard theory and addresses questions more relevant to modern experiments.

It is more general in that we do not assume that the states of the systems of interest are thermal, but only that there is a heat bath which, when interacting with a system, gradually takes that system towards a thermal state. The system of interest is thus not necessarily in equilibrium; in fact, we allow for any probability distribution over energy levels. In particular, we do not assume that the system under consideration is large or that internal correlations are negligible. This makes the approach significantly more relevant to modern experiments, where small subsystems can be addressed individually and on time-scales faster than the thermalization time.

A key difference in which questions are addressed is that we do not focus on averages of distributions, as in standard statistical mechanics. Instead we ask, for any given single run of an experiment, which threshold values are guaranteed to be exceeded, or more generally guaranteed to be exceeded up to some probability ε, not necessarily small. This is referred to as the single-shot paradigm, as opposed to the average paradigm. The distinction is important when distributions of quantities have a significant spread around the average, as is often the case for small systems.
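The majorization relation invoked above has a simple computational form: a distribution p majorizes q precisely when every partial sum of p's probabilities, sorted in decreasing order, dominates the corresponding partial sum of q's. A minimal sketch of this check (the function name and test distributions are illustrative, not taken from the text):

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """Return True if distribution p majorizes q: each partial sum of
    p sorted in decreasing order dominates the corresponding partial
    sum of q. Assumes p and q are probability vectors of equal length."""
    p_sorted = np.sort(p)[::-1]
    q_sorted = np.sort(q)[::-1]
    return bool(np.all(np.cumsum(p_sorted) >= np.cumsum(q_sorted) - tol))

# A sharper (more ordered) distribution majorizes a flatter one,
# but not vice versa: majorization is only a partial order.
sharp = np.array([0.7, 0.2, 0.1])
flat = np.array([1/3, 1/3, 1/3])
```

Because majorization is a partial order, some pairs of distributions are incomparable, which is one reason single-shot quantities can behave so differently from a single scalar like the von Neumann entropy.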
To describe certain facets of non-classicality, it is necessary to quantify properties of operations rather than states. This is the case if one wants to quantify how well an operation detects non-classicality, which is a necessary prerequisite for its use in quantum technologies. To do so rigorously, we build resource theories on the level of operations, exploiting the concept of resource destroying maps. We discuss the two basic ingredients of these resource theories: the free operations and the free super-operations, which are sequential and parallel concatenations with free operations. This leads to defining properties of functionals that are well suited to quantify the resources of operations. We introduce these concepts using the example of coherence. In particular, we present two measures quantifying the ability of an operation to detect, i.e. to use, coherence, one of them with an operational interpretation, and provide methods to evaluate them. arXiv:1806.07332v2 [quant-ph]
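For coherence, the canonical resource destroying map is full dephasing, which projects any state onto its diagonal (free) part. A channel that cannot generate coherence from free inputs satisfies Δ(E(Δ(ρ))) = E(Δ(ρ)) for all ρ. The sketch below tests this condition numerically on random states; it is an illustration of the general idea under these assumptions, not the paper's formalism (function names are hypothetical):

```python
import numpy as np

def dephase(rho):
    """Full dephasing Delta: maps every state to its diagonal part,
    the resource destroying map of coherence."""
    return np.diag(np.diag(rho))

def is_nongenerating(channel, dim=2, samples=100, tol=1e-9, rng=None):
    """Numerically test Delta(E(Delta(rho))) == E(Delta(rho)) on random
    density matrices: a channel passing the test maps the sampled
    incoherent inputs to incoherent outputs."""
    rng = np.random.default_rng(rng)
    for _ in range(samples):
        a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
        rho = a @ a.conj().T
        rho /= np.trace(rho)
        free_out = channel(dephase(rho))
        if np.max(np.abs(dephase(free_out) - free_out)) > tol:
            return False
    return True
```

For example, a Hadamard rotation creates coherence from diagonal inputs and therefore fails the test, while dephasing itself trivially passes.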