Inferring causal relations from experimental observations is of primary importance in science. Instrumental tests provide an essential tool for that aim, as they allow one to estimate causal dependencies even in the presence of unobserved common causes. In view of Bell's theorem, which implies that quantum mechanics is incompatible with our most basic notions of causality, it is of utmost importance to understand whether and how paradigmatic causal tools obtained in a classical setting can be carried over to the quantum realm. Here we show that quantum effects imply radically different predictions in the instrumental scenario. Among other results, we show that an instrumental test can be violated by entangled quantum states. Furthermore, we demonstrate such a violation using a photonic setup with active feed-forward of information, thus providing an experimental proof of this new form of non-classical behavior. Our findings have fundamental implications for causal inference and may also lead to new applications of quantum technologies.

Instrumental variables were originally invented to estimate parameters in econometric models of supply and demand [1] and have since found a wide range of applications in various other fields [2, 3]. Remarkably, an instrument allows one to estimate the strength of causal influences between two variables solely from observed data [4, 5], without any assumptions on the functional dependence among them. This is the approach known in quantum information science as "device-independent" [6]. For that, an instrumental test is crucial, since it provides empirically testable inequalities allowing one to check whether one has a valid instrument [7].

Instrumental inequalities, as well as the estimation of causal dependencies, are derived from classical notions of cause and effect that, since Bell's theorem [8], we know cannot be taken for granted in quantum phenomena.
Given this mismatch between classical and quantum predictions, it is natural to ask how fundamental tools in causal inference behave in a quantum scenario. This has motivated the emerging field of quantum causal modeling [9][10][11][12][13][14][15][16][17], which has provided sophisticated generalizations of the classical theory of causality [5] to the quantum realm, thereby discovering, for example, exciting quantum advantages for causal inference [18][19][20]. Within this new framework, it was shown [13] that a paradigmatic class of instrumental inequalities [7] are satisfied by quantum mechanics. However, it is not known whether other instrumental inequalities may admit quantum violations. Moreover, even if a given set of observed statistics is compatible with a classical instrumental causal model, it may well still be the case that quantum effects do offer some sort of enhancement.

In this article, we show that the quantum predictions for the instrumental scenario are radically different from those of classical causality theory. Firstly, we show that a standard measure of causation, the average causal effect (ACE) [4, 5, 21], can be largely...
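As a concrete illustration of the kind of test involved: Pearl's classic instrumental inequality, for binary treatment A, outcome B and instrument X, requires max_a Σ_b max_x P(a, b | x) ≤ 1 for any classical instrumental model. The following minimal sketch checks this condition; the distributions and variable names are illustrative, not data from the experiment described above.

```python
def pearl_instrumental_value(P, a_vals=(0, 1), b_vals=(0, 1), x_vals=(0, 1)):
    """Compute max_a sum_b max_x P(a, b | x).

    P maps (a, b, x) -> P(a, b | x); missing entries are treated as 0.
    A value > 1 means the observed data are incompatible with a classical
    instrumental causal model, i.e. X is not a valid instrument.
    """
    return max(
        sum(max(P.get((a, b, x), 0.0) for x in x_vals) for b in b_vals)
        for a in a_vals
    )

# A response independent of the instrument: satisfies the inequality.
compatible = {(0, 0, 0): 1.0, (0, 0, 1): 1.0}
# B tracks X perfectly while A stays fixed: violates the inequality.
violating = {(0, 0, 0): 1.0, (0, 1, 1): 1.0}

print(pearl_instrumental_value(compatible))  # 1.0 -> satisfied
print(pearl_instrumental_value(violating))   # 2.0 -> violated
```

Note the check uses only the observed conditional distribution P(a, b | x), in keeping with the device-independent spirit described above.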
The difficulty of validating large-scale quantum devices, such as Boson Samplers, poses a major challenge for any research program that aims to show quantum advantages over classical hardware. To address this problem, we propose a novel data-driven approach wherein models are trained to identify common pathologies using unsupervised machine learning methods. We illustrate this idea by training a classifier that exploits K-means clustering to distinguish Boson Samplers that use indistinguishable photons from those that do not. We train the model on numerical simulations of small-scale Boson Samplers and then validate the pattern recognition technique on larger numerical simulations as well as on photonic chips in both traditional Boson Sampling and scattershot experiments. The effectiveness of this method relies on particle-type-dependent internal correlations present in the output distributions. This approach performs substantially better on the test data than previous methods and demonstrates an ability to generalize beyond the scope of the examples it was trained on.

Introduction - There has been a flurry of interest in quantum science and technology in recent years, focused on the transformative potential that quantum computers have for cryptographic tasks [1], machine learning [2, 3] and quantum simulation [4, 5]. While existing quantum computers fall short of challenging their classical brethren for these tasks, a different goal has emerged that existing quantum devices could address: namely, testing the Church-Turing thesis. The (extended) Church-Turing thesis is a widely held belief that asserts that every physically reasonable model of computing can be efficiently simulated using a probabilistic Turing machine. This statement is, of course, controversial since, if it were true, then quantum computing would never be able to provide exponential advantages over classical computing.
Consequently, providing evidence that the extended Church-Turing thesis is wrong is more philosophically important than the ultimate goal of building a quantum computer.

Various schemes have been proposed in the last few years [6-11] that promise to be able to provide evidence of quantum computational supremacy, namely the regime where a quantum device starts outperforming its classical counterpart in a specific task. A significant step in this direction has been achieved in particular by Aaronson and Arkhipov [6] with the formal definition of a dedicated task known as Boson Sampling. This is a computational problem that consists of sampling from the output distribution of N indistinguishable bosons evolved through a linear unitary transformation. This problem has been shown to be classically intractable (even approximately) under mild complexity-theoretic assumptions. Indeed, the existence of an efficient classical algorithm to perform Boson Sampling would imply the collapse of the polynomial hierarchy to the third level [6]. Such a collapse is viewed amon...
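The clustering step described in the abstract above can be sketched in a few lines. The feature vectors below are purely illustrative stand-ins for the particle-type-dependent correlations extracted from Boson Sampler outputs, and the routine is a plain 2-means, not the trained model from the experiment.

```python
import random

def squared_dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans2(points, iters=50):
    """Plain 2-means clustering on small feature vectors.

    Deterministic initialisation (first and last point) keeps the sketch
    reproducible; a real pipeline would use k-means++ or random restarts.
    """
    centroids = [points[0], points[-1]]
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [
            0 if squared_dist(p, centroids[0]) <= squared_dist(p, centroids[1]) else 1
            for p in points
        ]
        for k in (0, 1):
            members = [p for p, lab in zip(points, labels) if lab == k]
            if members:
                centroids[k] = tuple(sum(c) / len(members) for c in zip(*members))
    return labels

# Toy correlation features: one cluster mimicking indistinguishable photons,
# one mimicking distinguishable photons (the numbers are made up).
rng = random.Random(1)
indist = [(0.6 + rng.gauss(0, 0.03), 0.4 + rng.gauss(0, 0.03)) for _ in range(10)]
dist = [(0.1 + rng.gauss(0, 0.03), 0.1 + rng.gauss(0, 0.03)) for _ in range(10)]
labels = kmeans2(indist + dist)

# The two particle types land in different clusters.
print(set(labels[:10]), set(labels[10:]))
```

Because the method is unsupervised, the cluster labels carry no physical meaning by themselves; identifying which cluster corresponds to indistinguishable photons requires the small set of simulated training examples mentioned above.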
The launch of a satellite capable of distributing entanglement over long distances and the first loophole-free violation of Bell inequalities are milestones indicating a clear path for the establishment of quantum networks. However, nonlocality in networks with independent entanglement sources has only been experimentally verified in simple tripartite networks, via the violation of bilocality inequalities. Here, by using a scalable photonic platform, we implement star-shaped quantum networks consisting of up to five distant nodes and four independent entanglement sources. We exploit this platform to violate the chained n-locality inequality and thus witness, in a device-independent way, the emergence of nonlocal correlations among the nodes of the implemented networks. These results open new perspectives for quantum information processing applications in the relevant regime where the observed correlations are compatible with standard local hidden variable models but are nonclassical if the independence of the sources is taken into account.
The number of parameters describing a quantum state is well known to grow exponentially with the number of particles. This scaling limits our ability to characterize and simulate the evolution of arbitrary states to systems with no more than a few qubits. However, from a computational learning theory perspective, it can be shown that quantum states can be approximately learned using a number of measurements growing linearly with the number of qubits. Here, we experimentally demonstrate this linear scaling in optical systems with up to 6 qubits. Our results highlight the power of computational learning theory to investigate quantum information, provide the first experimental demonstration that quantum states can be "probably approximately learned" with access to a number of copies of the state that scales linearly with the number of qubits, and pave the way to probing quantum states at new, larger scales.
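The scaling gap this abstract refers to can be made concrete: full tomography of an n-qubit density matrix must pin down 4^n − 1 real parameters, whereas the learning-theoretic guarantee needs only a number of measurements linear in n. A minimal sketch follows; the constant in the linear bound is an arbitrary placeholder, not the constant from the actual theorem.

```python
def tomography_parameters(n):
    """Real parameters of an n-qubit density matrix: 4**n - 1."""
    return 4 ** n - 1

def pac_measurements(n, c=10):
    """Illustrative linear sample complexity c * n.

    The constant c (and the dependence on accuracy parameters, suppressed
    here) is a placeholder, not the proven bound.
    """
    return c * n

for n in range(1, 7):
    print(f"n={n}: tomography ~ {tomography_parameters(n)} parameters, "
          f"PAC learning ~ {pac_measurements(n)} measurements")
```

At n = 6, the largest system in the experiment, the contrast is already 4095 parameters against a few tens of measurements, and the exponential side of the gap only widens with n.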
Wave-particle duality has long been considered a fundamental signature of the non-classical behavior of quantum phenomena, especially in a delayed choice experiment (DCE), where the experimental setup revealing either the particle or wave nature of the system is decided after the system has entered the apparatus. However, as counter-intuitive as it might seem, usual DCEs do have a simple causal explanation. Here, we take a different route and, under a natural assumption about the dimensionality of the system under test, present an experimental proof of the non-classicality of a DCE based on the violation of a dimension witness inequality. Our conclusion is reached in a device-independent and loophole-free manner, that is, based solely on the observed data and without the need for any assumptions about the measurement apparatus.