We propose a novel dynamical method for beating decoherence and dissipation in open quantum systems. We demonstrate the possibility of filtering out the effects of unwanted (not necessarily known) system-environment interactions and show that the noise-suppression procedure can be combined with the capability of retaining control over the effective dynamical evolution of the open quantum system. Implications for quantum information processing are discussed.
Comment: 4 pages, no figures; plain ReVTeX. Final version to appear in Physical Review Letters.
Quantum error correction will be necessary for preserving coherent states against noise and other unwanted interactions in quantum computation and communication. We develop a general theory of quantum error correction based on encoding states into larger Hilbert spaces subject to known interactions. We obtain necessary and sufficient conditions for the perfect recovery of an encoded state after its degradation by an interaction. The conditions depend only on the behavior of the logical states. We use them to give a recovery-operator-independent definition of error-correcting codes. We relate this definition to four others: the existence of a left inverse of the interaction, an explicit representation of the error syndrome using tensor products, perfect recovery of the completely entangled state, and an information-theoretic identity. Two notions of fidelity and error for imperfect recovery are introduced, one for pure and the other for entangled states. The latter is more appropriate when using codes in a quantum memory or in applications of quantum teleportation to communication. We show that the error for entangled states is bounded linearly by the error for pure states. A formal definition of independent interactions for qubits is given. This leads to lower bounds on the number of qubits required to correct e errors and a formal proof that the classical bounds on the probability of error of e-error-correcting codes apply to e-error-correcting quantum codes, provided that the interaction is dominated by an identity component.
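The necessary and sufficient recovery conditions mentioned in this abstract are commonly written as follows. (This is a standard formulation from the later literature; the projector P onto the code subspace and the error operators E_a are notation introduced here, not fixed by the abstract.)

```latex
% Perfect recovery of any state in the code subspace (projector P) after an
% interaction with error operators {E_a} is possible if and only if
P \, E_a^\dagger E_b \, P = \lambda_{ab} \, P
\quad \text{for all } a, b,
% where (\lambda_{ab}) is some Hermitian matrix of scalars.
% Equivalently, on orthonormal logical states |i_L\rangle, |j_L\rangle:
\langle i_L | E_a^\dagger E_b | j_L \rangle = \lambda_{ab} \, \delta_{ij}.
```

The second form makes explicit the abstract's claim that the conditions depend only on the behavior of the logical states.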
In standard quantum computation, the initial state is pure and the answer is determined by measuring some of the bits in the computational basis. What can be accomplished if the initial state is highly mixed and the answer is determined by measuring the expectation of σz on the first bit with bounded sensitivity? This is the situation in high-temperature ensemble quantum computation. We show that in this model it is possible to perform interesting physics simulations which have no known efficient classical algorithms, even though the model is less powerful than standard quantum computing in the presence of oracles.

Recent discoveries show that quantum computers can solve problems of practical interest much faster than known algorithms for classical computers [1,2]. This has led to widespread recognition of the potential benefits of quantum computation. Where does the apparent power of quantum computers come from? This power is frequently attributed to "quantum parallelism", interference phenomena derived from the superposition principle, and the ability to prepare and control pure states according to the Schrödinger equation. Real quantum computers are rarely in pure states and interact with their environments, which leads to non-unitary evolution. Furthermore, recent proposals and experiments using NMR at high temperature to study quantum computation involve manipulations of extremely mixed states. Recent research in error correction and fault-tolerant computation has shown that non-unitary evolution due to weak interactions with the environment results in no loss of computational power, provided that sufficiently pure states can be prepared [3][4][5][6]. Here we consider the situation where there are no errors or interactions with the environment, but the initial state is highly mixed.
We investigate the power of one bit of quantum information available for computing, by which we mean that the input state is equivalent to having one bit in a pure state and arbitrarily many additional bits in a completely random state. The model of computation which consists of a classical computer with access to a state of this form is called deterministic quantum computation with one quantum bit (DQC1). We demonstrate that in the presence of oracles, such a computer is less powerful than one with access to pure state bits. However, it can solve problems related to physics simulations for which no efficient classical algorithms are known. DQC1 is the first non-trivial entry in the class of models of computation that lie between classical computation and standard quantum computation. Investigations of such models are expected to lead to a better understanding of the reasons for the power of quantum computation.

There are many kinds of problems that one might like to solve using a computational device. The three main problems not involving communication are function evaluation, non-deterministic function evaluation and distribution sampling. Let S be the set of all binary strings and S_n the set of binary strings of...
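A task often used to illustrate the power of one clean qubit is estimating the normalized trace of a unitary U: with the control qubit in |0⟩, the rest maximally mixed, a Hadamard on the control, and a controlled-U, the control's ⟨σx⟩ and ⟨σy⟩ expectations read out Re tr(U)/2ⁿ and Im tr(U)/2ⁿ. The density-matrix simulation below is an illustrative sketch of that circuit (the choice of a Haar-ish random U via QR is an assumption for demonstration, not anything from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 3                 # qubits in the completely random (maximally mixed) register
dim = 2 ** n

# Hypothetical target unitary whose normalized trace we estimate.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(A)  # QR of a random complex matrix yields a unitary Q

# Input state of the DQC1 model: one pure control qubit, n maximally mixed qubits.
ctrl0 = np.array([[1, 0], [0, 0]], dtype=complex)
rho = np.kron(ctrl0, np.eye(dim) / dim)

# Circuit: Hadamard on the control, then controlled-U.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Hfull = np.kron(H, np.eye(dim))
CU = np.block([[np.eye(dim), np.zeros((dim, dim))],
               [np.zeros((dim, dim)), U]])
rho = CU @ Hfull @ rho @ Hfull.conj().T @ CU.conj().T

# Pauli expectations on the control qubit.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
exp_x = np.real(np.trace(np.kron(X, np.eye(dim)) @ rho))
exp_y = np.real(np.trace(np.kron(Y, np.eye(dim)) @ rho))

# They equal the real and imaginary parts of the normalized trace of U.
print(exp_x, np.real(np.trace(U)) / dim)   # these two agree
print(exp_y, np.imag(np.trace(U)) / dim)   # and so do these
```

No known efficient classical algorithm computes the normalized trace of an exponentially large circuit-specified unitary, which is the sense in which this model solves simulation-related problems beyond known classical reach.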
We present a loophole-free violation of local realism using entangled photon pairs. We ensure that all relevant events in our Bell test are spacelike separated by placing the parties far enough apart and by using fast random number generators and high-speed polarization measurements. A high-quality polarization-entangled source of photons, combined with high-efficiency, low-noise, single-photon detectors, allows us to make measurements without requiring any fair-sampling assumptions. Using a hypothesis test, we compute p-values as small as 5.9 × 10⁻⁹ for our Bell violation while maintaining the spacelike separation of our events. We estimate the degree to which a local realistic system could predict our measurement choices. Accounting for this predictability, our smallest adjusted p-value is 2.3 × 10⁻⁷. We therefore reject the hypothesis that local realism governs our experiment.
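The hypothesis-test framing can be sketched with a toy model. In a CHSH-style game, any local realistic strategy wins a single trial with probability at most 3/4, while quantum strategies reach about 0.85; the p-value of an observed win count is then a binomial tail probability. This is a simplified illustration only, with made-up trial numbers; the experiment's actual statistical analysis is considerably more careful (it must handle memory effects and predictability of settings, which a plain binomial model ignores).

```python
import math

def local_realism_p_value(n: int, k: int, p0: float = 0.75) -> float:
    """Exact binomial tail P[X >= k] for n trials, each winning with
    probability at most p0 = 3/4 under local realism (CHSH game bound)."""
    return sum(math.comb(n, i) * p0**i * (1 - p0)**(n - i)
               for i in range(k, n + 1))

# Hypothetical data for illustration: 1000 trials, 820 wins
# (a quantum strategy wins each trial with probability ~0.853).
p = local_realism_p_value(1000, 820)
print(f"p-value under local realism: {p:.3e}")
```

The smaller the p-value, the more strongly the data reject the local realistic hypothesis, which is the logic behind the quoted 5.9 × 10⁻⁹.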
A key requirement for scalable quantum computing is that elementary quantum gates can be implemented with sufficiently low error. One method for determining the error behavior of a gate implementation is to perform process tomography. However, standard process tomography is limited by errors in state preparation, measurement and one-qubit gates. It suffers from inefficient scaling with the number of qubits and does not detect adverse error-compounding when gates are composed in long sequences. An additional problem is that desirable error probabilities for scalable quantum computing are of the order of 0.0001 or lower, and experimentally demonstrating such low errors is challenging. We describe a randomized benchmarking method that yields estimates of the computationally relevant errors without relying on accurate state preparation and measurement. Since it involves long sequences of randomly chosen gates, it also verifies that error behavior is stable when used in long computations. We implemented randomized benchmarking on trapped atomic ion qubits, establishing a one-qubit error probability per randomized π/2 pulse of 0.00482(17) in a particular experiment. We expect this error probability to be readily improved with straightforward technical modifications.
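The core of the analysis in randomized benchmarking is fitting an exponential decay of sequence fidelity versus sequence length, F(m) = A·pᵐ + B, where state-preparation and measurement errors are absorbed into A and B and only the decay rate p carries the gate error (for one qubit, error per gate r = (1 − p)/2). The sketch below fits simulated decay data with made-up numbers, not the paper's; the fixed offsets A = B = 0.5 are an assumption for this illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground truth for the simulation: ~0.005 error per gate.
true_p = 1 - 2 * 0.005
A, B = 0.5, 0.5   # SPAM-dependent constants; assumed ideal here

# Simulated average survival probabilities at several sequence lengths,
# with small statistical noise (as if averaged over random sequences).
lengths = np.array([2, 4, 8, 16, 32, 64, 128, 256])
fidelity = A * true_p**lengths + B + rng.normal(0, 1e-3, size=lengths.size)

# Fit log(F - B) = log(A) + m*log(p) by linear least squares.
slope, intercept = np.polyfit(lengths, np.log(fidelity - B), 1)
p_est = np.exp(slope)
error_per_gate = (1 - p_est) / 2
print(f"estimated error per gate: {error_per_gate:.4f}")  # close to 0.005
```

Because SPAM errors only rescale and offset the curve, the fitted decay rate isolates the per-gate error, which is exactly the property that lets the method certify errors far below what state preparation and measurement fidelities would otherwise permit.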