The causal structure of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one, an advantage that increases with codeword length. While previously difficult to compute, we express the quantum advantage in closed form using spectral decomposition, leading to direct computation of the quantum communication cost at all encoding lengths, including infinite. This makes clear how finite-codeword compression is controlled by the classical process's cryptic order and allows us to analyze structure within the length-asymptotic regime of infinite-cryptic-order (and infinite-Markov-order) processes.
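For orientation, in the diagonalizable case the spectral decomposition invoked here takes the familiar form below; the general, nondiagonalizable case treated in this line of work requires additional contributions from generalized eigenvectors, which this sketch omits:

\[
T \;=\; \sum_{\lambda \in \Lambda_T} \lambda\, T_\lambda
\qquad\Longrightarrow\qquad
T^{L} \;=\; \sum_{\lambda \in \Lambda_T} \lambda^{L}\, T_\lambda ,
\]

where \(\Lambda_T\) is the eigenvalue spectrum of the relevant transition operator \(T\) and the \(T_\lambda\) are its projection operators. Any length-\(L\) quantity built from powers of \(T\) then reduces to a weighted sum of \(\lambda^{L}\) terms, which is what permits evaluation at every finite encoding length and in the \(L \to \infty\) limit.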
We give exact formulae for a wide family of complexity measures that capture the organization of hidden nonlinear processes. The spectral decomposition of operator-valued functions leads to closed-form expressions involving the full eigenvalue spectrum of the mixed-state presentation of a process's ε-machine causal-state dynamic. Measures include correlation functions, power spectra, past-future mutual information, transient and synchronization informations, and many others. As a result, a direct and complete analysis of intrinsic computation is now available for the temporal organization of finitary hidden Markov models and nonlinear dynamical systems with generating partitions and for the spatial organization in one-dimensional systems, including spin systems, cellular automata, and complex materials via chaotic crystallography.

The emergence of organization in physical, engineered, and social systems is a fascinating and now, after half a century of active research, widely appreciated phenomenon [1-5]. Success in extending the long list of instances of emergent organization, however, is not equivalent to understanding what organization itself is. How do we say objectively that new organization has appeared? How do we measure quantitatively how organized a system has become?

Computational mechanics' answer to these questions is that a system's organization is captured in how it stores and processes information, that is, in how it computes [6]. Intrinsic computation was introduced two decades ago to analyze the inherent information processing in complex systems [7]: How much history does a system remember? In what architecture is that information stored? And how does the system use it to generate future behavior?

Computational mechanics, though, is part of a long historical trajectory focused on developing a physics of information [8-10]. That nonlinear systems actively process information goes back to Kolmogorov [11], who adapted Shannon's communication theory [12] to measure the information production rate of chaotic dynamical systems. In this spirit, today computational mechanics is routinely used to determine physical and intrinsic computational properties in single-molecule dynamics [13], in complex materials [14], and even in the formation of social structure [15], to mention several recent examples.

Thus, measures of complexity are important for quantifying how organized nonlinear systems are: their randomness and their structure. Moreover, we now know that randomness and structure are intimately intertwined; one cannot be properly defined or even practically measured without the other [16, and references therein].

Measuring complexity has been a challenge: until recently, in understanding the varieties of organization to be captured; and, still, practically, in terms of estimating metrics from experimental data. One major rea...
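As a rough illustration of how such spectral expressions can be evaluated in practice, the following sketch computes a few of these quantities for the Golden Mean Process, a standard two-state example. It uses only the state-to-state transition matrix of the ε-machine, not the paper's mixed-state presentation, and the example and variable names are ours, chosen for illustration.

```python
import numpy as np

# Illustrative sketch only: a minimal spectral-style calculation for the
# Golden Mean Process epsilon-machine (two causal states A, B).
# Labeled transition matrices: Tx[i, j] = Pr(emit symbol x, go to state j | state i).
T0 = np.array([[0.0, 0.5],    # state A emits 0 and moves to state B
               [0.0, 0.0]])   # state B never emits 0
T1 = np.array([[0.5, 0.0],    # state A emits 1 and stays in A
               [1.0, 0.0]])   # state B emits 1 and returns to A
T = T0 + T1                   # state-to-state transition matrix

# Stationary state distribution pi: left eigenvector of T with eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi /= pi.sum()

# Entropy rate h_mu (bits/symbol) and statistical complexity C_mu (bits).
h_mu = -sum(pi[i] * Tx[i, j] * np.log2(Tx[i, j])
            for Tx in (T0, T1)
            for i in range(2) for j in range(2) if Tx[i, j] > 0)
C_mu = -np.sum(pi * np.log2(pi))

# Symbol autocorrelation from powers of T; by spectral decomposition its
# L-dependence is a weighted sum of lambda**L over the eigenvalues of T.
ones = np.ones(2)
mean = pi @ T1 @ ones                      # Pr(X_t = 1)
for L in range(1, 6):
    joint = pi @ T1 @ np.linalg.matrix_power(T, L - 1) @ T1 @ ones
    print(f"gamma({L}) = {joint - mean**2:+.4f}")

print(f"h_mu = {h_mu:.3f} bits/symbol, C_mu = {C_mu:.3f} bits")
```

For this example the eigenvalues of the transition matrix are 1 and -1/2, so the printed autocorrelation alternates in sign and decays as (-1/2)^L, a small instance of the closed-form, eigenvalue-controlled behavior described above.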
A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the Information Processing Second Law (IPSL): the physical entropy of the universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? Here, we identify a minimal, inescapable transient dissipation engendered by physical information processing that is not captured by asymptotic rates, but is critical to adaptive thermodynamic processes such as those found in biological systems. As a component of the transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information-processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that "retrodictive" generators achieve the minimal costs. The results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.

Introduction. Classical thermodynamics and statistical mechanics appeal to various reservoirs, such as reservoirs of heat, work, particles, and chemical species, each characterized by unique, idealized thermodynamic properties. A heat reservoir, for example, corresponds to a physical system with a large specific heat and a short equilibration time. A work reservoir accepts or gives up energy without a change in entropy. Arising naturally in recent analyses of Maxwellian demons and information engines [1-17], information reservoirs have come to play a central role as idealized physical systems that exchange information but not energy [18-20]. Their inclusion led rather directly to an extended Second Law of Thermodynamics for complex systems: the total physical (Clausius) entropy of the universe and the Shannon entropy of its information reservoirs cannot decrease in time [4,18,21-23]. We refer to this generalization as the Information Processing Second Law (IPSL) [24].

A specific realization of an information reservoir is a tape of symbols where information is encoded in the symbols' values [25]. To understand the role that information processing plays in the efficiencies of and bounds on thermodynamic transformations, the following device has been explored in detail: a "ratchet" slides along a tape and interacts with one symbol at a time in the presence of heat and work reservoirs [26]. By increasing the tape's Shannon entropy, the ratchet can steadily transfer energy from the heat reservoir to the work reservoir [4]. This violates the
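For reference, the asymptotic-rate IPSL bound that this abstract builds on is commonly written, for a ratchet processing one tape symbol per cycle at temperature T, in roughly the following form (a paraphrase of the cited literature, not this paper's refined statement):

\[
\langle W \rangle \;\le\; k_{\mathrm B} T \ln 2 \,\bigl( h'_\mu - h_\mu \bigr),
\]

where \(\langle W \rangle\) is the average work extracted per symbol and \(h_\mu\), \(h'_\mu\) are the Shannon entropy rates (bits per symbol) of the input and output tapes. The transient and implementation-dependent costs identified in the abstract are corrections beyond this asymptotic-rate statement.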
Modern digital electronics support remarkably reliable computing, especially given the challenge of controlling nanoscale logical components that interact in fluctuating environments. However, we demonstrate that the high-reliability limit is subject to a fundamental error-energy-efficiency tradeoff that arises from time-symmetric control: requiring a low probability of error causes energy consumption to diverge as the logarithm of the inverse error rate for nonreciprocal logical transitions. The reciprocity (self-invertibility) of a computation is a stricter condition for thermodynamic efficiency than logical reversibility (invertibility), the latter being the root of Landauer's work bound on erasing information. Beyond engineered computation, the results identify a generic error-dissipation tradeoff in steady-state transformations of genetic information carried out by biological organisms. The lesson is that computational dissipation under time-symmetric control cannot reach, and is often far above, the Landauer limit. In this way, time-asymmetry becomes a design principle for thermodynamically efficient computing.
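To give a sense of the scale implied by the logarithmic divergence claimed above, the sketch below compares an assumed leading-order cost of k_B T ln(1/ε) for a nonreciprocal transition under time-symmetric control with the Landauer bound k_B T ln 2. The prefactor and the sample error rates are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's exact expressions): compare the
# assumed leading-order time-symmetric-control cost, ~ k_B T * ln(1/eps),
# against the Landauer bound k_B T * ln 2 for erasing one bit.
kT = 1.0                       # express work in units of k_B T
landauer = kT * np.log(2)

for eps in (1e-3, 1e-9, 1e-15, 1e-21):      # hypothetical target error rates
    time_symmetric_cost = kT * np.log(1.0 / eps)
    print(f"error rate {eps:7.0e}:  ~{time_symmetric_cost:5.1f} kT  "
          f"vs Landauer {landauer:.2f} kT")
```

Even at modest reliability targets the logarithmic term is tens of k_B T, consistent with the abstract's point that time-symmetric control keeps dissipation far above the Landauer limit.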
Framing computation as the transformation of metastable memories, we explore its fundamental thermodynamic limits. The true power of information follows from a novel decomposition of nonequilibrium free energy, derived here, which provides a rigorous thermodynamic description of coarse-grained memory systems. In the nearly quasistatic limit, logically irreversible operations can be performed with thermodynamic reversibility. Yet here we show that, beyond the reversible work that Landauer's bound requires of computation, dissipation must be incurred both for modular computation and for neglected statistical structure among the memory elements used in a computation. The general results are then applied to evaluate the thermodynamic costs of all two-input, one-output logic gates, including the universal NAND gate. Interwoven discussion clarifies the prospects for Maxwellian demons and information engines, as well as opportunities for hyper-efficient computers of the future.
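As a concrete reference point for the gate costs discussed above, the following sketch evaluates only the generalized Landauer term, k_B T ln 2 [H(input) - H(output)], for a NAND gate driven by unbiased, independent inputs. The modularity and correlation costs the abstract emphasizes are not included, and the uniform input distribution is an assumption chosen for illustration.

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Generalized Landauer bound for a logically irreversible gate:
#     W_diss >= k_B T ln2 * (H(input) - H(output)).
# Evaluate it for NAND with independent, unbiased input bits.
inputs = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
H_in = H(list(inputs.values()))                    # 2 bits

out_dist = {0: 0.0, 1: 0.0}
for (a, b), p in inputs.items():
    out_dist[1 - (a & b)] += p                     # NAND truth table
H_out = H(list(out_dist.values()))                 # ~0.811 bits

kT_ln2 = np.log(2)                                 # in units of k_B T
print(f"H_in = {H_in:.3f} bits, H_out = {H_out:.3f} bits")
print(f"Minimum dissipation >= {(H_in - H_out) * kT_ln2:.3f} k_B T")
```

For this input distribution the bound comes to roughly 0.82 k_B T per gate operation; the paper's additional structural costs can only raise the total.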