Given the stochastic nature of gene expression, genetically identical cells exposed to the same environmental inputs will produce different outputs. This heterogeneity has been hypothesized to have consequences for how cells are able to survive in changing environments. Recent work has explored the use of information theory as a framework to understand the accuracy with which cells can ascertain the state of their surroundings. Yet the predictive power of these approaches is limited and has not been rigorously tested using precision measurements. To that end, we generate a minimal model for a simple genetic circuit in which all parameter values for the model come from independently published data sets. We then predict the information processing capacity of the genetic circuit for a suite of biophysical parameters such as protein copy number and protein-DNA affinity. We compare these parameter-free predictions with an experimental determination of protein expression distributions and the resulting information processing capacity of E. coli cells. We find that our minimal model captures the scaling of the cell-to-cell variability in the data and the inferred information processing capacity of our simple genetic circuit up to a systematic deviation.
As living organisms thrive in a given environment, they are faced with constant changes in their surroundings. From abiotic conditions such as temperature fluctuations or changes in osmotic pressure, to biological interactions such as cell-to-cell communication in a tissue or a bacterial biofilm, living organisms of all types sense and respond to external signals. Fig. 1(A) shows a schematic of this process for a bacterial cell sensing the concentration of an extracellular chemical. At the molecular level, where signal transduction unfolds mechanistically, there are physical constraints on the accuracy and precision of these responses given by intrinsic stochastic fluctuations [1]. This means that two genetically identical cells exposed to the same stimulus will not have an identical response [2].

The implication of this biological noise is that cells do not have infinite resolution to distinguish signals and, as a consequence, there is a one-to-many mapping between inputs and outputs. Furthermore, given the limited number of possible outputs, there are overlapping responses between different inputs. In that sense, one might think of cells as performing a Bayesian inference of the state of the environment given their phenotypic response, as schematized in Fig. 1(B). The question then becomes how to analyze this probabilistic, rather than deterministic, relationship between inputs and outputs.

The abstract answer to this question was worked out in 1948 by Claude Shannon who, in his seminal work, founded the field of information theory [3]. Shannon developed a general framework for analyzing information transmission through noisy communication channels. He showed that the only quantity satisfying simple conditions for how a metric of information should behave takes the same functional form as the thermodynamic entropy, thereby christening his …
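To make these quantities concrete: for a discrete channel with inputs x (e.g., inducer concentrations) and outputs y (e.g., protein copy numbers), the Shannon entropy is H(X) = −Σ_x p(x) log₂ p(x), the mutual information is I(X; Y) = Σ_{x,y} p(x) P(y|x) log₂[P(y|x) / P(y)], and the channel capacity referenced in the abstract is C = max_{p(x)} I(X; Y). As an illustrative sketch only (a generic textbook method, not the paper's actual inference pipeline; the channel matrix and function names below are hypothetical), the capacity of any discrete memoryless channel can be estimated with the Blahut-Arimoto algorithm:

```python
import numpy as np

def _kl_rows(Q, q):
    """Row-wise KL divergence D(Q[x, :] || q) in nats, with 0·log 0 := 0."""
    safe_Q = np.where(Q > 0, Q, 1.0)  # placeholder avoids log(0); those terms are zeroed by Q
    return (Q * np.log(safe_Q / np.maximum(q, 1e-300))).sum(axis=1)

def channel_capacity(Q, tol=1e-10, max_iter=10_000):
    """Capacity (in bits) of a discrete memoryless channel via Blahut-Arimoto.

    Q[x, y] = P(output y | input x); each row of Q must sum to 1.
    Returns (capacity, optimizing input distribution).
    """
    p = np.full(Q.shape[0], 1.0 / Q.shape[0])  # start from the uniform input distribution
    for _ in range(max_iter):
        c = np.exp(_kl_rows(Q, p @ Q))         # multiplicative update factors
        p_new = p * c / (p @ c)
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # mutual information I(X; Y) at the optimizing input, converted from nats to bits
    capacity = float(p @ _kl_rows(Q, p @ Q)) / np.log(2)
    return capacity, p

# Toy example: binary symmetric channel with flip probability 0.1.
Q = np.array([[0.9, 0.1],
              [0.1, 0.9]])
cap, p_opt = channel_capacity(Q)
print(f"capacity ≈ {cap:.3f} bits at input distribution {p_opt}")
```

For the binary symmetric channel in the example, the estimate matches the closed-form capacity 1 − H₂(0.1) ≈ 0.531 bits, attained at the uniform input distribution.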
In 1943, McCulloch and Pitts introduced a discrete recurrent neural network as a model for computation in brains. The work inspired breakthroughs such as the first computer design and the theory of finite automata. We focus on learning in Hopfield networks, a special case with symmetric weights and fixed-point attractor dynamics. Specifically, we explore minimum energy flow (MEF) as a scalable convex objective for determining network parameters. We catalog various properties of MEF, such as biological plausibility, and then compare it to classical approaches in the theory of learning. Trained Hopfield networks can perform unsupervised clustering and define novel error-correcting coding schemes. They can also efficiently find hidden cliques in graphs. We extend this known connection from graphs to hypergraphs and discover n-node networks with robust storage of 2^Ω(n^(1−ϵ)) memories for any ϵ > 0. In the case of graphs, we also determine a critical ratio of training samples at which networks generalize completely.
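For context on the dynamics the abstract invokes: a Hopfield network has symmetric weights W, ±1 unit states s, and an energy E(s) = −½ sᵀWs that asynchronous sign updates never increase, so trajectories settle into fixed-point attractors, the stored memories. The sketch below is a minimal illustration of this recall behavior using the classical Hebbian outer-product rule, not the minimum energy flow (MEF) objective the paper actually studies; the network size and pattern count are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Symmetric weights from ±1 patterns via the classical Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)               # no self-connections
    return W

def energy(W, s):
    """Hopfield energy E(s) = -1/2 s^T W s; asynchronous updates never increase it."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=50):
    """Asynchronous sign updates until a fixed point (an attractor) is reached."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):  # update units one at a time, in random order
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:                    # no unit wants to flip: fixed point
            break
    return s

# Store three random ±1 patterns in a 100-node network, then recall the first
# one from a probe with 10 of its 100 bits flipped.
patterns = rng.choice([-1, 1], size=(3, 100))
W = hebbian_weights(patterns)
probe = patterns[0].copy()
probe[rng.choice(100, size=10, replace=False)] *= -1
recovered = recall(W, probe)
print("exact recovery:", np.array_equal(recovered, patterns[0]),
      "| final energy:", energy(W, recovered))
```

Three patterns in 100 units sit well below the classical ≈0.138n Hebbian capacity, so the corrupted probe reliably relaxes back to the stored pattern; the paper's MEF-trained networks push storage far beyond this regime.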