The Michaelis-Menten (MM) formula describes how the rate of enzyme catalysis depends on substrate concentration. The familiar hyperbolic relationship was originally derived by timescale separation for a network of three reactions. The same formula has subsequently been found to describe steady-state input-output responses in many biological contexts, including single-molecule enzyme kinetics, gene regulation, transcription, translation, and force generation. Previous attempts to explain its ubiquity have been limited to networks with regular structure or to simplifying parametric assumptions. Here, we exploit the graph-based linear framework for timescale separation to derive general structural conditions under which the MM formula arises. The conditions require a partition of the graph into two parts, akin to a "coarse graining" into the original MM graph, together with constraints on where and how the input variable occurs. Other features of the graph, including the numerical values of parameters, can remain arbitrary, thereby explaining the formula's ubiquity. For systems at thermodynamic equilibrium, we derive a necessary and sufficient condition. For systems away from thermodynamic equilibrium, especially those with irreversible reactions, distinct structural conditions arise and a general characterization remains open. Nevertheless, our results accommodate, in much greater generality, all examples known to us in the literature.
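For concreteness, the hyperbolic MM relationship referred to above is the classical rate law v = Vmax·S/(Km + S). The following is a minimal illustrative sketch (not taken from the paper; the parameter values are arbitrary) showing the saturating, hyperbolic dependence of rate on substrate concentration:

```python
def mm_rate(s, vmax=1.0, km=0.5):
    """Classical Michaelis-Menten rate law: v = vmax * s / (km + s).

    vmax is the maximal rate at saturating substrate; km is the substrate
    concentration at which the rate is half-maximal. Values here are
    arbitrary illustrative choices.
    """
    return vmax * s / (km + s)

# Characteristic hyperbolic behavior:
# at s = km the rate is exactly half of vmax,
# and the rate approaches vmax as s grows large.
half_max = mm_rate(0.5, vmax=1.0, km=0.5)
near_max = mm_rate(1000.0, vmax=1.0, km=0.5)
```

The same functional form reappears in the steady-state input-output responses discussed in the abstract, with the roles of Vmax and Km played by combinations of the underlying graph's parameters.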
Interpreting partial information collected from systems subject to noise is a key problem across scientific disciplines. Theoretical frameworks often focus on the dynamics of variables that result from coarse-graining the internal states of a physical system. However, most experimental apparatuses are instead sensitive to specific transitions, while the internal states of the physical system are blurred or inaccessible. Here we consider an observer who records a time series of occurrences of one or several transitions performed by a system, under the single assumption that its underlying dynamics is Markovian. We pose the question of how one can use this transition information to make inferences about dynamical, thermodynamic, and biochemical properties. First, using first-passage-time techniques, we derive analytical expressions for the probabilities of consecutive transitions and for the time elapsed between them, which we call inter-transition times. Second, we derive a lower-bound estimate of the entropy production rate that can be split into two non-negative contributions, one due to the statistics of transitions and a second due to the statistics of inter-transition times. We also show that when only one current is measured, our estimate can still detect irreversibility even in the absence of net currents in the transition time series. Third, we verify our results with numerical simulations of biophysical models of kinesin, dynein, and template-directed polymerization by molecular motors. Our numerical results reveal that while entropy production is encoded in the statistics of two successive transitions of the same type (i.e. repeated transitions), the statistics of two different successive transitions (i.e. alternated transitions) can probe the existence of an underlying disorder in the motion of a molecular motor.
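The observational setup described above can be sketched as follows. This is a minimal illustration, not one of the paper's biophysical models: a three-state continuous-time Markov jump process is simulated with the Gillespie algorithm, the observer registers only occurrences of a single "visible" transition, and the inter-transition times between successive occurrences are collected. The rate values and the choice of visible transition are arbitrary assumptions for the sketch.

```python
import random

# rates[i][j] = rate of the jump i -> j (arbitrary example values)
RATES = {
    0: {1: 2.0, 2: 1.0},
    1: {0: 1.0, 2: 3.0},
    2: {0: 2.0, 1: 1.0},
}
VISIBLE = (0, 1)  # the observer only registers jumps 0 -> 1


def inter_transition_times(n_jumps, seed=0):
    """Simulate the jump process and return the times elapsed between
    successive occurrences of the visible transition."""
    rng = random.Random(seed)
    state, t = 0, 0.0
    event_times = []
    for _ in range(n_jumps):
        out = RATES[state]
        total = sum(out.values())
        t += rng.expovariate(total)      # exponential waiting time in `state`
        r = rng.uniform(0.0, total)      # pick the next jump with prob. rate/total
        for nxt, rate in out.items():
            r -= rate
            if r <= 0.0:
                if (state, nxt) == VISIBLE:
                    event_times.append(t)
                state = nxt
                break
    return [b - a for a, b in zip(event_times, event_times[1:])]


itts = inter_transition_times(10_000)
```

In the paper's setting, the statistics of such time series (which transition occurred, and how long elapsed since the previous one) are the raw material for the entropy-production lower bound; this sketch only shows how the observable is constructed from a hidden Markov trajectory.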