We propose a physically realizable information-driven device consisting of an enzyme in a chemical bath, interacting with pairs of molecules prepared in correlated states. These correlations persist without direct interaction and thus store free energy equal to the mutual information. The enzyme can harness this free energy, and that stored in the individual molecular states, to do chemical work. Alternatively, the enzyme can use the chemical driving to create mutual information. A modified system can function without external intervention, approaching biological systems more closely.

Organisms exploit correlations in their environment to survive and grow. This fact holds across scales, from bacterial chemotaxis, which leverages the spatial clustering of food molecules [1,2], to the loss of leaves by deciduous trees, which is worthwhile because sunlight exposure is highly correlated from day to day. Evolution itself relies on correlations across time and space; otherwise a beneficial mutation would immediately lose its utility and selection would be impossible.

Biological systems also generate correlations. In particular, information transmission is an exercise in correlating input and output [3], and recent years have thus seen information theory applied to biological systems involved in sensing [4-6], signalling [7,8], chemotaxis [1,2], adaptation [9,10] and beyond [11]. In the language of information theory, correlated variables X and Y have a positive mutual information

I(X; Y) = \sum_{x \in X} \sum_{y \in Y} p(x, y) \ln \frac{p(x, y)}{p(x)\, p(y)}

(measured in nats), with p(x, y) the joint probability of a given state and p(x), p(y) the marginals. The "information entropy" H(Y) = -\sum_{y \in Y} p(y) \ln p(y) quantifies Y's uncertainty, and I(X; Y) is the reduction in this entropy given knowledge of X: I(X; Y) = H(Y) - H(Y|X).
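As a numerical illustration of these definitions, the following sketch computes I(X; Y) for a small, hypothetical joint distribution of two binary variables (the values are illustrative only, not taken from the paper) and verifies the identity I(X; Y) = H(Y) - H(Y|X):

```python
import numpy as np

# Hypothetical joint distribution p(x, y) for two binary variables;
# rows index x, columns index y. Values are illustrative only.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

# Mutual information I(X;Y) in nats, skipping zero-probability states
I = sum(p_xy[i, j] * np.log(p_xy[i, j] / (p_x[i] * p_y[j]))
        for i in range(2) for j in range(2) if p_xy[i, j] > 0)

# Entropies, to check I(X;Y) = H(Y) - H(Y|X)
H_y = -sum(p * np.log(p) for p in p_y if p > 0)
H_y_given_x = -sum(p_xy[i, j] * np.log(p_xy[i, j] / p_x[i])
                   for i in range(2) for j in range(2) if p_xy[i, j] > 0)

print(I)                  # ~0.193 nats
print(H_y - H_y_given_x)  # same value
```

Both expressions agree, as they must; I(X; Y) vanishes only when p(x, y) = p(x)p(y) everywhere.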
The mutual information is symmetric, non-negative, and zero if and only if X and Y are statistically independent.

Information theory is also deeply connected to thermodynamics [12-20]. Sagawa and Ueda [14], building on [12,13], showed that measurement cycles have a minimal work cost equal to the mutual information generated between data and memory. Horowitz and Esposito [18] showed that entropy production within a system X can be negative if X is coupled to a second system Y and transitions in X decrease I(X; Y). A third key result, essential to exorcising Maxwell's Demon [6,21], is that if X and Y are uncoupled from each other, yet coupled to heat baths at temperature T, then the total free energy is [22,23]

\tilde{F}(X, Y) = \tilde{F}(X) + \tilde{F}(Y) + kT\, I(X; Y).

Here \tilde{F}(X) = F_{eq}(X) - kT \sum_{x \in X} p(x) \ln\big(p_{eq}(x)/p(x)\big) is the non-equilibrium free energy [20,23], with the tilde indicating the generalisation of the standard equilibrium free energy F_{eq}(X). Systems X and Y could be two non-interacting spins, or two physically separated molecules. For uncoupled systems, the partition function is separable and X and Y are independent in equilibrium: I_{eq}(X; Y) = 0. However, correlations induced by coupling between X and Y at earlier times could persist even after the coupling...
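The free-energy decomposition above can be checked numerically. Writing \tilde{F}(A) - F_{eq}(A) = kT D(p_A || p_A^{eq}) as a Kullback-Leibler divergence, and using the fact that for uncoupled systems the joint equilibrium distribution factorises, the equilibrium terms cancel from both sides and the identity reduces to one between KL divergences. The distributions below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

kT = 1.0  # work in units of kT


def kl(p, q):
    """Kullback-Leibler divergence D(p||q) in nats."""
    p, q = np.asarray(p), np.asarray(q)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))


# Illustrative distributions: uncoupled binary systems X and Y, each with
# its own equilibrium distribution; the actual joint state is correlated.
p_eq_x = np.array([0.7, 0.3])
p_eq_y = np.array([0.6, 0.4])
p_xy = np.array([[0.45, 0.05],
                 [0.10, 0.40]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

# Non-equilibrium free energies relative to their equilibrium values
dF_x = kT * kl(p_x, p_eq_x)
dF_y = kT * kl(p_y, p_eq_y)

# Joint system: equilibrium distribution factorises for uncoupled X, Y
p_eq_xy = np.outer(p_eq_x, p_eq_y)
dF_xy = kT * kl(p_xy.ravel(), p_eq_xy.ravel())

# Mutual information as a KL divergence from the product of marginals
I = kl(p_xy.ravel(), np.outer(p_x, p_y).ravel())

# Check the decomposition (equilibrium terms cancel on both sides)
print(dF_xy, dF_x + dF_y + kT * I)
```

The two printed values agree exactly: the correlations contribute an extra kT I(X; Y) of free energy beyond what the marginal distributions of X and Y alone store, which is precisely the resource the proposed device harvests.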