Many organisms capitalize on their ability to predict the environment to maximize available free energy, and reinvest this energy to create new complex structures. This functionality relies on the manipulation of patterns: temporally ordered sequences of data. Here, we propose a framework to describe pattern manipulators - devices that convert thermodynamic work to patterns or vice versa - and use them to build a 'pattern engine' that facilitates a thermodynamic cycle of pattern creation and consumption. We show that the least heat dissipation is achieved by the provably simplest devices: the ones that exhibit the desired operational behaviour while maintaining the least internal memory. We derive the ultimate limits of this heat dissipation, and show that it is generally non-zero and connected with the pattern's intrinsic crypticity - a complexity-theoretic quantity that captures the puzzling difference between the amount of information the pattern's past behaviour reveals about its future, and the amount one needs to communicate about this past to optimally predict the future.

The manipulation of patterns is as important to living organisms as it is for computation. Living things capitalize on structure in their environment for available energy, and use this energy to generate new complex structures. Similarly, a crucial task in the modern era of big data is to identify patterns in large data sets in order to make predictions about future events - often at great energetic cost. Here, we consider the thermodynamic costs intrinsic to this sort of pattern manipulation and ask: is there a preferred method by which this manipulation should be done? Our intuition is that simpler is better, a longstanding tenet of natural philosophy known as Occam's razor. To formalize this, we first qualify what is meant both by 'simpler' and by 'better'.

In complexity science, computational mechanics formalizes what is 'simpler' in the context of pattern manipulation [1][2][3].
The premise is that everything we observe in the environment can be considered to be a pattern - a temporal sequence of data exhibiting certain statistical structure. Much of science then deals with building models that can explain such statistics - machines that take information from past observations, and use it to generate statistically faithful conditional predictions of the future. Given two machines that exhibit the same pattern of behaviour, the one that stores less information from the past is considered simpler, the motivation being that it better isolates the indicators of future behaviour. The simplest such machine then defines exactly how much memory is required to produce a given pattern, and thus quantifies the pattern's intrinsic structure. Known as the statistical complexity, this measure has been applied to quantify structure in diverse contexts [4][5][6].

Meanwhile in thermodynamics, 'better' originally described heat engines that produce more work with less wasted heat. This carries through to modern thermodynamics: the best approach fo...
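As a concrete illustration of how the simplest predictive machine quantifies a pattern's memory requirement, the following sketch computes the statistical complexity of the Golden Mean process - a standard textbook example from computational mechanics (a binary sequence in which no two consecutive 0s occur). The two-state machine and its transition probabilities are the process's known minimal predictive model; the statistical complexity is the Shannon entropy of the stationary distribution over its internal (causal) states. This is an illustrative sketch, not code from the paper.

```python
import math

# Minimal predictive machine for the Golden Mean process
# (binary sequences with no two consecutive 0s):
#   state A: emit '1' with p = 1/2 and stay in A,
#            or emit '0' with p = 1/2 and move to B;
#   state B: must emit '1' (p = 1) and return to A.
# T[r][s] is the total probability of moving from state r to state s.
T = {"A": {"A": 0.5, "B": 0.5},
     "B": {"A": 1.0, "B": 0.0}}

# Stationary distribution over causal states, via power iteration.
pi = {"A": 0.5, "B": 0.5}
for _ in range(200):
    pi = {s: sum(pi[r] * T[r][s] for r in T) for s in T}

# Statistical complexity C_mu: Shannon entropy (in bits) of the
# stationary distribution over the machine's internal states.
C_mu = -sum(p * math.log2(p) for p in pi.values() if p > 0)

print(round(pi["A"], 4), round(pi["B"], 4))  # 0.6667 0.3333
print(round(C_mu, 4))                        # 0.9183
```

The machine spends two thirds of its time in state A, so C_mu = H(2/3, 1/3) ≈ 0.92 bits: the pattern can be generated (and optimally predicted) while storing strictly less than one full bit of internal memory.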