Information processing typically occurs via the composition of modular units, such as the universal logic gates found in discrete computation circuits. The benefit of modular information processing, in contrast to globally integrated information processing, is that complex computations are more easily and flexibly implemented via a series of simpler, localized information-processing operations that control and change only local degrees of freedom. We show that, despite these benefits, there are unavoidable thermodynamic costs to modularity: costs that arise directly from the localized nature of the processing and that go beyond Landauer's bound on the work required to erase information. Localized operations are unable to leverage global correlations, which are a thermodynamic fuel. We quantify the minimum irretrievable dissipation of modular computations in terms of the difference between the change in global nonequilibrium free energy, which captures these global correlations, and the local (marginal) change in nonequilibrium free energy, which bounds modular work production. This modularity dissipation is proportional to the additional work required to perform a computational task modularly, and so it measures a structural energy cost. It determines the thermodynamic efficiency of different modular implementations of the same computation, and so it has immediate consequences for the architecture of physically embedded transducers, known as information ratchets. Constructively, we show how to circumvent modularity dissipation by designing internal ratchet states that capture the information reservoir's global correlations and patterns. Thus, there are routes to thermodynamic efficiency that do not require globally integrated protocols: instead, one reduces modularity dissipation by optimizing the architecture of computations composed of a series of localized operations.
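To make the central bound concrete, the following minimal sketch illustrates the modularity dissipation for the simplest case: a local operation that erases one of two perfectly correlated bits. Because a modular (local) operation cannot see the other bit, the mutual information between them is destroyed rather than harvested, and the lost correlation lower-bounds the extra dissipated work at kT ln 2 per bit. The joint distributions, the room-temperature value of kT, and the helper function here are illustrative assumptions, not taken from the paper itself.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits for a joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Two perfectly correlated bits: I(X;Y) = 1 bit of global correlation.
correlated = {(0, 0): 0.5, (1, 1): 0.5}

# A local (modular) operation erases X to 0 without access to Y,
# so afterward the joint factorizes and the correlation is gone.
after_local_erasure = {(0, 0): 0.5, (0, 1): 0.5}

kT = 4.11e-21  # joules; thermal energy at roughly 298 K (illustrative)

# Correlation destroyed by the local operation, in bits.
delta_I = mutual_information(correlated) - mutual_information(after_local_erasure)

# Minimum modularity dissipation: kT ln 2 per bit of correlation lost,
# over and above the Landauer cost of the erasure itself.
w_mod = kT * math.log(2) * delta_I
print(delta_I)  # 1.0 bit lost
print(w_mod)    # ~2.85e-21 J of unavoidable extra dissipation
```

A globally integrated protocol acting on both bits at once could instead use the 1 bit of mutual information as fuel, which is why the dissipation above is attributed to the modular architecture rather than to the logical operation.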