The majority of star-forming galaxies follow a simple empirical correlation in the star formation rate (SFR) versus stellar mass (M_*) plane, of the form SFR ∝ M_*^α, usually referred to as the star formation main sequence (MS). The physics that sets the properties of the MS is currently a subject of debate, and no consensus has been reached regarding the fundamental difference between members of the sequence and its outliers. Here we combine a set of hydrodynamical simulations of interacting galactic disks with state-of-the-art radiative transfer codes to analyze how the evolution of mergers is reflected in the properties of the MS. We present CHIBURST, a Markov Chain Monte Carlo spectral energy distribution (SED) code that fits the multi-wavelength, broad-band photometry of galaxies and derives stellar masses, SFRs, and geometrical properties of the dust distribution. We apply this tool to the SEDs of simulated mergers and compare the derived results with the reference output from the simulations. Our results indicate that changes in the SEDs of mergers as they approach coalescence and depart from the MS are related to an evolution of dust geometry on scales larger than a few hundred parsecs. This is reflected in a correlation between the specific star formation rate and the compactness parameter, which parametrizes this geometry and hence the evolution of dust temperature (T_dust) with time. As mergers approach coalescence, they depart from the MS and increase their compactness, which implies that moderate outliers of the MS are consistent with late-stage mergers. By further applying our method to real observations of luminous infrared galaxies (LIRGs), we show that the merger scenario is unable to explain these extreme outliers of the MS. Only by significantly increasing the gas fraction in the simulations are we able to reproduce the SEDs of LIRGs.