The activity of a massive star approaching core collapse can strongly affect the appearance of the star and its subsequent supernova. Late-phase convective nuclear burning generates waves that propagate toward the stellar surface, heating the envelope and potentially triggering mass loss. In this work, we improve on previous one-dimensional models by performing two-dimensional simulations of the pre-supernova mass ejection phase driven by the deposition of wave energy. Beginning with stellar evolutionary models of a 15 M⊙ red supergiant star during core O-burning, we treat the rate and duration of energy deposition as model parameters and examine how the mass loss and pre-explosion morphology depend on them. Unlike in one-dimensional models, density inversions due to wave heating are smoothed out by Rayleigh-Taylor instabilities, and the primary effect of wave heating is to radially expand the star's hydrogen envelope. For low heating rates with long durations, the expansion is nearly homologous, whereas intense but short-lived heating can generate a shock that drives envelope expansion and results in a qualitatively different density profile at the time of core collapse. Asymmetries are fairly small, and large amounts of mass loss are unlikely unless the wave heating exceeds expectations. We discuss implications for pre-supernova stellar variability and supernova light curves.