The increasing sensitivity of gravitational-wave detectors has raised the rate of astrophysical signal detections as well as the rate of "glitches": transient, non-Gaussian detector noise. Temporal overlap between signals and glitches in the detector presents a challenge for inference analyses, which typically assume that the detector noise is purely Gaussian. In this study we perform an extensive exploration of the efficacy of a recently proposed method that models the glitch with sine-Gaussian wavelets while simultaneously modeling the signal with compact-binary waveform templates. We explore a wide range of glitch families and signal morphologies and demonstrate that the joint modeling of glitches and signals (with wavelets and templates, respectively) can reliably separate the two. We find that the glitches that most affect parameter estimation are also those best modeled by such wavelets, owing to their compact time-frequency signature. As a further test, we investigate the robustness of this analysis against waveform systematics, such as those arising from the exclusion of higher-order modes and spin-precession effects. Our analysis provides an estimate of the signal parameters, the glitch waveform to be subtracted from the data, and an assessment of whether detected excess power consists of a glitch, a signal, or both. Finally, we analyze the low-significance triggers 191225_215715 and 200114_020818 and find that both are consistent with glitches overlapping high-mass signals.