In clinical settings, the necessity of treatment is often measured in terms of the patient’s prognosis in the absence of treatment. Along these lines, it is often of interest to compare subgroups of patients (e.g., based on underlying diagnosis) with respect to pre-treatment survival. Such comparisons may be complicated by at least two important issues. First, mortality contrasts by subgroup may differ over follow-up time, as opposed to being constant, and may follow a form that is difficult to model parametrically. Moreover, in settings where the proportional hazards assumption fails, investigators tend to be more interested in cumulative (as opposed to instantaneous) effects on mortality. Second, pre-treatment death is censored by the receipt of treatment, and in settings where treatment assignment depends on time-dependent factors that also affect mortality, such censoring is likely to be informative. We propose semiparametric methods for contrasting subgroup-specific cumulative mortality in the presence of dependent censoring. The proposed estimators are based on the cumulative hazard function, with pre-treatment mortality assumed to follow a stratified Cox model. No functional form is assumed for the non-proportionality. Asymptotic properties of the proposed estimators are derived, and simulation studies show that the proposed methods perform well at practical sample sizes. The methods are then applied to contrast pre-transplant mortality for patients with acute versus chronic End-Stage Liver Disease.
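To fix ideas, the following is a minimal sketch of an estimand and weighted estimator consistent with the description above; the notation (subgroup index $j$, covariates $Z$, weights $\hat w_i$, counting process $N_i$, at-risk indicator $Y_i$) is introduced here for illustration and is not taken from the abstract. Under a stratified Cox model, the subgroup-$j$ hazard is
\[
\lambda_j(t \mid Z) \;=\; \lambda_{0j}(t)\,\exp(\beta^\top Z),
\]
with the baseline hazards $\lambda_{0j}(\cdot)$ left unspecified, so that a cumulative mortality contrast between subgroups 1 and 2 can be expressed through the baseline cumulative hazards, for example
\[
\Delta(t) \;=\; \Lambda_{01}(t) - \Lambda_{02}(t), \qquad \Lambda_{0j}(t) = \int_0^t \lambda_{0j}(s)\,ds .
\]
Dependent censoring by treatment receipt could then be addressed through a Breslow-type estimator in which each subject's contribution is reweighted by estimated inverse-probability-of-censoring weights $\hat w_i(s)$, obtained from a model for time to treatment given the time-dependent factors,
\[
\widehat{\Lambda}_{0j}(t) \;=\; \sum_{i \in \text{stratum } j} \int_0^t
\frac{\hat w_i(s)\, dN_i(s)}{\sum_{k \in \text{stratum } j} \hat w_k(s)\, Y_k(s)\, \exp(\hat\beta^\top Z_k)},
\]
where $N_i$ counts observed pre-treatment deaths and $Y_i$ indicates being alive, untreated, and uncensored at $s$. This is one plausible formalization; the paper's exact estimator and choice of contrast (e.g., difference versus ratio of cumulative hazards) may differ.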