It is well known that Raman intensities are affected by the spectrometer slit width: wider slits provide increased Raman intensity at the cost of decreased wavenumber resolution. However, the quantitative relationship between the spectrometer slit width and the observed Raman intensities is poorly understood. In this work, we develop a model, based on spectrometer geometric optics, to quantify the effects of the spectrometer slit width on observed Raman intensities. We find that the observed Raman intensities depend strongly on the ratio of the slit function width to the intrinsic Raman band width. Consequently, even within a single spectrum collected at a fixed slit width, the observed intensities are affected differently because each Raman band has a different intrinsic bandwidth, and the effects of the slit function must therefore be considered whenever Raman intensities are examined. In addition, we show that the spectrometer linear dispersion and the charge-coupled device (CCD) pixel size strongly affect the Raman intensities reported by the CCD. Thus, when comparing Raman intensities across instruments, or for a single instrument with a nonlinear wavenumber calibration, the observed intensities must be corrected for these CCD effects. Using our model, we develop a general method for correcting observed Raman intensities for the effects of the slit width, the intrinsic Raman band width, and the CCD.
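The central observation, that the measured peak height depends on the ratio of the slit function width to the intrinsic band width, can be illustrated with a simple numerical convolution. The sketch below is not the geometric-optics model developed here; it merely assumes a Lorentzian intrinsic band shape and a Gaussian slit function (both assumptions chosen for illustration) and shows how the apparent peak intensity falls as the slit-to-band width ratio grows.

```python
import numpy as np

def lorentzian(x, x0, fwhm):
    """Intrinsic Raman band: unit-height Lorentzian (assumed shape)."""
    return 1.0 / (1.0 + ((x - x0) / (fwhm / 2.0)) ** 2)

def gaussian_slit(x, fwhm):
    """Assumed Gaussian slit function, normalized so its weights sum to 1."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

# Wavenumber axis (cm^-1) and an intrinsic band of 5 cm^-1 FWHM (illustrative values).
x = np.linspace(-100, 100, 4001)
band_fwhm = 5.0
band = lorentzian(x, 0.0, band_fwhm)

# Convolve the intrinsic band with slit functions of increasing width and
# report the apparent (observed) peak height relative to the intrinsic height of 1.
for slit_fwhm in [1.0, 2.5, 5.0, 10.0, 20.0]:
    observed = np.convolve(band, gaussian_slit(x, slit_fwhm), mode="same")
    ratio = slit_fwhm / band_fwhm
    print(f"slit/band width ratio = {ratio:4.1f} -> "
          f"observed peak height = {observed.max():.3f} (intrinsic = 1.000)")
```

Because a fixed slit width corresponds to a different width ratio for each band, two bands of equal integrated area but different intrinsic widths will show different apparent peak heights, which is the effect described above.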