In white-light absorption spectroscopy, broadening of the absorption signal by the apparatus profile of the spectrometer can lead to an underestimation of the determined density, since one measures an apparent optical depth. This is particularly true at high optical depth, where the transmitted intensity saturates. Provided that the line profile of the absorption line is known, the apparent-optical-depth effect can be accounted for by introducing a correction factor. The impact of saturation and the correction approach are demonstrated for argon and indium lines in low-pressure plasmas, where correction factors of one order of magnitude or even higher are easily reached. For the indium line, the hyperfine splitting has been taken into account. In laser absorption, the line profile is resolved. However, the weak but rather broad background emission of the laser diode can cause a saturated signal at the photodiode, which also results in an underestimation of the density obtained from the analysis. It is shown that this can be accounted for by fitting the theoretical line profile to the measured absorption signal, which also yields a correction factor. The method is introduced and demonstrated for the cesium resonance line, including the hyperfine splitting. Typical correction factors of about two are obtained for the cesium ground-state density under the conditions of a low-pressure negative hydrogen ion source in which cesium is evaporated to enhance negative ion production.
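The saturation mechanism described above can be sketched numerically: the transmission of an optically thick line is convolved with a (much broader) apparatus profile, and the line-integrated apparent optical depth is compared with the true one. All numbers below — Gaussian shapes for both profiles, a peak optical depth of 10, and an apparatus profile five times broader than the line — are illustrative assumptions, not values from the measurements.

```python
import numpy as np

lam = np.linspace(-5.0, 5.0, 4001)   # wavelength axis (arbitrary units)
dlam = lam[1] - lam[0]

def gaussian(x, w):
    """Unnormalized Gaussian of 1/e-halfwidth parameter w."""
    return np.exp(-x**2 / (2.0 * w**2))

# Assumed true line profile: strongly saturated (peak optical depth 10)
tau0, line_w, app_w = 10.0, 0.1, 0.5
tau_true = tau0 * gaussian(lam, line_w)

# Transmission through the plasma, then smearing by the spectrometer.
# Convolving the absorption dip (1 - T) with the area-normalized
# apparatus profile avoids edge artifacts, since the dip vanishes
# far from the line.
T = np.exp(-tau_true)
kernel = gaussian(lam, app_w)
kernel /= kernel.sum()
dip_meas = np.convolve(1.0 - T, kernel, mode="same")
tau_app = -np.log(1.0 - dip_meas)    # apparent optical depth

# Correction factor: true vs. apparent line-integrated optical depth
C = (tau_true.sum() * dlam) / (tau_app.sum() * dlam)
print(f"correction factor C = {C:.1f}")   # C > 1: density underestimated
```

With these assumed widths the apparent optical depth stays well below one even though the line itself is opaque at the center, so the correction factor is several, consistent with the order-of-magnitude corrections quoted for the white-light measurements.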