The early development of quantitative electron probe microanalysis, first with crystal spectrometers and later with energy dispersive x-ray spectrometers (EDXS), demonstrated that elements could be detected at the 0.001 mass fraction level and that major concentrations could be measured within 2 % relative uncertainty. However, during this period of extensive investigation and evaluation, EDXS detectors could not detect x rays below 1 keV, and all quantitative analysis was performed against a set of reference standards measured on the same instrument. Now that EDXS systems are often used without standards and are increasingly used to analyse elements using lines well below 1 keV, accuracy can be considerably worse than standard textbooks document. Spectrum processing techniques found most applicable to EDXS have been integrated into total system solutions and can give excellent results on selected samples, yet the same techniques fail in some applications because of a variety of instrumental effects. Prediction of peak shape, width and position for every characteristic line, and measurement of background intensity, are complicated by variations in response from system to system and with changing count rate. With an understanding of the fundamental sources of error, however, even a total system can be tested as a “black box” in the areas where it is most likely to fail, thereby establishing the degree of confidence appropriate to the intended application. This approach is particularly important when the microanalysis technique is applied at lower electron beam voltages, where the extraction of line intensities is complicated by extreme peak overlap and higher background levels.
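
As an illustration of why peak width must be predicted per system, the sketch below uses the textbook model for Si-detector energy resolution, in which electronic noise adds in quadrature to Fano-limited statistical broadening. The Fano factor (≈ 0.114) and mean electron-hole pair energy (≈ 3.76 eV) are standard values for silicon; the noise term and the specific numbers here are illustrative assumptions, not measurements from any particular system.

```python
import numpy as np

# FWHM(E)^2 = FWHM_noise^2 + 2.355^2 * F * eps * E
# F and eps are textbook constants for Si; FWHM_noise is system-specific,
# varies with count rate, and must be calibrated rather than assumed.
FANO = 0.114    # Fano factor for silicon (approximate)
EPS_EV = 3.76   # mean energy per electron-hole pair in Si, eV

def fwhm_ev(energy_ev, fwhm_noise_ev=50.0):
    """Predicted peak FWHM (eV) at a given line energy (eV).

    fwhm_noise_ev is a hypothetical electronic-noise term; its variation
    between systems and with count rate is one reason a total system must
    be tested as a "black box" rather than trusted blindly.
    """
    return np.sqrt(fwhm_noise_ev**2 + 2.355**2 * FANO * EPS_EV * energy_ev)

def gaussian_peak(e_axis_ev, line_ev, intensity, fwhm_noise_ev=50.0):
    """Ideal Gaussian profile for one characteristic line; a real detector
    response adds tailing and shelf effects not modelled here."""
    sigma = fwhm_ev(line_ev, fwhm_noise_ev) / 2.355
    return intensity * np.exp(-0.5 * ((e_axis_ev - line_ev) / sigma) ** 2)

# Predicted resolution at Mn K-alpha (5895 eV) versus O K-alpha (525 eV):
# narrower peaks at low energy, but lines also crowd much closer together,
# so overlap below 1 keV is far more severe.
print(fwhm_ev(5895.0))  # ~128 eV
print(fwhm_ev(525.0))   # ~61 eV
```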