The two most important atmospheric transmission bands in the infrared occur at 3-5 µm and 8-12 µm. For a given infrared detector, a question that continues to be asked is which of the two spectral bands, if either, gives the better performance. While seemingly an innocent enough question, the literature attests that it has not been without controversy. Conflicting and often contradictory results have been reported, results that tend to reflect the predilections of their proponents rather than a measured consideration of the technical factors likely to affect performance. In this study an analysis is undertaken to assess the relative merits of infrared detectors operating in the 3-5 µm and 8-12 µm spectral bands, based on the recently defined figure of merit known as the detected thermal contrast. The detected thermal contrast attempts to describe the overall performance of the sequence of events from the initial emission of thermal radiation at the surface of a target to the final measurable output signal seen in the detecting instrument. Under ideal limiting conditions typical of many industrial and scientific applications, and for targets whose spectral emissivities vary with both wavelength and temperature, exact expressions based on the recently introduced polylogarithmic formulation of the problem are developed for both thermal and quantum detectors. It is found that the 3-5 µm band gives the better performance for either detector type, while the differences between the two detector types are not as significant as one might initially expect. The work not only extends a number of approximate schemes proposed and developed in the past, in which target emissivities varying with temperature have been used, but also challenges a number of previously reported results.
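For context, a minimal sketch of the kind of closed form the polylogarithmic formulation yields, stated here for the idealised case of a blackbody (unit emissivity) and not taken from the paper itself: the cumulative radiant exitance of the Planck function can be written exactly in terms of polylogarithms \(\mathrm{Li}_s\), so that the in-band exitance between \(\lambda_1\) and \(\lambda_2\) follows by differencing,
\[
\int_0^{\lambda} M_{\lambda'}(T)\,\mathrm{d}\lambda'
= \frac{2\pi k^4 T^4}{h^3 c^2}
\left[ x^3 \,\mathrm{Li}_1(e^{-x}) + 3x^2 \,\mathrm{Li}_2(e^{-x}) + 6x \,\mathrm{Li}_3(e^{-x}) + 6\,\mathrm{Li}_4(e^{-x}) \right],
\qquad x = \frac{hc}{\lambda k T},
\]
with the analogous cumulative photon exitance, relevant to quantum detectors, carrying a prefactor of \(2\pi k^3 T^3/(h^3 c^2)\) and terminating at \(\mathrm{Li}_3\). Band-limited expressions of this type are the standard starting point for comparing the 3-5 µm and 8-12 µm windows; how they enter the detected thermal contrast for emissivities varying with wavelength and temperature is developed in the body of the paper.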