High performance infrared (IR) sensing and imaging systems require IR optoelectronic detectors that have a high signal-to-noise ratio (SNR) and a fast response time, and that can be readily hybridised to CMOS read-out integrated circuits (ROICs). From a device point of view, this translates to p-n junction photovoltaic detectors based on narrow bandgap semiconductors with a high quantum efficiency (signal) and low dark current (noise). These requirements limit the choice of possible semiconductors to those having an appropriate bandgap that matches the wavelength band of interest, combined with a high optical absorption coefficient and a long minority carrier diffusion length, which corresponds to a large mobility-lifetime product for photogenerated minority carriers. Technological constraints and modern clean-room fabrication processes mean that IR detector technologies are generally based on thin-film narrow bandgap semiconductors that have been epitaxially grown on lattice-matched, wider bandgap, IR-transparent substrates. These basic semiconductor material properties have led to InGaAs (in the SWIR up to 1.7 microns), InSb (in the MWIR up to 5 microns), and HgCdTe (in the eSWIR, MWIR and LWIR wavelength bands) becoming the dominant IR detector technologies for high performance applications. In this paper, the current technological limitations of HgCdTe-based technologies will be discussed with a view to identifying pathways towards next-generation IR imaging arrays featuring larger imaging array formats and smaller pixel pitch, higher pixel yield and operability, higher quantum efficiency (QE), higher operating temperature (HOT), and dramatically lower per-unit cost.
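The two material figures of merit mentioned above are captured by simple relations: the detector cutoff wavelength follows from the bandgap as λc = hc/Eg ≈ 1.24/Eg (µm, with Eg in eV), and the minority carrier diffusion length follows from the mobility-lifetime product via the Einstein relation, L = √(Dτ) with D = (kT/q)µ. The sketch below illustrates both; the material parameters (bandgaps, mobility, lifetime, temperature) are representative textbook-style values chosen for illustration, not measured data from this work.

```python
import math

K_B_OVER_Q = 8.617e-5  # Boltzmann constant over electron charge, V/K


def cutoff_wavelength_um(eg_ev: float) -> float:
    """Cutoff wavelength in microns for bandgap Eg (eV): lambda_c = hc/Eg ~ 1.24/Eg."""
    return 1.2398 / eg_ev


def diffusion_length_um(mobility_cm2_per_vs: float, lifetime_s: float, temp_k: float) -> float:
    """Minority-carrier diffusion length L = sqrt(D * tau), D = (kT/q) * mu (Einstein relation)."""
    d = K_B_OVER_Q * temp_k * mobility_cm2_per_vs  # diffusivity, cm^2/s
    return math.sqrt(d * lifetime_s) * 1e4          # cm -> microns


# Illustrative bandgaps (assumed values, roughly at operating temperature):
print(f"InGaAs (Eg ~ 0.73 eV): cutoff ~ {cutoff_wavelength_um(0.73):.2f} um")
print(f"InSb   (Eg ~ 0.23 eV): cutoff ~ {cutoff_wavelength_um(0.23):.1f} um")

# Illustrative mobility-lifetime product (mu = 1e5 cm^2/Vs, tau = 1 us, 77 K):
print(f"Diffusion length ~ {diffusion_length_um(1e5, 1e-6, 77):.0f} um")
```

This makes concrete why a large mobility-lifetime product matters: a diffusion length of hundreds of microns, far exceeding the absorber thickness, lets nearly all photogenerated carriers reach the junction, supporting high quantum efficiency.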