Thermal detectors are a cornerstone of infrared (IR) and terahertz (THz) technology due to their broad spectral range. These detectors call for suitable broadband absorbers with minimal thermal mass. This is often realized with plasmonic absorbers, which ensure high absorptivity, but only within a narrow spectral band [1][2][3]. Alternatively, a common approach is based on impedance-matching the sheet resistance of a thin metallic film to half the free-space impedance. Thereby, a wavelength-independent absorptivity of up to 50 % can be achieved, depending on the dielectric properties of the underlying substrate [4][5][6]. However, existing absorber films typically require thicknesses on the order of tens of nanometers, such as titanium nitride (14 nm) [7], which can significantly deteriorate the response of a thermal transducer. Here, we present the application of ultrathin gold (2 nm) on top of a 1.2 nm copper oxide seed layer as an effective IR absorber. An almost wavelength-independent and long-term stable absorptivity of ∼47(3) %, ranging from 2 µm to 20 µm, was obtained and is further discussed. The presented gold thin film represents an almost ideal impedance-matched IR absorber that enables a significant improvement of state-of-the-art thermal detector technology.
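As an illustration of the impedance-matching argument above, the following sketch (not part of the original work) evaluates the textbook absorptivity of a free-standing resistive sheet, A = 2x/(1+x)², with x = Z₀/(2Rₛ), where Z₀ is the free-space impedance and Rₛ the sheet resistance. It shows the 50 % maximum occurring exactly at Rₛ = Z₀/2; a real dielectric substrate shifts these numbers, as the text notes.

```python
# Illustrative sketch (not from the paper): absorptivity of a free-standing
# resistive sheet with sheet resistance R_s in vacuum. Modeling the film as a
# shunt conductance between two free-space half-spaces gives the standard
# result A = 2x / (1 + x)^2 with x = Z0 / (2 * R_s), peaking at A = 50 %
# when R_s equals half the free-space impedance.

Z0 = 376.73  # free-space impedance in ohms

def absorptivity(R_s):
    """Fraction of incident power absorbed by a free-standing film."""
    x = Z0 / (2.0 * R_s)
    r = -x / (1.0 + x)          # amplitude reflection coefficient
    t = 1.0 / (1.0 + x)         # amplitude transmission coefficient
    return 1.0 - r**2 - t**2    # equals 2x / (1 + x)^2

if __name__ == "__main__":
    for R_s in (50.0, Z0 / 2.0, 377.0, 1000.0):
        print(f"R_s = {R_s:7.2f} ohm -> A = {absorptivity(R_s):.3f}")
```

Because this toy model ignores the substrate, it only reproduces the 50 % upper bound quoted above, not the ∼47 % measured for the gold film on its seed layer.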