Detecting subsurface delamination is a difficult yet vital task for maintaining the durability and serviceability of concrete structures throughout their life cycle. The aim of this work was to gain a better understanding of the effects of depth, heating time, and rebar on the detectability of delamination. Experimental tests were carried out on a concrete specimen in the laboratory using Long Pulsed Thermography (LPT). Six halogen lamps and a long-wavelength infrared camera with a 640 × 480 pixel focal plane array were used as the heat source and infrared detector, respectively. The study focused on embedded artificial delaminations measuring 10 cm × 10 cm × 1 cm, located at depths ranging from 1 to 8 cm. The signal-to-noise ratio (SNR) was used as the criterion for assessing the detectability of delamination. The results indicate that as the heating time increased, the SNR rose and the defects could be identified more clearly. Under the same heating regime, a shallow delamination exhibited a higher SNR than a deeper one. A moderate decrease in SNR was also observed for artificial defects located beneath the reinforcing steel. The absolute contrast was monitored to determine the observation time, and a nondimensional prefactor k was proposed empirically to predict the depth of delamination. The mean absolute percentage error (MAPE) was used to quantify the difference between the predicted and actual depths, and this evaluation confirmed the high reliability of the estimated value of the prefactor k.
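The abstract does not give the exact expressions used, but the quantities it names have common definitions in the pulsed-thermography literature. The Python sketch below is a minimal illustration under those assumed definitions: an SNR of a defect region relative to a sound reference region, a square-root depth law z = k·sqrt(α·t_obs) relating the empirical prefactor k to the observation time, and the MAPE between predicted and actual depths. All function names, parameter values, and numbers here are hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical helpers -- the paper's exact formulas are not stated in the
# abstract; these follow conventions commonly used in pulsed thermography.

def snr_db(defect_region: np.ndarray, sound_region: np.ndarray) -> float:
    """SNR (in dB) of a defect area versus a sound (defect-free) reference
    area, computed from pixel temperatures of a thermogram."""
    contrast = abs(defect_region.mean() - sound_region.mean())
    noise = sound_region.std()
    return 20.0 * np.log10(contrast / noise)

def predicted_depth(k: float, alpha: float, t_obs: float) -> float:
    """Depth estimate z = k * sqrt(alpha * t_obs), where alpha is the thermal
    diffusivity of concrete (m^2/s), t_obs the observation time (s) at which
    the absolute contrast peaks, and k the nondimensional prefactor."""
    return k * np.sqrt(alpha * t_obs)

def mape(actual, predicted) -> float:
    """Mean absolute percentage error between actual and predicted depths."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Illustrative numbers only (not measurements from the study):
alpha = 8.0e-7                 # assumed thermal diffusivity of concrete, m^2/s
k = 1.8                        # assumed value of the empirical prefactor
t_obs = [160.0, 600.0]         # assumed observation times for two defects, s
actual_depths = [0.02, 0.04]   # assumed true delamination depths, m

est_depths = [predicted_depth(k, alpha, t) for t in t_obs]
print("Predicted depths (m):", est_depths)
print(f"MAPE: {mape(actual_depths, est_depths):.1f} %")
```

A MAPE close to zero would indicate that the calibrated prefactor k reproduces the known depths well, which is the kind of check the abstract describes.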