The effect of the target's temperature on the characteristics of an ablation plasma was investigated as the target was cooled from room temperature down to ~20 K. The target was attached to the cold finger of a helium refrigerator, both operating in a vacuum environment for thermal insulation. The plasma was produced by focusing a high-power nanosecond laser onto the sample. A spectrometer and a monochromator were used to monitor both the line and the continuum emission, with and without time resolution. Starting at room temperature, the intensity of the emitted lines increased steadily as the target was cooled down to 150-175 K and remained level upon further cooling. The excitation temperature was obtained with the Boltzmann-plot technique, and its mean value was found to increase gradually as the target was cooled. The calculated electron density, in contrast, decreased with decreasing target temperature. The intensity of the continuum also varied with target temperature. We found that the continuum's distribution does not match a free-free or a recombination source, but it does fit a blackbody one. By fitting Planck's formula to the continuum spectra, the blackbody temperature was extracted and found to lie within the 9400-13,300 K interval. By solving the heat-flow equation for the period before plasma onset, we demonstrate that the laser beam heats the sample's surface at a faster rate the cooler the sample is, and that this behavior is the reason why the excitation temperature increases at cryogenic temperatures. It is a consequence of the sample's specific heat varying by two orders of magnitude as the target is cooled from room temperature to ~20 K.
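For reference, the Boltzmann-plot technique mentioned above is conventionally based on the relation below (a standard textbook form assuming local thermodynamic equilibrium; the symbols are not defined in the abstract and are introduced here only for illustration):

\[
\ln\!\left(\frac{I_{ki}\,\lambda_{ki}}{g_k A_{ki}}\right) = -\frac{E_k}{k_{\mathrm B} T_{\mathrm{exc}}} + C,
\]

where \(I_{ki}\) is the integrated line intensity, \(\lambda_{ki}\) the transition wavelength, \(g_k\) and \(E_k\) the degeneracy and energy of the upper level, \(A_{ki}\) the transition probability, and \(C\) a constant. Plotting the left-hand side against \(E_k\) for several lines of the same species gives the excitation temperature \(T_{\mathrm{exc}}\) from the slope \(-1/(k_{\mathrm B} T_{\mathrm{exc}})\).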
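The blackbody fit referred to above presumably uses the spectral form of Planck's law (shown here only as a sketch of the fitting function; the exact normalization and scale factor used are not stated in the abstract):

\[
B_\lambda(\lambda, T_{\mathrm{bb}}) = \frac{2 h c^{2}}{\lambda^{5}}\,\frac{1}{\exp\!\left(\dfrac{h c}{\lambda k_{\mathrm B} T_{\mathrm{bb}}}\right) - 1},
\]

so that fitting \(B_\lambda\), up to an overall scale, to the measured continuum yields the blackbody temperature \(T_{\mathrm{bb}}\) reported in the 9400-13,300 K interval.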
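The heat-flow argument can be summarized with the one-dimensional heat equation with temperature-dependent material properties (a generic form; the geometry, source term, and boundary conditions actually solved are not given in the abstract):

\[
\rho\, c_p(T)\,\frac{\partial T}{\partial t} = \frac{\partial}{\partial z}\!\left(\kappa(T)\,\frac{\partial T}{\partial z}\right) + S(z,t),
\]

where \(S(z,t)\) represents the absorbed laser power density. Because the specific heat \(c_p(T)\) is much smaller at cryogenic temperatures (the abstract notes a variation of two orders of magnitude between room temperature and ~20 K), the same absorbed power produces a larger surface heating rate \(\partial T/\partial t\) in a cold target, which is the mechanism invoked to explain the higher excitation temperature.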