Distortion allocation varying with wavelength in lossy compression of hyperspectral imagery is investigated, with the aim of minimizing the spectral distortion between original and decompressed data. The absolute angular error, or spectral angle mapper (SAM), is used to quantify spectral distortion, while radiometric distortion is measured by the maximum absolute deviation (MAD) for near-lossless methods, such as differential pulse code modulation (DPCM), and by the mean-squared error (MSE) for lossy methods, such as spectral decorrelation followed by JPEG 2000. Two strategies of interband distortion allocation are compared for a given target average bit rate: distortion may either be held constant with wavelength, or be allocated proportionally to the noise level of each band, according to the virtually lossless protocol. Comparisons with the uncompressed originals show that the average SAM of radiance spectra is minimized by constant distortion allocation to the radiance data. However, at the same compression ratio, variable distortion allocation according to the virtually lossless protocol yields a significantly lower SAM for reflectance spectra obtained from the compressed radiance data.
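As a reference for the two quantities compared above (with notation introduced here for illustration, not taken from the original), the SAM between an original pixel spectrum $\mathbf{v}$ and its decompressed counterpart $\hat{\mathbf{v}}$ is the angle
\begin{equation}
  \mathrm{SAM}(\mathbf{v},\hat{\mathbf{v}}) \;=\; \arccos\!\left( \frac{\langle \mathbf{v}, \hat{\mathbf{v}} \rangle}{\|\mathbf{v}\|_{2}\,\|\hat{\mathbf{v}}\|_{2}} \right),
\end{equation}
averaged over all pixels of the scene, while the noise-proportional allocation of the virtually lossless protocol amounts to choosing the per-band distortion bound roughly as
\begin{equation}
  \mathrm{MAD}_{k} \;\propto\; \sigma_{k},
\end{equation}
where $\sigma_{k}$ denotes an estimate of the noise level in band $k$; the constant-distortion strategy instead applies the same bound to every band.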