Abstract: Temperature fluctuations can have a significant impact on the repeatability of spectral measurements and, as a consequence, can adversely affect the resulting calibration model. More specifically, when test samples measured at temperatures unseen in the training data set are presented to the model, predictive performance can degrade. Current methods for addressing temperature variations in a calibration model fall into two classes: calibration model based approaches and spectra standardization methodologies. This paper presents a comparative study of a number of strategies reported in the literature, including partial least squares (PLS), continuous piecewise direct standardization (CPDS) and loading space standardization (LSS), in terms of the practical applicability of the algorithms, their implementation complexity, and their predictive performance. It was observed from the study that the global modelling approach, where latent variables are first extracted from the spectra using PLS and then augmented with temperature as an additional independent variable, achieved the best predictive performance. In addition, the two spectra standardization methods, CPDS and LSS, did not consistently outperform the conventional global modelling approach, despite the additional effort required to standardize the spectra across different temperatures. Considering the algorithmic complexity and resulting calibration accuracy, it is concluded that the global modelling (with temperature) approach should be considered first for the development of a calibration model where temperature variations are known to affect the underlying spectral data, prior to investigating the more powerful spectra standardization approaches.
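The following is a minimal sketch of the "global modelling with temperature" idea outlined above: latent variables (scores) extracted from the spectra by PLS are augmented with the measurement temperature before a final regression step. The synthetic data, variable names and component count are illustrative assumptions, not the paper's actual implementation or data set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 80, 200

# Hypothetical spectra, measurement temperatures and target property.
X = rng.normal(size=(n_samples, n_wavelengths))          # spectra
temperature = rng.uniform(20.0, 50.0, size=n_samples)    # deg C
y = X[:, :5].sum(axis=1) + 0.02 * temperature + rng.normal(scale=0.1, size=n_samples)

# Step 1: extract latent variables (scores) from the spectra with PLS.
pls = PLSRegression(n_components=5)
pls.fit(X, y)
scores = pls.transform(X)                                 # shape (n_samples, 5)

# Step 2: augment the scores with temperature and fit the final regression.
scores_aug = np.hstack([scores, temperature[:, None]])
model = LinearRegression().fit(scores_aug, y)

# Prediction for a new spectrum measured at a temperature unseen in training.
x_new = rng.normal(size=(1, n_wavelengths))
t_new = np.array([[55.0]])
y_pred = model.predict(np.hstack([pls.transform(x_new), t_new]))
print(y_pred)
```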