Vitamins are essential compounds in living systems; they differ in their chemical structure and physiological action. Analytical methods have been developed for vitamin identification and/or quantification using a wide variety of strategies, and the food and pharmaceutical industries have taken advantage of these reliable methods for the estimation of vitamins in matrices ranging from simple to complex. Mixtures of vitamins, such as the vitamin B complex 1,2 and multivitamin tablets and other pharmaceutical formulations, are used in the treatment of several diseases. Therefore, the simultaneous determination of mixtures of vitamins is very useful in the pharmaceutical industry. Several methodologies have been developed for the determination of vitamins in different samples, such as liquid chromatography, 3-5 high performance liquid chromatography, 6 micellar electrokinetic capillary chromatography 7 and spectrophotometry and derivative methods. 8,9 The greatest difficulties with simultaneous spectrophotometric determination arise when the analytes to be determined give partly or fully overlapped spectra, as is the case with the ingredients of most pharmaceutical preparations. Nowadays, the combination of chemometric methods with computer-controlled instruments that record molecular absorption spectra provides powerful tools for multicomponent analysis. 10

Partial least squares regression (PLSR) is the most commonly used multivariate calibration method. It is based on linear models and gives satisfactory results in most cases where a linear relationship exists between the spectra and the property to be determined (concentration, for example). 10 However, PLSR is not always the best option, especially in situations where a nonlinear model is clearly required. The theory and applications of PLS have been discussed by several authors. 10-18

The support vector machine (SVM) is a machine learning algorithm developed by Cortes and coworkers. 19 Owing to its remarkable generalization performance, SVM has attracted attention and gained extensive application in pattern recognition and regression problems. 20 SVM maps the input data into a high-dimensional feature space in which they may become linearly separable by a hyperplane. One reason that SVM often performs better than other methods is that it is designed to minimize the structural risk, a principle that has been shown to be superior to the empirical risk minimization employed by conventional neural networks. In particular, Suykens and coworkers 21,22 proposed a modified version of SVM, least-squares SVM (LS-SVM), in which training reduces to solving a set of linear equations instead of a quadratic programming problem, thereby extending the applicability of SVM. A number of excellent introductions to SVM are available, 23-34 and the theory of LS-SVM has been described clearly by Suykens et al. 21,22 Applications of LS-SVM in quantification and classification have ...
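As a minimal illustration of the two calibration approaches discussed above, the Python sketch below fits a linear PLSR model and an RBF-kernel LS-SVM to simulated spectra of a two-component mixture with overlapping bands, and shows how LS-SVM training reduces to solving a single linear system. The simulated spectra, the kernel width (sigma), the regularization parameter (gamma) and the number of latent variables are illustrative assumptions for this sketch, not data or settings from the present work.

```python
# Sketch: PLSR vs. LS-SVM regression for multivariate spectral calibration.
# All numerical settings below are illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# --- Simulate spectra of two-component mixtures with overlapping bands ---
wavelengths = np.linspace(200, 400, 201)
band = lambda c, w: np.exp(-0.5 * ((wavelengths - c) / w) ** 2)
pure = np.vstack([band(280, 25), band(300, 25)])       # strongly overlapped pure spectra
C_train = rng.uniform(0.1, 1.0, size=(30, 2))          # calibration concentrations
C_test = rng.uniform(0.1, 1.0, size=(10, 2))
X_train = C_train @ pure + 0.005 * rng.standard_normal((30, 201))
X_test = C_test @ pure + 0.005 * rng.standard_normal((10, 201))
y_train, y_test = C_train[:, 0], C_test[:, 0]           # calibrate one analyte

# --- PLSR: linear latent-variable calibration ---
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
y_pls = pls.predict(X_test).ravel()

# --- LS-SVM regression with an RBF kernel ---
# The quadratic programme of the standard SVM is replaced by one linear system:
#   [ 0      1^T        ] [ b ]   [ 0 ]
#   [ 1   K + I/gamma   ] [ a ] = [ y ]
def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=100.0, sigma=5.0):
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)          # one linear solve, no quadratic programming
    return sol[0], sol[1:]                 # bias b, support values alpha

def lssvm_predict(X_new, X, b, alpha, sigma=5.0):
    return rbf_kernel(X_new, X, sigma) @ alpha + b

b, alpha = lssvm_fit(X_train, y_train)
y_lssvm = lssvm_predict(X_test, X_train, b, alpha)

rmsep = lambda y, yhat: np.sqrt(np.mean((y - yhat) ** 2))
print(f"RMSEP  PLSR:   {rmsep(y_test, y_pls):.4f}")
print(f"RMSEP  LS-SVM: {rmsep(y_test, y_lssvm):.4f}")
```

In practice the number of PLSR latent variables and the LS-SVM hyperparameters (gamma, sigma) would be optimized by cross-validation on the calibration set rather than fixed as in this sketch.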