Keywords: Kernel PCA; Spectral clustering; Support vector machine.

This work outlines a unified formulation to represent spectral approaches for both dimensionality reduction and clustering. The proposed formulation starts with a generic latent variable model expressed in terms of the projected input data matrix; in particular, this projection maps the data onto an unknown high-dimensional space. From this model, a generalized optimization problem is stated using quadratic formulations and a least-squares support vector machine, and its solution is addressed through a primal-dual scheme. Once the latent variables and parameters are determined, the resulting model outputs a versatile projected matrix able to represent the data in a low-dimensional space as well as to provide information about clusters. In particular, the proposed formulation yields solutions for kernel spectral clustering and weighted-kernel principal component analysis.

On the one hand, kernel methods are of interest since they allow prior knowledge to be incorporated into the clustering procedure [Filippone et al., 2008]. In unsupervised clustering methods (that is, when clusters are formed naturally by following a given partition criterion), a set of initial parameters should be properly selected to avoid local optima distant from the desired global optimum. Indeed, in spectral clustering (SC), such initial parameters are traditionally the number of clusters and the input kernel matrix itself. On the other hand, the aim of dimensionality reduction (DR) is to extract lower-dimensional, relevant information from high-dimensional data, making it a key stage in the design of pattern recognition systems. Indeed, when adequate DR stages are used, system performance can be enhanced and data visualization can become more intelligible. Recent DR methods focus on preserving the data topology [Peluffo-Ordóñez et al., 2014b].
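Both views rest on the eigendecomposition of a kernel (similarity) matrix. As a minimal sketch of this shared core (not the formulation developed in this work), the following Python snippet builds an RBF kernel matrix on toy data and extracts from it both a KPCA-style low-dimensional embedding (centered kernel) and spectral-clustering labels (degree-normalized kernel); all function names, the bandwidth `sigma`, and the sign-split cluster assignment are illustrative choices, not the paper's method.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
# Two well-separated toy clusters in 2-D.
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
               rng.normal(3.0, 0.3, (20, 2))])
K = rbf_kernel(X, sigma=1.0)
n = K.shape[0]

# --- KPCA view: center the kernel matrix, take leading eigenvectors. ---
H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
w, V = np.linalg.eigh(H @ K @ H)              # eigenvalues in ascending order
embedding = V[:, ::-1][:, :2] * np.sqrt(np.maximum(w[::-1][:2], 0.0))

# --- SC view: degree-normalize the *same* kernel matrix. ---
d = K.sum(axis=1)
Ds = np.diag(d ** -0.5)
w2, V2 = np.linalg.eigh(Ds @ K @ Ds)          # normalized affinity spectrum
u = V2[:, -2]                                 # second leading eigenvector
labels = (u > 0).astype(int)                  # sign split -> two clusters
```

For two clearly separated blobs, the sign pattern of the second leading eigenvector recovers the two groups, while the first two KPCA coordinates give a planar embedding of the same data, illustrating how one kernel matrix serves both DR and clustering.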
Mostly, such a topology is captured by graph-based approaches in which the data are represented by a similarity matrix, which can in turn be expressed in terms of a kernel matrix [Ham et al., 2004]; this means that a wide range of methods can be set within a kernel principal component analysis (KPCA) framework [Peluffo-Ordonez et al., 2014]. When choosing a method for either SC or DR, aspects such as the nature of the data, the complexity, the aim to be reached, and the problem to be solved should be taken into consideration. In this regard, it should be noted that there exists a variety of spectral methods, making the selection of a method a nontrivial task. In fact, some problems may require combining methods so that the properties of different methods are exploited simultaneously [Peluffo-Ordóñez et al., 2015]. Some works have studied the benefit of simultaneously taking advantage of DR and SC techniques. For instance, in [Peluffo-Ordóñez et al., 2014a], a DR approach (linear feature extraction) is used to enhance clustering performance by performing the grouping process over the projected data ...