Abstract-In this paper, a theoretical link between mixture subclass discriminant analysis (MSDA) and a restricted Gaussian model is first presented, and then two further discriminant analysis (DA) methods, fractional step MSDA (FSMSDA) and kernel MSDA (KMSDA), are proposed. Linking MSDA to an appropriate Gaussian model allows the derivation of a new DA method under the expectation-maximization (EM) framework (EM-MSDA), which simultaneously derives the discriminant subspace and the maximum likelihood parameter estimates. The other two proposed methods generalize MSDA in order to solve problems inherited from conventional DA. FSMSDA solves the subclass separation problem, that is, the situation in which the dimensionality of the discriminant subspace is strictly smaller than the rank of the between-subclass scatter matrix. This is achieved by an appropriate weighting scheme and an iterative algorithm that preserves useful discriminant directions. KMSDA, on the other hand, uses the kernel trick to separate data with a nonlinearly separable subclass structure. Extensive experiments show that the proposed methods outperform conventional MSDA and other linear discriminant analysis (LDA) variants.
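To illustrate the subclass-based scatter matrices that the abstract refers to, the sketch below computes discriminant directions from a between-subclass scatter matrix (pairwise outer products of subclass means belonging to different classes) and a within-subclass scatter matrix, then solves the resulting generalized eigenproblem. This is a simplified, hedged illustration of the general subclass-DA setting, not the paper's EM-MSDA, FSMSDA, or KMSDA algorithms; the function name, the regularization term `reg`, and the subclass-labeling convention are assumptions for the example.

```python
import numpy as np

def subclass_da_directions(X, y, sub, n_dims, reg=1e-6):
    """Illustrative subclass discriminant analysis sketch (not the paper's
    exact method). X: (N, D) data, y: class labels, sub: subclass labels
    within each class, n_dims: dimensionality of the discriminant subspace."""
    N, D = X.shape
    means, priors, cls = [], [], []
    Sw = np.zeros((D, D))  # within-subclass scatter
    for c in np.unique(y):
        for s in np.unique(sub[y == c]):
            mask = (y == c) & (sub == s)
            mu = X[mask].mean(axis=0)
            means.append(mu)
            priors.append(mask.sum() / N)   # subclass prior probability
            cls.append(c)
            Sw += (X[mask] - mu).T @ (X[mask] - mu) / N
    # Between-subclass scatter: prior-weighted outer products of the
    # differences between subclass means of *different* classes.
    Sb = np.zeros((D, D))
    for i in range(len(means)):
        for j in range(len(means)):
            if cls[i] != cls[j]:
                d = (means[i] - means[j])[:, None]
                Sb += priors[i] * priors[j] * (d @ d.T)
    # Generalized eigenproblem Sb v = lambda Sw v, regularized for stability.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(D), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_dims]]
```

Projecting the data onto the returned directions (`X @ W`) yields a low-dimensional representation in which subclasses of different classes are pushed apart, which is the common starting point that the MSDA variants described above refine.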