A direct implementation of supervised topic modeling with a Naive Bayes classifier rests on the formulation of robust generative topic models that use prior distributions, such as the Dirichlet in LDA (latent Dirichlet allocation), with classification ultimately following Bayes' theorem. In large-scale applications, however, the SVM (support vector machine) tends to outperform Naive Bayes. In this paper, we propose a classification framework that combines the flexibility of generative topic models with the strong performance of the SVM. To this end, we present a generative-discriminative collapsed variational Bayes technique for text document and image classification. Our collapsed variational Bayes topic model simultaneously implements two different asymmetric conjugate priors within the same generative process: it draws the document and corpus parameters using both GD (generalized Dirichlet) and BL (Beta-Liouville) distributions, each of which generalizes the Dirichlet prior of LDA. The proposed hybrid model yields substantially improved inference, which in turn contributes to more accurate estimates, coherent generative topic features, a robust formulation of probabilistic kernels, and a markedly higher classification rate. Experiments on image and text document classification demonstrate the merits of the proposed approach.
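To make the generative-discriminative pipeline concrete, the sketch below illustrates two of the ingredients named above under stated assumptions: topic proportions drawn from GD and BL priors (via their stick-breaking and scaled-Dirichlet constructions, respectively) and a probabilistic kernel over those proportions fed to an SVM. This is not the paper's implementation: the collapsed variational Bayes inference is omitted (proportions are sampled directly from the priors purely for illustration), the Bhattacharyya probability-product kernel is one assumed choice of probabilistic kernel, and the parameter values and the use of scikit-learn's `SVC` are hypothetical.

```python
# Minimal sketch (not the paper's implementation): sample topic
# proportions from GD and BL priors, then classify documents with an
# SVM over a probability-product (Bhattacharyya) kernel on those
# proportions. Collapsed variational Bayes inference is omitted.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def sample_gd(a, b):
    """Generalized Dirichlet via stick-breaking: v_k ~ Beta(a_k, b_k),
    theta_k = v_k * prod_{j<k}(1 - v_j), theta_K = prod_j (1 - v_j)."""
    v = rng.beta(a, b)                                 # K-1 Beta draws
    stick = np.concatenate(([1.0], np.cumprod(1.0 - v)))
    return np.append(v, 1.0) * stick                   # sums to 1

def sample_bl(alphas, alpha, beta):
    """Beta-Liouville: u ~ Beta(alpha, beta), d ~ Dirichlet(alphas),
    theta = (u * d, 1 - u)."""
    u = rng.beta(alpha, beta)
    d = rng.dirichlet(alphas)
    return np.append(u * d, 1.0 - u)

def bhattacharyya_gram(P, Q):
    """Probability-product kernel K(p, q) = sum_k sqrt(p_k * q_k),
    a positive-definite kernel on the simplex."""
    return np.sqrt(P) @ np.sqrt(Q).T

# Toy data: two classes whose topic proportions come from differently
# parameterized GD priors (hypothetical parameter values).
K = 5
X0 = np.stack([sample_gd(np.full(K - 1, 2.0), np.full(K - 1, 1.0))
               for _ in range(50)])
X1 = np.stack([sample_gd(np.full(K - 1, 1.0), np.full(K - 1, 3.0))
               for _ in range(50)])
X = np.vstack([X0, X1])
y = np.repeat([0, 1], 50)

# Discriminative stage: SVM on the precomputed Gram matrix.
svm = SVC(kernel="precomputed").fit(bhattacharyya_gram(X, X), y)

# New documents (here drawn from a BL prior) are classified through
# their kernel values against the training set.
X_new = np.stack([sample_bl(np.ones(K - 1), 2.0, 1.0) for _ in range(5)])
print(svm.predict(bhattacharyya_gram(X_new, X)))
```

The precomputed-kernel route keeps the generative and discriminative stages decoupled: any positive-definite kernel over topic proportions can be swapped in without touching the topic model itself.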