This article analyzes a kernel-based transfer learning method under a k-class Gaussian mixture model for the input data. Building on recent advances in random matrix theory, we provide new insights into transfer learning in the challenging case where the first-order statistics of all data classes coincide. We prove the asymptotic normality of the LS-SVM decision function for any smooth kernel function and, based on this result, propose an optimization scheme that minimizes the classification error rate. Our theoretical findings are corroborated by simulations and then successfully applied to transfer learning for PolSAR image classification.