In recent years, classification models based on Fisher linear discriminant analysis (FLDA) have been among the most successful approaches, showing effective performance across a range of classification tasks. However, when the training data (source domain) follow a different distribution than the test data (target domain), FLDA-based models may not work well and their performance can degrade dramatically. To address this issue, we propose a domain adaptation via Bregman divergence minimization (DAB) approach, in which discriminative features of the source and target domains are learned simultaneously through a domain-invariant representation. DAB is designed around the constraints of FLDA, with the aim of jointly adapting the marginal and conditional distribution shifts through Bregman divergence minimization. The resulting representation thus retains the discriminative behavior of FLDA while separating the various classes well. Moreover, the proposed approach can easily be kernelized to handle nonlinear tasks. Experiments on various benchmark datasets demonstrate that DAB copes effectively with cross-domain divergence and outperforms other state-of-the-art domain adaptation approaches on cross-distribution domains.
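
As an illustrative sketch only (the symbols below are assumptions for exposition, not notation taken from this abstract): an FLDA-based adaptation objective of the kind described above typically seeks a projection $W$ that preserves Fisher discriminability (small within-class scatter, large between-class scatter) while penalizing the divergence between the projected source and target distributions, e.g.
$$
\min_{W}\; \operatorname{tr}\!\left(W^{\top} S_w W\right) \;-\; \lambda\, \operatorname{tr}\!\left(W^{\top} S_b W\right) \;+\; \mu\, D_{\mathrm{Breg}}\!\left(p_s\!\left(W^{\top}x\right),\, p_t\!\left(W^{\top}x\right)\right),
$$
where $S_w$ and $S_b$ denote within- and between-class scatter matrices, $D_{\mathrm{Breg}}$ is a Bregman divergence measuring the shift between the (marginal and class-conditional) source and target distributions after projection, and $\lambda$, $\mu$ are trade-off parameters. The exact formulation used by DAB is given in the paper; this is only a generic template for such objectives.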