In this paper, we address two limitations of Ensemble Classifier Chains (ECC), random label ordering and limited interpretability, by introducing a novel ECC method, ECC-MOO&BN, which integrates Bayesian Networks (BN) and Multi-Objective Optimization (MOO) to overcome both limitations simultaneously. ECC-MOO&BN focuses on extracting diverse and interpretable label orderings for the ECC classifier. We first employ mutual information to analyze label relationships and establish the initial BN structures. An enhanced NSGA-II algorithm is then applied to evolve a set of Directed Acyclic Graphs (DAGs) that balance the likelihood and the complexity of the BN structure. The rationale for using MOO is that optimizing likelihood and complexity simultaneously not only diversifies the generated DAGs but also helps avoid overfitting when producing label orderings. Topologically sorting the DAGs yields a set of label orderings, which are then integrated into the ECC framework to address multi-label classification (MLC) problems. Experimental results show that, when benchmarked against eleven state-of-the-art MLC algorithms, the proposed method achieves the highest average ranking across seven evaluation criteria on nine of thirteen MLC datasets. The Friedman and Nemenyi tests further indicate that the proposed method significantly outperforms the competing algorithms.
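To make the final stage of this pipeline concrete, the sketch below illustrates, under simplifying assumptions, how label orderings obtained by topologically sorting candidate DAGs could be fed into an ensemble of classifier chains. It is not the authors' implementation: the toy DAGs stand in for the NSGA-II-optimized BN structures, the mutual-information initialization and the structure search itself are omitted, and the base learner, the probability-averaging aggregation, and the 0.5 threshold are illustrative choices.

```python
# Minimal sketch (assumptions noted above, not the paper's implementation):
# turn candidate DAGs over the label indices into label orderings via
# topological sort, fit one classifier chain per ordering, and aggregate
# the ensemble by averaging probability scores and thresholding at 0.5.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain


def dags_to_orderings(dags):
    """Topologically sort each DAG over the label indices to get a label ordering."""
    return [list(nx.topological_sort(g)) for g in dags]


def fit_ecc(X, Y, orderings):
    """Fit one classifier chain per label ordering (the ECC ensemble)."""
    chains = []
    for order in orderings:
        chain = ClassifierChain(LogisticRegression(max_iter=1000), order=order)
        chains.append(chain.fit(X, Y))
    return chains


def predict_ecc(chains, X, threshold=0.5):
    """Average the chains' per-label probabilities and threshold them."""
    scores = np.mean([chain.predict_proba(X) for chain in chains], axis=0)
    return (scores >= threshold).astype(int)


if __name__ == "__main__":
    # Synthetic data: 200 samples, 10 features, 4 binary labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    Y = (rng.random((200, 4)) < 0.3).astype(int)

    # Two toy DAGs over the 4 label indices, standing in for optimized BN structures.
    g1 = nx.DiGraph([(0, 1), (1, 2), (2, 3)])
    g2 = nx.DiGraph([(3, 1), (1, 0), (0, 2)])

    orderings = dags_to_orderings([g1, g2])
    chains = fit_ecc(X, Y, orderings)
    print(predict_ecc(chains, X[:5]))
```

In this sketch, each chain conditions every label's classifier on the labels that precede it in its ordering, so diverse DAG-derived orderings translate directly into diverse chain structures for the ensemble.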