Partial multi-label learning (PML) tackles the problem where each training instance is associated with multiple candidate labels, of which only a subset are valid. In this paper, we propose a simple but effective batch-wise PML model, PML-SE, which addresses the PML problem with a self-ensemble (SE) approach, namely ensembles of models and of predictions. Specifically, PML-SE introduces a teacher model that refines a more reliable soft label matrix for each training batch by iteratively ensembling the current prediction network with the former one in an online manner. Moreover, it adopts a MixUp data augmentation scheme to enhance the robustness of the prediction network against redundant irrelevant labels. In addition, we form self-ensemble label predictions through a consistency cost to further boost the performance of the prediction network. Extensive experiments on synthesized and real-world PML datasets demonstrate that the proposed approach achieves state-of-the-art performance for partial multi-label learning.
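The three mechanisms described above can be illustrated with a minimal pure-Python sketch. This is not the paper's implementation: the function names, the exponential-moving-average form of the teacher update, and all hyper-parameters (e.g. `alpha`, `lam`) are illustrative assumptions.

```python
def ema_update(teacher, student, alpha=0.99):
    """Online model ensembling (assumed EMA form):
    teacher <- alpha * teacher + (1 - alpha) * student, per parameter."""
    return [alpha * t + (1 - alpha) * s for t, s in zip(teacher, student)]

def mixup(x1, x2, y1, y2, lam):
    """MixUp augmentation: convex combination of two inputs
    and their soft label vectors with mixing coefficient lam."""
    xm = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    ym = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return xm, ym

def consistency_cost(student_pred, teacher_pred):
    """Consistency cost: mean squared error between the student's
    and the teacher's label predictions for one instance."""
    n = len(student_pred)
    return sum((s - t) ** 2 for s, t in zip(student_pred, teacher_pred)) / n
```

In a training loop, the teacher's soft labels for a batch would replace the noisy candidate-label targets, MixUp would be applied to batch pairs, and the consistency cost would be added to the classification loss; the details of how these terms are weighted are specific to the paper.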