A Bayesian network (BN) with latent variables (LVs) provides a concise and straightforward framework for representing and reasoning about uncertain knowledge involving unobservable variables or missing data. To learn a BN with LVs that is consistent with realistic situations, we propose the information-theoretic concept of existence weight and incorporate it into a clique-based learning method. In line with the challenges of learning a BN with LVs, we focus on determining the number of LVs and the relationships between the LVs and the observed variables. First, we define the existence weight and propose algorithms for finding ε-cliques in a BN without LVs learned from data. Then, we introduce an LV into each ε-clique and adjust the resulting BN structure. Further, we adjust the value of the parameter ε to determine the number of LVs. Experimental results show the feasibility of our method.
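The clique-then-latent pipeline described above could be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the pairwise `weights` scores stand in for the information-theoretic existence weights, the greedy grouping is a placeholder for the proposed ε-clique search, and the function names `find_eps_cliques` and `introduce_latents` are invented for this example.

```python
def find_eps_cliques(weights, eps):
    """Greedily group observed variables into cliques whose pairwise
    scores (placeholders for existence weights) all exceed eps."""
    nodes = sorted({v for edge in weights for v in edge})
    cliques, assigned = [], set()
    for v in nodes:
        if v in assigned:
            continue
        clique = [v]
        for u in nodes:
            if u in assigned or u == v:
                continue
            # u joins only if it is strongly linked to every current member
            if all(weights.get(frozenset((u, w)), 0.0) > eps for w in clique):
                clique.append(u)
        if len(clique) >= 2:
            cliques.append(clique)
            assigned.update(clique)
    return cliques

def introduce_latents(cliques):
    """Attach one latent variable per eps-clique: each latent becomes a
    parent of every observed variable in its clique."""
    return {f"L{i}": clique for i, clique in enumerate(cliques)}

# Toy pairwise scores over four observed variables
w = {frozenset(("A", "B")): 0.8, frozenset(("B", "C")): 0.7,
     frozenset(("A", "C")): 0.6, frozenset(("C", "D")): 0.1}
cliques = find_eps_cliques(w, eps=0.5)   # [["A", "B", "C"]]
latents = introduce_latents(cliques)     # {"L0": ["A", "B", "C"]}
```

Raising ε yields fewer, tighter cliques and hence fewer LVs; lowering it yields more, which mirrors how the paper tunes ε to determine the number of LVs.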