The problem of handling class imbalance by modifying the decision tree algorithm has received widespread attention in recent years. This paper introduces a new splitting measure, called class overlapping-balancing entropy (OBE), that pays equal attention to all classes. At each splitting step, the proportion of each class is balanced via assigned weight values, which not only equalize the classes but also account for the overlapping region between them. The resulting weighted class proportions are then used as the components of Shannon's entropy for splitting the current dataset. Experimental results show that OBE significantly outperforms conventional splitting measures such as the Gini index, gain ratio, and DCSM, which are used in well-known decision tree algorithms. It also exhibits superior performance compared to AE and ME, which are designed specifically for handling class imbalance.
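The core idea, class-balanced weighted proportions fed into Shannon's entropy, can be sketched as follows. This is a minimal illustration, not the paper's exact method: it uses plain inverse-frequency weights so each class carries equal total weight in the parent node, whereas the actual OBE weights additionally depend on the overlapping region between classes, which is not detailed in this abstract. All function names here are hypothetical.

```python
from collections import Counter
import math

def class_weights(parent_labels):
    # Inverse-frequency weights: every class contributes equal total weight
    # (1/k) in the parent node. Simplified stand-in -- the real OBE weights
    # also incorporate the class overlapping region.
    counts = Counter(parent_labels)
    k = len(counts)
    return {c: 1.0 / (k * n) for c, n in counts.items()}

def weighted_entropy(labels, weights):
    # Shannon entropy computed over weighted class proportions.
    totals = Counter()
    for y in labels:
        totals[y] += weights.get(y, 0.0)
    total = sum(totals.values())
    if total == 0:
        return 0.0
    return -sum((w / total) * math.log2(w / total)
                for w in totals.values() if w > 0)

def split_score(parent_labels, left_labels, right_labels):
    # Weighted-entropy reduction achieved by a candidate binary split,
    # with weights fixed from the parent node.
    w = class_weights(parent_labels)
    n = len(parent_labels)
    child = (len(left_labels) / n) * weighted_entropy(left_labels, w) \
          + (len(right_labels) / n) * weighted_entropy(right_labels, w)
    return weighted_entropy(parent_labels, w) - child
```

With a 9:1 imbalanced parent node, the weighting makes the parent's entropy equal to 1 bit, exactly as if the classes were balanced, so a split that isolates the minority class scores the full reduction rather than being diluted by the majority class.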