In the last decade, data generated by diverse digital devices has posed a remarkable challenge for data representation and analysis. High-dimensional datasets and the rapid growth of data volume have raised numerous challenges in fields such as data mining and data science. Conventional machine learning classifiers have limited ability to handle the problems of high dimensionality, including memory limitations, computational cost, and low classification accuracy. Consequently, there is a need to reduce the dimensionality of datasets by selecting the most significant features, so that the data are represented efficiently with minimum volume. This study proposes an improved binary version of the equilibrium optimizer algorithm (IBEO) to address the feature selection problem. Two main enhancements are added to the original equilibrium optimizer (EO) to strengthen its performance. Opposition-based learning is the first enhancement, applied at the initialization stage of EO to increase the diversity of the population in the search space. A local search algorithm is the second enhancement, added to improve the exploitation ability of EO. Because wrapper approaches can offer high-quality solutions, we use the k-nearest neighbour and support vector machine classifiers, two of the most popular wrapper-based evaluators. Moreover, to reduce the risk of over-fitting, k-fold cross-validation is applied to split each dataset into training and testing data. Comparative tests against well-known algorithms, such as grey wolf optimization, grasshopper optimization, particle swarm optimization, whale optimization, the dragonfly algorithm, and the improved salp swarm algorithm, are conducted. The proposed algorithm is applied to the datasets most commonly used in the field to validate its performance. Statistical analyses demonstrate the effectiveness of IBEO.
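The opposition-based initialization mentioned above can be illustrated with a minimal sketch. This is not the paper's exact scheme; it assumes the common binary formulation in which the opposite of a 0/1 feature mask is its bitwise complement, and the fitter member of each (solution, opposite) pair seeds the population. The function name `obl_init` and the minimization convention (lower fitness is better) are illustrative assumptions.

```python
import numpy as np

def obl_init(n_agents, n_features, fitness, rng=None):
    """Opposition-based initialization for a binary population (sketch).

    Each random 0/1 vector x is paired with its opposite 1 - x; the
    fitter of the two (lower fitness value) enters the initial population,
    which tends to spread the starting points over the search space.
    """
    rng = np.random.default_rng(rng)
    pop = rng.integers(0, 2, size=(n_agents, n_features))
    opp = 1 - pop  # opposite of a binary mask: flip every bit
    fit_pop = np.apply_along_axis(fitness, 1, pop)
    fit_opp = np.apply_along_axis(fitness, 1, opp)
    keep = fit_pop <= fit_opp  # True where the original beats its opposite
    return np.where(keep[:, None], pop, opp)
```

With a toy fitness such as `lambda x: float(x.sum())` (fewer selected features is better), every row of the returned population scores at least as well as its bitwise opposite.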
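A wrapper evaluation of a candidate feature subset with a k-NN classifier and k-fold cross-validation can be sketched as follows. The weighted objective combining classification error and subset size is a formulation commonly used in wrapper-based feature selection, assumed here rather than taken from the paper; the names `wrapper_fitness` and the weight `alpha` are illustrative.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def wrapper_fitness(mask, X, y, alpha=0.99, k=5, folds=5):
    """Wrapper fitness of a binary feature mask (sketch, to minimize).

    Scores a subset as a weighted sum of the k-NN cross-validated error
    and the fraction of features selected, so smaller, more accurate
    subsets get lower (better) fitness.
    """
    if mask.sum() == 0:
        return 1.0  # an empty subset is invalid; assign the worst score
    cols = mask.astype(bool)
    acc = cross_val_score(
        KNeighborsClassifier(n_neighbors=k), X[:, cols], y, cv=folds
    ).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.mean()
```

Cross-validating inside the fitness function is what addresses the over-fitting concern noted above: the subset is scored on held-out folds rather than on the data used to fit the classifier.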