Minimal cost feature selection plays a crucial role in cost-sensitive learning. It aims to find a feature subset that minimizes the average total cost by balancing test costs against misclassification costs. Recently, a backtracking algorithm was developed to tackle this problem. Unfortunately, its efficiency on large datasets is often unacceptable, and its run time increases significantly as misclassification costs rise. In this paper, we develop an exponent weighted algorithm for minimal cost feature selection, in which an exponent weighted feature significance function is constructed to improve efficiency. The function is based on information entropy, test cost, and a user-specified non-positive exponent. The effectiveness of our algorithm is demonstrated on six UCI datasets with two representative test cost distributions. Compared with the existing backtracking algorithm, the proposed algorithm is significantly more efficient and is not affected by the misclassification cost setting.
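The following is a minimal sketch of the kind of exponent weighted significance score the abstract describes, assuming a simple form in which the information gain of a feature is scaled by its test cost raised to a user-specified non-positive exponent; the exact functional form, the entropy estimator, and the function names used here are illustrative assumptions, not the paper's definition.

```python
# Sketch of an exponent weighted feature significance score (illustrative only).
# Assumed form: significance = information_gain * test_cost ** lam, with lam <= 0.
# lam = 0 ignores test cost; more negative values penalize expensive features more.

from collections import Counter
from math import log2


def entropy(labels):
    """Shannon entropy of a label sequence."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())


def information_gain(feature_values, labels):
    """Entropy reduction from partitioning the labels on a discrete feature."""
    total = len(labels)
    grouped = {}
    for v, y in zip(feature_values, labels):
        grouped.setdefault(v, []).append(y)
    conditional = sum(len(g) / total * entropy(g) for g in grouped.values())
    return entropy(labels) - conditional


def significance(feature_values, labels, test_cost, lam=-1.0):
    """Exponent weighted significance: gain scaled by test_cost ** lam, lam <= 0."""
    assert lam <= 0, "the exponent must be non-positive"
    return information_gain(feature_values, labels) * (test_cost ** lam)


# Toy usage: the same predictive feature evaluated under a cheap and an expensive test.
labels = [0, 0, 1, 1]
feature = [0, 0, 1, 1]
print(significance(feature, labels, test_cost=2.0))   # cheaper test -> higher score (0.5)
print(significance(feature, labels, test_cost=10.0))  # costlier test -> lower score (0.1)
```

Under this assumed form, features that are informative but cheap to measure are ranked ahead of equally informative but expensive ones, which is the heuristic the weighted function uses to guide the search more efficiently than unweighted backtracking.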