Federated Learning (FL), as a privacy-preserving machine learning paradigm, has been thrust into the limelight. Because of physical bandwidth constraints, only a small number of clients can be selected in each round of FL training. However, existing client selection schemes (e.g., vanilla random selection) typically ignore the heterogeneous data value of clients. In this paper, we propose a contribution-based selection algorithm, the Contribution-Based Exponential-weight algorithm for Exploration and Exploitation (CBE3), which dynamically updates the selection weights according to the impact of each client's data. As a novel component of CBE3, we introduce a scaling factor that maintains a good balance between global model accuracy and convergence speed, improving the algorithm's adaptability. Theoretically, we prove a regret bound for CBE3, which quantifies the performance gap between CBE3 and the optimal choice. Empirically, extensive experiments on Non-Independent and Identically Distributed (Non-IID) data demonstrate the superior performance of CBE3: up to 10% higher accuracy than K-Center and Greedy selection, and up to 100% faster convergence than the Random algorithm.
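
To make the exponential-weight idea concrete, the sketch below shows a generic Exp3-style selection/update loop of the kind CBE3 builds on. This is a minimal illustration, not the paper's algorithm: the contribution measure, the exact weight-update rule, and the role of the scaling factor are assumptions here (the `scale` parameter merely stands in for the paper's scaling factor).

```python
import math
import random

def exp3_select(weights, gamma):
    """Mix normalized exponential weights with uniform exploration
    (standard Exp3 sampling distribution)."""
    total = sum(weights)
    k = len(weights)
    probs = [(1 - gamma) * w / total + gamma / k for w in weights]
    # sample one client index according to probs
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i, probs
    return k - 1, probs

def exp3_update(weights, probs, chosen, reward, gamma, scale=1.0):
    """Update the chosen client's weight with an importance-weighted
    reward estimate; `reward` would be the client's measured contribution,
    and `scale` is a hypothetical stand-in for CBE3's scaling factor."""
    k = len(weights)
    est = scale * reward / probs[chosen]  # unbiased reward estimate
    weights[chosen] *= math.exp(gamma * est / k)
    return weights
```

In each FL round, the server would call `exp3_select` to pick a client, evaluate that client's contribution to the global model, and feed it back through `exp3_update`, so high-contribution clients are chosen more often while the `gamma` term preserves exploration.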