Researchers advance the frontiers of knowledge by establishing facts and reaching new conclusions through systematic investigation, and by publishing the outcomes of their work as research papers. These publications are indicative of a researcher's scientific impact. Various bibliometric indices have been proposed to measure the impact or productivity of a researcher, including publication count, citation count, number of coauthors, and the h-index. Since its inception, the h-index has been ranked as the foremost impact indicator by many studies. However, as a consequence of its identified shortcomings, several variants of the h-index have been proposed. One dimension that requires particular attention is the ability to identify exceptional performers in a given research area. In this study, we compare the effectiveness of the h-index and some of its recent variants in identifying the exceptional performers of a field. We also compute the correlation of the h-index with these recently proposed indices: a high correlation indicates that an index conveys essentially the same information as the h-index, whereas a low correlation means the index makes a non-redundant contribution when ranking the potential researchers of a field. To date, the effectiveness of these indices has not been explored or validated on real data sets from a single field. We evaluate these variants of the h-index, together with the h-index itself, on a comprehensive data set for the field of Computer Science, using a data set of award winners as the benchmark for assessing the indices at the level of individual researchers. Results show that these indices correlate weakly with the h-index and outperform it in identifying the exceptional performers of a field.
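To make the comparison concrete, the sketch below illustrates (in a minimal, hypothetical setting, not the paper's actual pipeline) how an h-index can be computed from per-paper citation counts and how two author-level indices can be compared via Spearman rank correlation; the author names, citation lists, and the "variant" index (total citations) are illustrative assumptions, and scipy is assumed to be available.

```python
# Minimal illustrative sketch: compute an h-index from per-paper citation
# counts and compare two author-level indices with Spearman rank correlation.
from scipy.stats import spearmanr


def h_index(citations):
    """Largest h such that the author has h papers with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h


# Hypothetical per-author lists of per-paper citation counts.
authors = {
    "A": [25, 8, 5, 3, 2, 0],
    "B": [40, 2, 1, 1],
    "C": [10, 9, 9, 7, 6, 6, 5],
}

h_values = [h_index(c) for c in authors.values()]
# Stand-in for any h-index variant; here simply total citations per author.
variant_values = [sum(c) for c in authors.values()]

# A low rank correlation would suggest the variant ranks authors differently
# from the h-index, i.e. it makes a non-redundant contribution.
rho, p = spearmanr(h_values, variant_values)
print(f"h-index values: {h_values}")
print(f"variant values: {variant_values}")
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
```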