Reliably measuring the similarity of two shapes or images (instances) is an important problem for various computer vision applications such as classification, recognition, and retrieval. While pairwise measures quantify similarity from the geometric differences between two instances, recent advances also exploit relationships among the whole population of instances. In this paper, we propose a novel method that refines pairwise similarity measures using population cues by examining the most similar instances shared by the two compared shapes or images. We then use this refined measure to organize instances into disjoint components of similar instances. Connectivity is then established between components to avoid hard constraints on which instances can be retrieved, improving retrieval performance. To evaluate the proposed method we conduct experiments on the well-known MPEG-7 and Swedish Leaf shape datasets as well as the Nister and Stewenius image dataset. We show that the proposed method is versatile, performing very well on its own or in concert with existing methods.
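The idea of refining a pairwise measure by examining the most similar instances shared by two compared instances can be illustrated with a shared-nearest-neighbor scheme. The sketch below is a simplified stand-in for the paper's method, not its exact formulation: each instance keeps its top-k most similar neighbors, and the refined similarity of a pair is the Jaccard overlap of their neighbor sets. The function name `refine_similarity` and the choice of Jaccard overlap are illustrative assumptions.

```python
import numpy as np

def refine_similarity(S, k=3):
    """Illustrative sketch: refine a pairwise similarity matrix S (n x n)
    using population cues via shared nearest neighbors.

    Each instance's top-k most similar neighbors (excluding itself) are
    collected; the refined similarity of a pair is the Jaccard overlap of
    their neighbor sets. This is an assumed simplification, not the
    paper's exact refinement rule.
    """
    n = S.shape[0]
    neighbors = []
    for i in range(n):
        order = np.argsort(-S[i])          # indices sorted by descending similarity
        order = order[order != i][:k]      # drop self, keep top-k neighbors
        neighbors.append(set(order.tolist()))
    R = np.zeros_like(S, dtype=float)
    for i in range(n):
        for j in range(n):
            inter = len(neighbors[i] & neighbors[j])
            union = len(neighbors[i] | neighbors[j])
            R[i, j] = inter / union if union else 0.0
    return R

# Toy population with two groups of mutually similar instances.
S = np.full((6, 6), 0.1)
S[:3, :3] = 0.9
S[3:, 3:] = 0.9
np.fill_diagonal(S, 1.0)
R = refine_similarity(S, k=2)
```

In this toy example, instances within the same group share neighbors, so their refined similarity exceeds that of cross-group pairs; thresholding such a refined measure is one natural way to organize instances into disjoint components of similar instances.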