Footwear examiners are tasked with comparing an outsole impression (Q) left at a crime scene with an impression (K) from a database or from the suspect's shoe. We propose a method for comparing two shoe outsole impressions that relies on speeded-up robust features (SURF) extracted from each impression and aligns the impressions using a maximum clique (MC). After alignment, an algorithm we denote MC-COMP is used to extract additional features that are then combined into a univariate similarity score using a random forest (RF). We use a database of shoe outsole impressions that includes images from two models of athletic shoes that were purchased new and then worn by study participants for about 6 months. The shoes share class characteristics such as outsole pattern and size, which makes the comparison challenging. We find that the RF implemented on SURF outperforms other methods recently proposed in the literature in terms of classification precision. In more realistic scenarios where crime scene impressions may be degraded and smudged, the algorithm we propose, denoted MC-COMP-SURF, shows the best classification performance by detecting unique features better than other methods. The algorithm can be implemented with the R package shoeprintr.
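The maximum-clique alignment idea can be sketched as follows: each candidate pairing of a feature point in Q with one in K becomes a graph node, two pairings are connected by an edge when they preserve the within-image distance, and the largest clique yields a geometrically consistent correspondence. The pure-Python sketch below is illustrative only, not code from shoeprintr; the function names and the brute-force clique search are our own simplifications, workable only for the handful of keypoints retained per local region.

```python
from itertools import combinations
from math import dist

def build_correspondence_graph(q_pts, k_pts, tol=0.5):
    """Nodes are (i, j) pairings of a Q point with a K point; an edge
    joins two pairings whose within-image distances agree up to `tol`."""
    nodes = [(i, j) for i in range(len(q_pts)) for j in range(len(k_pts))]
    edges = set()
    for a, b in combinations(nodes, 2):
        (i1, j1), (i2, j2) = a, b
        if i1 == i2 or j1 == j2:
            continue  # enforce one-to-one matching
        d_q = dist(q_pts[i1], q_pts[i2])
        d_k = dist(k_pts[j1], k_pts[j2])
        if abs(d_q - d_k) <= tol:
            edges.add(frozenset((a, b)))
    return nodes, edges

def max_clique(nodes, edges):
    """Brute-force maximum clique: try candidate sets from largest to
    smallest and return the first fully connected one."""
    for size in range(len(nodes), 0, -1):
        for cand in combinations(nodes, size):
            if all(frozenset(p) in edges for p in combinations(cand, 2)):
                return list(cand)
    return []
```

For two congruent point configurations (e.g., the same triangle translated between images), the clique recovers the point-to-point correspondence even though no alignment transform is computed explicitly.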
We propose a novel method to quantify the similarity between an impression (Q) from an unknown source and a test impression (K) from a known source. Using the property of geometrical congruence in the impressions, the degree of correspondence is quantified using ideas from graph theory and maximum clique (MC). The algorithm uses the x and y coordinates of the edges in the images as the data. We focus on local areas in Q and the corresponding regions in K and extract features for comparison. Using pairs of images with known origin, we train a random forest to classify pairs into mates and non-mates. We collected impressions from 60 pairs of shoes of the same brand and model, worn over six months. Using a different set of very similar shoes, we evaluated the performance of the algorithm in terms of the accuracy with which it correctly classified images into source classes. Using classification error rates and ROC curves, we compare the proposed method to other algorithms in the literature and show that for these data, our method shows good classification performance relative to other methods. The algorithm can be implemented with the R package shoeprintr.
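The classification error rates and ROC curves mentioned above come from sweeping a decision threshold over the similarity scores assigned to mated and non-mated pairs. A minimal sketch of that summary, using made-up scores rather than the study data (`roc_points` and `auc` are illustrative helper names, not part of shoeprintr):

```python
def roc_points(mated, non_mated):
    """Return (false-positive rate, true-positive rate) pairs obtained
    by sweeping a decision threshold over all observed scores."""
    thresholds = sorted(set(mated + non_mated), reverse=True)
    pts = []
    for t in thresholds:
        tpr = sum(s >= t for s in mated) / len(mated)
        fpr = sum(s >= t for s in non_mated) / len(non_mated)
        pts.append((fpr, tpr))
    return pts

def auc(pts):
    """Trapezoidal area under the ROC curve, anchored at (0,0) and (1,1)."""
    pts = sorted([(0.0, 0.0)] + pts + [(1.0, 1.0)])
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```

A method that separates mated from non-mated scores perfectly reaches an AUC of 1; classification error at a fixed threshold can be read off the same sweep.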
In forensic practice, evaluating shoeprint evidence is challenging because the differences between images of two different outsoles can be subtle. In this paper, we propose a deep transfer-learning-based matching algorithm, the Shoe-MS algorithm, that quantifies the similarity between two outsole images. The Shoe-MS algorithm consists of a Siamese neural network for the two input images followed by a transfer learning component that extracts features from the outsole impression images. The added layers are fine-tuned using images of shoe soles. To test the performance of the proposed method, we use a study dataset that is both realistic and challenging. The pairs of images for which we know ground truth include (1) close non-matches and (2) mock-crime-scene pairs. The Shoe-MS algorithm performed well in terms of prediction accuracy and was able to determine the source of pairs of outsole images, even when comparisons were challenging. When using a score-based likelihood ratio, the algorithm made the correct decision with high probability in a test of the hypothesis that the images had a common source. An important advantage of the proposed approach is that pairs of images can be compared without alignment. In initial tests, Shoe-MS exhibited better discriminating power than existing methods.
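A score-based likelihood ratio of the kind mentioned above compares the density of an observed similarity score under the same-source and different-source score distributions. The sketch below assumes normal score distributions purely for illustration; in practice the two densities would be estimated from training comparisons with known ground truth, and the parameter values here are invented:

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and s.d. sigma."""
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def score_lr(s, same_params, diff_params):
    """Score-based likelihood ratio
    SLR = f(s | same source) / f(s | different source).
    Values above 1 support the common-source hypothesis."""
    return normal_pdf(s, *same_params) / normal_pdf(s, *diff_params)
```

A high similarity score lying near the center of the same-source distribution yields an SLR far above 1, while a low score near the different-source distribution yields an SLR below 1.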