“…But because we were constrained by the choices made by the authors of Duez et al. and by the information they made publicly available (their sample selection, the demographic information they provided about participants, and their decision to release only photos of the comparison items), we could not manipulate and control these variables. Such research is nonetheless critical given that existing validation studies show, at best, an imperfect correlation between training/experience and superior performance: none have successfully distinguished poor-performing, misidentification-prone examiners from their more accurate peers on the basis of their source of training, years of practice, or the accreditation status of their employing laboratories [14,62,106]. Thus, while we encourage the field of firearms examination to explore the role of expertise, as other fields such as latent print comparison [44-47], document examination [50,107], and facial recognition [48,49] have attempted to do, our study does not close that gap in the existing literature. Beyond conducting studies designed to assess the role of expertise (if any) in the comparison of bullets and cartridge cases, our research suggests a serious need to (1) design future studies that include challenging, close non-match comparisons, including comparisons involving subclass characteristics, and (2) reevaluate the difficulty and complexity of samples from existing validation efforts.…”