Abstract: D. Davis and W. C. Follette (2002) purport to show that when "the base rate" for a crime is low, the probative value of "characteristics known to be strongly associated with the crime . . . will be virtually nil." Their analysis rests on the choice of an arbitrary and inapposite measure of the probative value of evidence. When a more suitable metric is used (e.g., a likelihood ratio), it becomes clear that evidence they would dismiss as devoid of probative value is relevant and diagnostic.
“…Also, ratio distributions are asymmetric with one side of the distribution stretching from 1 to infinity, whereas the other side is compressed between 0 and 1. A standard solution to these problems is to use the log of the C/F ratio rather than the C/F ratio itself (Kaye, 1986). For a correct identification rate of .4 and a false identification rate of .1, the log of the C/F ratio is 1.39.…”
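As a quick check on the quoted figure, the log in question is the natural log: ln(.4/.1) = ln 4 ≈ 1.386, which rounds to 1.39. A minimal Python sketch of this arithmetic (the rates are those in the quoted passage; the function name is ours, not the cited authors'):

import math

def log_cf_ratio(correct_rate, false_rate):
    # Natural log of the correct/false (C/F) identification ratio,
    # the transform the quoted passage recommends (Kaye, 1986).
    return math.log(correct_rate / false_rate)

print(round(log_cf_ratio(0.4, 0.1), 2))  # 1.39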
Section: Probative Value Of Eyewitness Identification Evidence (mentioning)
Psychological science has come to play an increasingly important role in the legal system by informing the court through expert testimony and by shaping public policy. In recent years, psychological research has driven a movement to reform the procedures that police use to obtain eyewitness identification evidence. This reform movement has been based in part on an argument suggesting that recommended procedures reduce the risk of false identifications with little or no reduction in the rate of correct identifications. A review of the empirical literature, however, challenges this no-cost view. With only one exception, changes in eyewitness identification procedures that reduce the risk of false identification of the innocent also reduce the likelihood of correct identification of the guilty. The implication that criminals may escape prosecution as a result of procedures implemented to protect the innocent makes policy decisions far more complicated than they would otherwise be under the no-cost view. These costs (correct identifications lost) and benefits (false identifications avoided) are discussed in terms of probative value and expected utility.
“…For example, if the correct identification rate is .6 and the false identification rate is .1, the ratio is 6, meaning that a correct identification is 6 times more likely than a false identification (assuming equal numbers of guilty-suspect and innocent-suspect lineups). Despite its longevity as the accuracy measure of choice, researchers and legal scholars have long been aware of its statistical and interpretive limitations (Kaye, 1986). These problems have been most clearly articulated in the recent work of Wixted and Mickes (2012); Gronlund, Wixted, and Mickes (2014);and Clark (2012).…”
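The quoted ratio is simply .6/.1 = 6: under the stated assumption of equal numbers of guilty-suspect and innocent-suspect lineups, an identification of the suspect is six times as likely to be correct as false. A hypothetical Python sketch of the same calculation (function name ours):

def cf_ratio(correct_rate, false_rate):
    # C/F diagnosticity ratio: how many times more likely a suspect
    # identification is to be correct than false, assuming equal numbers
    # of guilty-suspect and innocent-suspect lineups.
    return correct_rate / false_rate

print(round(cf_ratio(0.6, 0.1), 2))  # 6.0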
Section: Accuracy Of Eyewitness Identification Procedures (mentioning)
This article addresses the problem of eyewitness identification errors that can lead to false convictions of the innocent and false acquittals of the guilty. At the heart of our analysis based on signal detection theory is the separation of diagnostic accuracy—the ability to discriminate between those who are guilty versus those who are innocent—from the consideration of the relative costs associated with different kinds of errors. Application of this theory suggests that current recommendations for reforms have conflated diagnostic accuracy with the evaluation of costs in such a way as to reduce the accuracy of identification evidence and the accuracy of adjudicative outcomes. Our framework points to a revision in recommended procedures and a framework for policy analysis.
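The abstract does not spell out the signal detection machinery it invokes, but the standard index of diagnostic accuracy in that framework is d', the standardized distance between the guilty-suspect and innocent-suspect evidence distributions. A sketch under the usual equal-variance Gaussian assumption (the rates below are illustrative, not taken from the article):

from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    # d' = z(hit rate) - z(false-alarm rate): discriminability in
    # standard-deviation units under the equal-variance Gaussian model.
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

print(round(d_prime(0.6, 0.1), 2))  # 1.53 for these illustrative rates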
“…NRC studies of forensic science issues have consistently advocated a likelihood ratio interpretation of relevance [12,28]. In this explicitly probabilistic framework, the amount of support that a piece of scientific evidence lends to a hypothesis in question is quantifiable in terms of the probabilities that the evidence would be observed if the hypothesis were true or false [29,30]. Let H be the hypothesis in question, and E be the evidence.…”
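The quoted passage is truncated, but the likelihood ratio it describes is conventionally written LR = P(E | H) / P(E | not-H): the probability of observing the evidence if the hypothesis is true, divided by the probability of observing it if the hypothesis is false, with LR > 1 supporting H and LR < 1 supporting its negation. A minimal Python sketch of this standard formulation (the example values are illustrative only):

def likelihood_ratio(p_e_given_h, p_e_given_not_h):
    # LR = P(E | H) / P(E | not H): how much more probable the evidence E
    # is under hypothesis H than under its negation.
    return p_e_given_h / p_e_given_not_h

print(round(likelihood_ratio(0.4, 0.1), 2))  # 4.0: E is four times as probable under H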
Section: Relevance and Admissibility Of Scientific Evidence (mentioning)
The validation of the probative value of microbial forensic techniques is a critical aspect of research and development that requires careful planning and a sound conceptual framework. This paper outlines a particular approach to the validation of certain types of forensic methods that naturally generates statistical measures of the relevance and weight of the scientific evidence derived from them. The suggested approach is based on the likelihood ratio interpretation of Federal Rules of Evidence 401 and 402 and allows measurement evidence to be presented in a format that resembles the forensic "gold standard", human DNA typing. Examples of specific genetic, chemical, and physical analysis methods are used to illustrate how this general strategy can be applied. This approach also provides a natural interpretation of the notion of "preliminary validation" that has been proposed in the literature.