Findings that decision makers can reach different conclusions depending on the order in which they receive information have been termed the "information order bias." Trained, experienced individuals exhibit similar behaviors; however, it has been argued that this result is not a bias but rather a pattern-matching process. This study provides a critical examination of that claim. It also assesses both experts' susceptibility to an outcome-framing bias and the effects of varying task loads on judgment. Using a simulation of state-of-the-art ship defensive systems operated by experienced, active-duty U.S. Navy officers, we found no evidence of a framing bias, while task load had a minor but systematic effect. The order in which information was received had a significant impact, and the effect was consistent with a judgment bias. Nonetheless, we note that pattern-matching processes, similar to those that produce inferential and reconstructive effects on memory, could also explain our results. Actual or potential applications of this research include decision support system interfaces and training programs designed to reduce judgment bias.
This presentation summarizes the results of an empirical study examining human judgment bias under conditions of uncertainty and time pressure in surface Anti-Air Warfare (AAW). A substantial body of research has demonstrated that humans apply a limited set of heuristics to simplify decision making in complex and ambiguous situations. Most of this research, however, has used college students making logical but unfamiliar judgments. This study was designed to assess whether Naval personnel, trained and experienced in AAW operations, exhibit these biases when performing their normal duties. Specifically, we studied whether the judgments of Naval tactical action officers in a realistic task simulation exhibit characteristics of the availability, representativeness, anchoring-contrast, and confirmation biases. Our prediction that experienced subjects would disregard lack of reliability in otherwise representative data was only partially supported by the study. On the other hand, each of our other predictions was strongly supported. Our subjects ignored baseline trends when other case-specific information was available (representativeness and availability). They were significantly influenced by the order in which they received evidence, showing a recency effect characteristic of contrast. Additionally, as is characteristic of confirmation bias, they recalled much more of the information that was consistent with their final hypothesis and evaluated it as more informative than the inconsistent data, regardless of which hypothesis they had adopted. Implications for information presentation and display in Naval decision support systems are discussed.
Howard and Bray found that a group of "volunteers" obtained higher student ratings than "nonvolunteers." They defined "volunteers" as instructors who voluntarily continued to use a student rating form after a first, mandatory use. This study analyzed Instructional Development and Effectiveness Assessment (IDEA) student rating data from 13,063 classes from several academic fields and institutions. Classes were divided into three groups: volunteers, in which the decision to evaluate was entirely the instructor's; intermediate, in which the evaluation was required, but the instructor chose the class; and nonvolunteers, in which the instructor was required to have the class evaluated. Statistically significant differences among the three groups were found for 26 of the 39 IDEA items (probably because of the very large Ns). Based upon ω² analyses, none of these differences were of practical significance; none of the ω² values accounted for even 1% of the variance. It was concluded that voluntariness of evaluation need not be taken into consideration when using large, multi-institutional, comparative student rating data pools.
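The practical-significance criterion used above can be illustrated with a minimal sketch. Omega squared (ω²) estimates the proportion of population variance explained by group membership in a one-way ANOVA, and with very large Ns a statistically significant F can coexist with ω² well below 1%. The numbers below are hypothetical, chosen only to show the pattern, and are not taken from the IDEA data:

```python
def omega_squared(ss_between, ss_within, df_between, df_within):
    """Estimate omega squared (ω²), the proportion of population
    variance attributable to group membership, from one-way ANOVA
    sums of squares: (SSb - dfb * MSw) / (SStotal + MSw)."""
    ms_within = ss_within / df_within
    ss_total = ss_between + ss_within
    return (ss_between - df_between * ms_within) / (ss_total + ms_within)

# Hypothetical example: 3 groups (df_between = 2) and a very large
# sample. The implied F = MS_between / MS_within ≈ 25 is highly
# significant, yet ω² is well under 0.01 (1% of variance).
w2 = omega_squared(ss_between=50.0, ss_within=13000.0,
                   df_between=2, df_within=13060)
print(round(w2, 4))  # a value below 0.01
```

This is why the study could report 26 statistically significant items while still concluding that none of the differences mattered in practice.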