Although organizations appear to learn from obvious failures, we argue that it is harder for them to learn from "near-misses": events in which chance played a role in averting failure. In this paper, we formalize the concept of near-misses and hypothesize that organizations and managers fail to learn from near-misses because they evaluate such events as successes and thus feel safer about the situation. We distinguish perceived ("felt") risk from calculated statistical risk and propose that lower perceived risk encourages people with near-miss information to make riskier subsequent decisions than people without such information. In our first study, we confirm the tendency to evaluate near-misses as successes by having participants rate a project manager whose decisions result in (a) mission success, (b) a near-miss, or (c) failure. Participants (both students and NASA employees and contractors) give similar ratings to managers whose decisions produced near-misses and to managers whose decisions resulted in successes, and both ratings differ significantly from ratings of managers who experienced failures. We suggest that the failure to hold managers accountable for near-misses is a forgone learning opportunity for both the manager and the organization. In our second set of studies, we confirm that near-miss information leads people to choose a riskier alternative because of the lower perceived risk following near-miss events. We explore several alternative explanations for these findings, including the role of Bayesian updating in processing near-miss data. Ultimately, the analysis suggests that managers and organizations reduce their perception of the risk without necessarily updating (lowering) the statistical probability of the failure event. We speculate that this divergence arises because perceived risk is the product of associative processing, whereas statistical risk arises from rule-based processing.

Keywords: risk, inference, decision making
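As a minimal illustration of the rule-based side of this contrast (not the authors' model), the sketch below applies a Beta-Bernoulli update that records a near-miss as one more observation of "no failure occurred." The prior parameters and counts are hypothetical.

```python
# Minimal sketch (not the authors' model): a Beta-Bernoulli update in which a
# rule-based processor records a near-miss as "no failure occurred". The prior
# Beta(1, 9) and the observation counts are hypothetical.

def posterior_failure_prob(alpha: float, beta: float,
                           failures: int, non_failures: int) -> float:
    """Posterior mean failure probability under a Beta(alpha, beta) prior."""
    return (alpha + failures) / (alpha + beta + failures + non_failures)

prior_mean = posterior_failure_prob(1.0, 9.0, 0, 0)       # 0.100
after_near_miss = posterior_failure_prob(1.0, 9.0, 0, 1)  # ~0.091

print(f"prior failure probability:     {prior_mean:.3f}")
print(f"after one near-miss (no fail): {after_near_miss:.3f}")
```

Under this hypothetical prior, one near-miss moves the failure estimate only from 0.100 to about 0.091, a far smaller shift than the drop in felt risk the studies report.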
In the aftermath of many natural and man-made disasters, people often wonder why those affected were underprepared, especially when the disaster was the result of known or regularly occurring hazards (e.g., hurricanes). We study one contributing factor: prior near-miss experiences. Near misses are events that have some nontrivial expectation of ending in disaster but, by chance, do not. We demonstrate that when near misses are interpreted as disasters that did not occur, people illegitimately underestimate the danger of subsequent hazardous situations and make riskier decisions (e.g., choosing not to engage in mitigation activities for the potential hazard). On the other hand, if near misses can be recognized and interpreted as disasters that almost happened, this will counter the basic "near-miss" effect and encourage more mitigation. We illustrate the robustness of this pattern across populations with varying levels of real expertise with hazards and different hazard contexts (household evacuation for a hurricane, Caribbean cruises during hurricane season, and deep-water oil drilling). We conclude with ideas to help people manage and communicate about risk.
Prior research shows that when people perceive the risk of some hazardous event to be low, they are unlikely to engage in mitigation activities for the potential hazard. We believe one factor that can inappropriately (from a normative perspective) lower people's perceived risk of a hazard is information about prior near-miss events. A near-miss occurs when an event (such as a hurricane) that had some nontrivial probability of ending in disaster (loss of life, property damage) does not do so because good fortune intervenes. People appear to mistake such good fortune for an indicator of resiliency. In our first study, people with near-miss information were less likely to purchase flood insurance, both among participants from the general population and among individuals with specific interests in risk and natural disasters. In our second study, we consider a different mitigation decision, namely whether to evacuate from a hurricane, and vary the statistical probability of hurricane damage. We still found a strong effect of near-miss information. Our research thus shows that people who have experienced a similar situation but escaped damage by chance make decisions consistent with perceiving the situation as less risky than do those without that experience. We end by discussing the implications for risk communication.
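To make the normative benchmark concrete, here is a minimal sketch with hypothetical numbers (not the study's materials): a risk-neutral decision maker should mitigate whenever the premium is below the expected loss, so a near-miss that leaves the damage probability unchanged should leave the decision unchanged.

```python
# Minimal sketch with hypothetical numbers (not the study's materials): the
# normative benchmark for a mitigation decision. A risk-neutral agent buys
# insurance when the premium is below the expected loss; a near-miss that
# leaves p_damage unchanged should leave the decision unchanged.

def should_mitigate(p_damage: float, loss: float, premium: float) -> bool:
    """Return True if the expected loss avoided exceeds the premium."""
    return premium < p_damage * loss

p_damage = 0.05    # hypothetical annual probability of flood damage
loss = 200_000     # hypothetical damage if the flood occurs
premium = 2_000    # hypothetical annual premium

# 2,000 < 0.05 * 200,000 = 10,000, so mitigation is worthwhile.
print(should_mitigate(p_damage, loss, premium))  # True
```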
This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and to determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to facilities and war-fighting assets. With ARDA and support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations, together with hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to address several commonly identified challenges in security risk analysis. The article concludes by describing the process and documenting lessons learned from this application of the ARDA model.
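ARDA's internal structure is not reproduced here; the sketch below is only a generic illustration of the weighted additive aggregation that multiattribute utility theory prescribes, applied to scoring mitigation portfolios. The attribute names, weights, and scores are hypothetical.

```python
# Generic sketch of a weighted additive multiattribute utility score; the
# attributes, weights, and portfolio scores below are hypothetical, not
# ARDA's actual model.

WEIGHTS = {"risk_reduction": 0.5, "cost": 0.3, "operability": 0.2}  # sum to 1

def utility(scores: dict) -> float:
    """Weighted additive utility; each attribute score is normalized to [0, 1]."""
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

portfolios = {
    "barriers_only":    {"risk_reduction": 0.6, "cost": 0.8, "operability": 0.9},
    "sensors_plus_qrf": {"risk_reduction": 0.8, "cost": 0.5, "operability": 0.7},
}

# Rank candidate mitigation portfolios by utility and report the best one.
best = max(portfolios, key=lambda name: utility(portfolios[name]))
print(best, {name: round(utility(s), 2) for name, s in portfolios.items()})
```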
Disasters garner attention when they occur, and organizations commonly extract valuable lessons from visible failures, adopting new behaviors in response. For example, the United States saw numerous security policy changes following the September 11 terrorist attacks, and emergency management and shelter policy changes following Hurricane Katrina. But what about events that fall short of disaster? Research on prior hazard experience shows that such experience can be a mixed blessing: it can stimulate protective measures, but it can also deceive people into feeling an unwarranted sense of safety. This research focuses on how people interpret near-miss experiences. We demonstrate that when near-misses are interpreted as disasters that did not occur, and thus suggest that the system is resilient to the hazard, people illegitimately underestimate the danger of subsequent hazardous situations and make riskier decisions. On the other hand, if near-misses are recognized and interpreted as disasters that almost happened, and thus suggest that the system is vulnerable to the hazard, this interpretation counters the basic "near-miss" effect and encourages mitigation. In this article, we use the distinction between resilient and vulnerable near-misses to examine how people come to define an event as one or the other, as well as how this interpretation influences their perceptions of risk and their future preparedness behavior. Our contribution lies in highlighting the critical role that people's interpretation of prior experience plays in their subsequent behavior and in measuring what shapes this interpretation.