2015
DOI: 10.1016/j.jtcvs.2014.10.058

National Aeronautics and Space Administration “threat and error” model applied to pediatric cardiac surgery: Error cycles precede ∼85% of patient deaths

Abstract: Human error, if not mitigated, often leads to cycles of error and unintended patient states, which are dangerous and precede the majority of harmful outcomes. Efforts to manage threats and error cycles (through crew resource management techniques) are likely to yield large increases in patient safety.

Cited by 20 publications (30 citation statements); references 15 publications.
“…Aviation-safety investigators have operationalized Reason's (1990) "Swiss cheese" accident-causation model (Fig. 2) in recognition that failure is inevitable in complex systems (Helmreich & Merritt 2000; Hickey et al. 2015); failure rarely has a single cause (Reason 1995); human error is implicated in the majority of accidents involving technological systems (Duffey & Saull 2003); and attributions of blame drive errors underground and limit learning opportunities (Reason 2000). For example, accepting that human error may compromise all systems, commercial airline flight-crew training intentionally does not set goals for zero errors, but rather zero accidents, an achievable goal through the active process of mitigating human error when it occurs (United Airlines 2016) and one that shifts blame away from individuals.…”
Section: Definitions of Failure
confidence: 99%
“…The aviation industry holds Six Sigma (nearly perfect) safety records, because it uses the system approach, deals with errors non‐punitively yet proactively, and reduces the consequences of error before escalation. This way of reporting and managing error results in a ‘just culture’, where aviation professionals feel confident to report events (even their own mistakes), by promoting balanced accountability for individuals and organizations responsible.…”
Section: Results
confidence: 99%
“…5 The principles of safety training in aviation are based on a "threats and errors" model which summarizes the interplay of factors that could potentially lead to a catastrophe. 6 According to this model, errors are preceded by a threat, which is any factor that causes deviation from normal conditions. In this case, the SVC cannula was placed outside the surgical field, which was an adaptation for the minimally invasive procedure and different from the routine for conventional cardiac surgical cases.…”
Section: Discussion
confidence: 99%