2000
DOI: 10.1080/14639220052399159
Attention and complacency

Abstract: The problem of complacency is analysed, and it is shown that previous research that claims to show its existence is defective, because the existence of complacency cannot be proved unless optimal behaviour is specified as a benchmark. Using gedanken experiments, it is further shown that, in general, not even with optimal monitoring can all signals be detected. Complacency is concerned with attention (monitoring, sampling), not with detection, and there is little evidence for complacent behaviour. To claim tha…

Cited by 110 publications (94 citation statements)
References 8 publications
“…At the same time, errors in the automation can lead to an erosion of trust [26]. Excessive trust, meanwhile, can lead to insufficient monitoring and control of the automation ("overtrust" or "complacency" [27]). The majority of studies on the subject to date have focused on the reciprocal effects of trust in the use of Adaptive Cruise Control (ACC).…”
Section: Automation in the Car
“…Overreliance or complacency is created as operators form beliefs of the technical system as being more competent than it actually is. This overreliance on automation represents an important aspect of misuse that can result from several forms of human error, including decision biases and failure of monitoring [35][36][37]. A typical illustration is the case of the crash of Northwest Airlines at Detroit Airport in 1987.…”
Section: Over-trust / Complacency
“…This leads to reduced awareness of what the system is doing and increases complacency. Moray & Inagaki (2000) showed that in highly reliable automated systems, it actually is a sensible and suitable strategy for operators to not maintain constant SA, meaning that the operator's behavior is well calibrated to the reliability of the system.…”
Section: Endsley's SA Model