1984
DOI: 10.21236/ada149621
Research and Modeling of Supervisory Control Behavior. Report of a Workshop

Cited by 79 publications (55 citation statements)
References 2 publications (2 reference statements)
“…The researchers found that operators checked the status of the automatic pump more frequently when it worked less reliably and when unpredictable (variable) errors occurred, compared with constant ones. Moreover, Muir and Moray found a negative correlation between participants' monitoring behaviour and their ratings of trust in the automatic pump: the less they trusted it, the more intensely they monitored it, and the more they trusted it, the less intensely they monitored it, supporting earlier predictions by Muir (1994) and Sheridan and Hennessy (1984).…”
Section: The Influence of Automation Reliability on Trust and Reliance
confidence: 74%
“…On the other hand, relying on heuristics in the decision to rely on an automated system may not be appropriate in every situational context. This is where the demand for well-calibrated trust that matches the true capabilities of a system gains its importance (Lee & Moray, 1994; Lee & See, 2004; Muir, 1987, 1994; Sheridan & Hennessy, 1984). Well-calibrated trust is characterised by high resolution and high specificity (Cohen, Parasuraman, & Freeman, 1999; Lee & See, 2004).…”
Section: Trust in Automation
confidence: 99%
“…This role most often results from hierarchical automation designs in which the human user is placed at the apex of the command pyramid. In a supervisory role, humans tend to be poorly integrated with the automated systems, often resulting in their being 'out of the loop' [1][2][3][4]. When the human user is 'out of the loop', or not fully integrated into the system, SA may become degraded.…”
Section: Introduction
confidence: 99%
“…This is because SA is a function of the human user's available attention and working memory with which to acquire and interpret environmental information [5], and when a user is 'out of the loop' attention may not be directed to the current task. Degraded SA leads to decreased awareness of automation mode or knowledge of task completion [6][7][8] and has been linked to poor interaction between humans and automated agents [1,2,7]. Furthermore, this degradation can be exacerbated by excessively high or low workload [10]. Degraded SA can result in tragic consequences in high-risk domains such as military or nuclear power plant applications [5,8]. Given these consequences, identifying and mitigating instances of degraded SA could be considered an important aim in improving joint human-automation system performance.…”
Section: Introduction
confidence: 99%
“…Still others require a human hand (or mind) to generate the commitment needed to implement them (e.g., in military or sales campaigns). When many decisions are automated, one must still worry that the reduced role left to the operators' discretion will lead to deskilling or disengagement, reducing their ability to "get back in the loop" when distinctly human interventions are needed (Sheridan and Hennessy, 1984).…”
confidence: 99%