1999
DOI: 10.1016/s0953-5438(98)00044-7

Toward a model of unreliability to study error prevention supports

Cited by 41 publications (8 citation statements)
References 16 publications
“…This aims at reducing the occurrence or the impact of a dissonance. For instance, the control of overloaded situations reduces the occurrence of human errors when tasks are dynamically shared between human and machine (Vanderhaegen 1999c). Knowledge stability relates to sustainable knowledge equilibrium and any deviation from this stability generates dissonances, or is generated by the occurrence of a dissonance or by the impact of its control.…”
Section: Dissonance Control and Knowledge Reinforcement
confidence: 99%
“…Nevertheless, before introducing an automated system in operational settings, one has to determine sensitive scenarios of unsafe situations, as both humans and machines can be unreliable. Feasibility studies in ATC should at least assess the impact on three elements: operator's workload, safety criteria and perceived unreliability (Vanderhaegen, 1999).…”
Section: Sit
confidence: 99%
“…This control can be solved by applying the principles of Human-Machine cooperation, which consist of dynamically sharing tasks between the human operator and an automated system. Tasks are shared at each stage of a decision, from detection to action (Vanderhaegen, 1999a; Vanderhaegen, 1999b), during a specific task such as a diagnostic task (Vanderhaegen et al., 2004), an air traffic control task (Vanderhaegen, 1999c; Vanderhaegen, 1999d), a knowledge-based conflict detection task (Vanderhaegen, 2016; Polet et al., 2002; Pacaux-Lemoine et al., 2013), or a driving task (Sentouh et al., 2009; Sentouh et al., 2006; Sentouh et al., 2013; Soualmi et al., 2014). The adaptability of the human operator makes him or her a complex system, but one that is essential to the optimal functioning of a Human-Machine System.…”
Section: Identification Of Task Shared (Step 1)
confidence: 99%
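
The task-sharing principle quoted above can be illustrated with a minimal sketch, not taken from any of the cited papers: tasks are kept with the human operator while an assumed workload threshold is not exceeded, and the remaining tasks are delegated to the automated system. All names, the threshold value, and the per-task load figures below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Task:
    name: str    # decision stage, e.g. "detection" or "action"
    load: float  # assumed workload cost of keeping this task with the human

def allocate(tasks, human_capacity=1.0):
    # Greedy dynamic task sharing (illustrative only): keep tasks with the
    # human operator while the cumulative load stays under the assumed
    # capacity threshold, and delegate the overflow to the automated system.
    human, machine, used = [], [], 0.0
    for task in sorted(tasks, key=lambda t: t.load):
        if used + task.load <= human_capacity:
            human.append(task.name)
            used += task.load
        else:
            machine.append(task.name)
    return human, machine

# Example: three decision stages with assumed loads.
stages = [Task("detection", 0.3), Task("diagnosis", 0.5), Task("action", 0.4)]
print(allocate(stages))  # (['detection', 'action'], ['diagnosis'])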