2007
DOI: 10.1016/j.ijhcs.2006.10.001
Making adaptive cruise control (ACC) limits visible

Abstract: Previous studies have shown that adaptive cruise control (ACC) can compromise driving safety when drivers do not understand how the ACC functions, suggesting that drivers need to be informed about the capabilities of this technology. This study applies ecological interface design (EID) to create a visual representation of ACC behavior, which is intended to promote appropriate reliance and support effective transitions between manual and ACC control. The EID display reveals the behavior of ACC in terms of time headway…

Cited by 203 publications (113 citation statements)
References 38 publications
“…Hergeth et al. (2016) also reported a negative correlation between participants' trust in automation and the extent of monitoring of a highly automated driving system during engagement with a non-driving-related task (NDRT). Accordingly, better-calibrated trust, achieved by displaying system confidence or reliability (Dzindolet, Peterson, Pomranky, Pierce, & Beck, 2003; McGuirl & Sarter, 2006), led to faster braking responses in a study by Seppelt and Lee (2007). Beller et al. (2013) showed that presenting information on an automated system's uncertainty improves situation awareness, improves a driver's mental model of the automated driving system, increases trust, and leads to an increased time to collision in the event of an automation failure.…”
Section: The Role of Trust in Automated Driving (mentioning)
confidence: 99%
“…Beyond this, calibrating trust by visualizing the automated driving system's confidence could continue to ensure appropriate reliance in long-term use. This information may be provided by a color-coded visual scale, anthropomorphic symbols, or emoticons in either the instrument cluster or a head-up display (Beller et al., 2013; Helldin et al., 2013; Seppelt & Lee, 2007). A human-machine interface (HMI) could also, on request, give a verbal or visual explanation of a system limit directly after its occurrence to promote understanding of the automated driving system's intentions, limits, and actions.…”
Section: Practical Application and Future Work (mentioning)
confidence: 99%
“…This could be mitigated by providing drivers with interfaces that display contextual information about the reliability of the systems, as it has been shown that complacency becomes less likely when information about the reliability of the instructions is provided [13]. Various methods have shown promising results for reducing complacency, such as dynamically displaying the system's confidence in its recommendations [10] or making automation failures more salient [14]. Other human-factors issues, such as the effects of the system being available at some crossings and not others, while outside the scope of this study, should be considered if such a device were to be implemented.…”
Section: Discussion (mentioning)
confidence: 99%
“…High specificity of trust would mean that the driver is aware of this change in context and the accompanying reduced reliability of the system and (temporarily) adjusts his or her level of trust according to this limited system performance. The importance of transparent system design for making functional system limits visible to the driver has been stressed in a number of works (Goodrich & Boer, 2003; Nilsson, 2005; "RESPONSE 3: Code of Practice for the Design and Evaluation of ADAS," 2006; Seppelt & Lee, 2007; Simon, 2005). Goodrich and Boer (2003) advocate a "model-based human-centred task automation" that requires the identification of drivers' mental models for specific driving subtasks in order to use them as a template for automation: "If the limits of automation correspond to the limits of a subset of natural operator skills, then the limits of the automation are most likely to be perceived and detected by the operator" (p. 329).…”
Section: Trust in Automation (mentioning)
confidence: 99%