2016
DOI: 10.1177/0018720816682648

Trust and the Compliance–Reliance Paradigm: The Effects of Risk, Error Bias, and Reliability on Trust and Dependence

Abstract: This research could be used to update training and design recommendations that are based upon the assumption that trust causes operator responses regardless of error bias.

Cited by 98 publications (79 citation statements)
References 45 publications
“…Risk can be defined as "the extent to which there is uncertainty about whether potentially significant and/or disappointing outcomes of decisions will be realized" (Sitkin & Pablo, 1992, p. 10). As with other preceding works (Chancey et al., 2017; Lyons et al., 2011; Mayer et al., 1995), our research also showed that increased perceived risk elevates automation trust, especially on the performance dimension (Sato et al., 2020). Sato et al. (2020) asked 40 nonexpert undergraduate participants to concurrently perform the tracking task and the system-monitoring task with the assistance of an imperfect automated system.…”
Section: Development Of Trust In (supporting)
confidence: 89%
“…The frequent experience of false alarms can decrease operators’ trust in alarm systems (e.g., Lee and See, 2004; Madhavan et al., 2006; Rice, 2009). This can lead to disuse of the system in the form of longer reaction times and a decreased tendency to respond to alarms (e.g., Bliss et al., 1995; Getty et al., 1995; Dixon et al., 2007; Chancey et al., 2017) – what has been referred to as the cry wolf phenomenon (Breznitz, 1984). The opposite effect can also be observed.…”
Section: Introduction (mentioning)
confidence: 99%
“…Trust, however, is not an exclusive determinant of usage, with other factors such as task load [9], [13] frequently found to affect usage independently of trust. Even when trust and dependence on automation are affected by the same factors, mediation may be absent, as in the case of alarms [15], for which rates of agreement can be predicted from probability matching without reference to trust. We hypothesize that trust will become a determinant of dependence primarily in situations where information for predicting automation behavior is incomplete.…”
Section: A Trust In Automation (mentioning)
confidence: 99%
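
The probability-matching account referenced in the statement above can be made concrete with a minimal sketch: under probability matching, an operator's rate of agreeing with an alarm is assumed to track the alarm's positive predictive value, with no trust term in the model. The sketch below is illustrative only, assuming this reading of probability matching; the function names and parameter values are hypothetical and are not taken from the cited works.

```python
# Minimal, illustrative sketch of a probability-matching account of alarm
# agreement: the operator's agreement (compliance) rate is assumed to match
# the alarm's positive predictive value, with no trust parameter involved.
# All names and numbers here are hypothetical and chosen for illustration.

def positive_predictive_value(hit_rate: float,
                              false_alarm_rate: float,
                              base_rate: float) -> float:
    """P(true event | alarm), from the alarm's hit rate, its false-alarm
    rate, and the base rate of the signaled event (Bayes' rule)."""
    p_alarm = hit_rate * base_rate + false_alarm_rate * (1.0 - base_rate)
    return (hit_rate * base_rate) / p_alarm


def predicted_agreement_rate(hit_rate: float,
                             false_alarm_rate: float,
                             base_rate: float) -> float:
    """Probability-matching prediction: agreement rate equals the alarm's
    positive predictive value."""
    return positive_predictive_value(hit_rate, false_alarm_rate, base_rate)


if __name__ == "__main__":
    # A false-alarm-prone alarm with hypothetical parameters: the operator
    # is predicted to agree on roughly the fraction of alarms that are valid.
    rate = predicted_agreement_rate(hit_rate=0.9,
                                    false_alarm_rate=0.3,
                                    base_rate=0.2)
    print(f"Predicted agreement rate: {rate:.2f}")  # ~0.43
```

Under this reading, lowering the false-alarm rate raises the predicted agreement rate directly, illustrating how alarm response rates can shift without any trust construct entering the prediction.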