2017
DOI: 10.1177/1541931213601560
Relationships between User Demographics and User Trust in an Autonomous Agent

Abstract: Reliability of autonomous agents has been shown to play a pivotal role in the human-agent team. This research investigates the relationship between demographic factors and trust using the application environment, Space Navigator. Using stepwise multiple linear regression, it was found that workload (NASA-TLX), gender, education level, and the reliability of the autonomous agent impact the perceived reliability, or user trust, in the system. When the user experienced higher workload, the user placed less trust in…

Cited by 10 publications (6 citation statements)
References 14 publications
“…High complexity situations tended to result in lower levels of trust in the driver assist systems, while low complexity situations resulted in higher trust. This result is consistent with what would be expected behaviorally, as previous research has suggested that complex or high workload situations might negatively influence trust or reliance in automation (Hillesheim et al., 2017). It is expected that, had this same study been carried out within a driving simulator or on-road experiment, behavioral indicators of trust may have suggested similar levels of reliance and trust in the high complexity situations.…”
Section: Discussion (supporting)
confidence: 90%
“…The complexity of the task and the expected workload involved will also influence how operators trust automated systems (Bailey & Scerbo, 2007; Hancock et al., 2011). Previous research has found that increases in workload will also negatively impact trust in automation (Hillesheim et al., 2017).…”
Section: Factors Impacting Trust (mentioning)
confidence: 99%
“…Gender can also play a role in the interactions that users have with technology. In one study, for example, female users tended to have less trust in automated agents (Hillesheim, Rusnock, Bindewald, & Miller, 2017). Finally, personality traits also have a bearing on users' tendency to trust others.…”
Section: A Review of Variables That Impact Trust (mentioning)
confidence: 95%
“…From the limited prior research, we do know that response to AI is not universal. For example, consumers who are younger and male (Araujo et al., 2020), as well as those with higher levels of education (Hillesheim et al., 2017), have more positive perceptions of AI. Also, consumers with higher political identity salience view AI-enabled services such as checkouts more favorably, and political ideology serves as a boundary condition in this relation; specifically, liberals prefer AI-enabled checkouts over self-service checkouts, whereas conservatives are indifferent (Cui & van Esch, 2022).…”
Section: AI in Marketing (mentioning)
confidence: 99%