Abstract—Automation has been widely used in interactions with smartphones, computers, and other machinery in recent decades. Studies have shown that inappropriate reliance on automation can lead to unexpected and even catastrophic results. Trust is conceived as an intervening variable between a user's intention and actions involving reliance on automation. It is generally believed that trust is dynamic and that an individual's culture or personality may influence automation use through changes in trust. To better understand how cultural and individual differences may affect a person's trust and resulting behaviors, the present study examined the effects of cultural characteristics and personality traits on reported trust in automation among U.S., Taiwanese, and Turkish populations. The results showed that individual differences significantly affected human trust in automation across the three cultures.
Abstract—The use of autonomous systems has been increasing rapidly in recent decades. To improve human-automation interaction, trust has been closely studied, and research shows that trust is critical to the development of appropriate reliance on automation. To examine how trust mediates the human-automation relationship across cultures, the present study investigated the influence of cultural factors on trust in automation. Theoretically guided empirical studies were conducted in the U.S., Taiwan, and Turkey to examine how cultural dynamics affect various aspects of trust in automation. The results revealed significant cultural differences in trust attitudes toward automation.
The reported study compares groups of 120 participants each from the United States, Taiwan, and Turkey interacting with versions of an automated path planner that vary in transparency and degree of automation. The nationalities were selected in accordance with the theory of Cultural Syndromes as representatives of Dignity (US), Face (Taiwan), and Honor (Turkey) cultures and were predicted to differ in readiness to trust automation, in the degree of transparency required to use automation, and in willingness to use systems with high degrees of automation. Three experimental conditions were tested. In the first (highlight), path conflicts were highlighted, leaving rerouting to the participant. In the second (re-planner), the system requested permission to reroute when a path conflict was detected. The third (combined) condition increased the transparency of the re-planner by adding highlighting, making visible to the user the conflict on which each rerouting decision was based. A novel framework relating transparency, stages of automation, and trust in automation is proposed, in which transparency plays a primary role in decisions to use automation but is supplemented by trust where information is otherwise insufficient. Hypothesized cultural effects and framework predictions were confirmed.
Trust in automation has become a topic of intensive study since the late 1990s and is of increasing importance with the advent of intelligent interacting systems. While the earliest trust experiments involved human interventions to correct failures or errors in automated control systems, a majority of subsequent studies have investigated information acquisition and analysis decision-aiding tasks, such as target detection, for which automation reliability is more easily manipulated. Despite the high level of international dependence on automation in industry, almost all existing studies have employed Western samples, primarily from the U.S. The present study addresses these gaps by running a large-sample experiment in three diverse cultures (U.S., Taiwan, and Turkey) using a "trust sensitive task" consisting of both automated control and target detection subtasks. This article presents results for the target detection subtask, for which reliability and task load were manipulated. The current experiments allow us to determine whether reported effects are universal or specific to Western culture, vary in baseline or magnitude, or differ across cultures. Results generally confirm consistent effects of the manipulations across the three cultures, as well as cultural differences in initial trust and variation in the effects of manipulations consistent with 10 cultural hypotheses based on Hofstede's Cultural Dimensions and Leung and Cohen's theory of Cultural Syndromes. These results provide critical implications and insights for correct trust calibration and for enhancing human trust in intelligent automation systems across cultures. Additionally, our results may inform the design of intelligent systems for users of different cultures.
Our article presents the following contributions. First, to the best of our knowledge, this is the first set of studies to address cultural factors across all the cultural syndromes identified in the literature by comparing trust in Honor, Face, and Dignity cultures. Second, this is the first set of studies to use a validated cross-cultural trust measure for measuring trust in automation. Third, our experiments are the first to study the dynamics of trust across cultures.