Previous research suggests that the increased attribution of agency to robots may be linked to negative attitudes toward robots. If robots are truly expected to assume various roles in our social environment, it is necessary to further explore how increasing agency, for example through increasing levels of autonomy, affects attitudes toward them. This study investigates the role of perceived control as a moderator explaining attitudes toward attributed agency in a collaboration context. Austrian-based participants (N = 102) watched a video of a robot collaborating with a person to assemble a mixer—the robot was presented as either agentic and capable of proactively collaborating with the human or non-agentic and only capable of following human commands. The results show that attributing high levels of agency to robots is associated with negative attitudes toward them when individuals perceive low control during the collaboration.
Collaborative industrial robotic arms (cobots) are integrated into industrial assembly systems, relieving their human coworkers of monotonous tasks and achieving productivity gains. The question of task allocation arises in the organization of these human-robot interactions. The state of the art shows static, compensatory task allocation approaches in current assembly systems and flexible, adaptive task sharing (ATS) approaches in human factors research. The latter aim to exploit the economic and ergonomic advantages of cobot usage. Previous research has not provided clear insight into whether industrial workers prefer static or adaptive task allocation, or which tasks workers prefer to assign to cobots. Therefore, we set up a cobot demonstrator with a realistic industrial assembly use case and conducted a user study with experienced workers from the shop floor (n = 25). The aim of the user study is to provide a systematic understanding and evaluation of workers' preferences in a practical context of human-robot interaction (HRI) in assembly. Our main findings are that participants preferred the ATS concept to a predetermined task allocation and reported increased satisfaction with the allocation. Results also show that participants are more likely to assign manual tasks to the cobot than cognitive tasks, indicating that workers do not entrust all tasks to robots but prefer to take on cognitive tasks themselves. This work contributes to the design of human-centered HRI in industrial assembly systems.
Strategies for improving the explainability of artificial agents are a key approach to supporting the understandability of artificial agents' decision-making processes and their trustworthiness. However, since explanations do not lend themselves to standardization, finding solutions that fit the algorithmic decision-making processes of artificial agents poses a compelling challenge. This paper addresses the concept of trust in relation to complementary aspects that play a role in interpersonal and human–agent relationships, such as users' confidence and their perception of artificial agents' reliability. In particular, this paper focuses on non-expert users' perspectives, since users with little technical knowledge are likely to benefit the most from "post-hoc", everyday explanations. Drawing upon the explainable AI and social sciences literature, this paper investigates how artificial agents' explainability and trust are interrelated at different stages of an interaction. Specifically, the possibility of implementing explainability as a trust-building, trust-maintenance, and trust-restoration strategy is investigated. To this end, the paper identifies and discusses the intrinsic limits and fundamental features of explanations, such as structural qualities and communication strategies. Accordingly, this paper contributes to the debate by providing recommendations on how to maximize the effectiveness of explanations for supporting non-expert users' understanding and trust.