As robots become commonplace, people will need to trust them for successful human-robot interaction to occur. Two experiments were conducted using the "minimal group paradigm" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a "robot" or "computer" group and then played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised that participants in the "robot group" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their "computer group" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the "robot group", and three different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.