Several studies have examined how social robots respond, gesture, and display emotion in human–robot interactions. In particular, the sociality of robots implies that robots not only exhibit human-like behaviors but also display a tendency to adapt to a group of individuals. For robots to exhibit sociality, they need to adapt to group norms without being told by the group members how to behave. In this study, we investigated the effect of group norms on human decision-making in human–robot groups comprising two robots that use our proposed robotic model. We conducted quizzes with unclear and vague answers, taken by the robots together with a human participant. We assessed this influence by having the participant and the two robots repeatedly answer the same quiz and observe each group member's answer. Additionally, we used a questionnaire to evaluate the extent to which the group norms changed the participants' opinions, and we analyzed the questionnaire results together with the chronological change in the answers to the same quiz question. The quiz results showed that the human participants changed their answers after first seeing the robots' answers, owing to social influence from the robots; we assume that the participants noticed the diversity of answers in the group, recognized that the robots were considering the group norm, and adjusted their own answers to conform to it. Moreover, the questionnaire results revealed that the group norms served the participants as right answers to a quiz that has no correct answer. Therefore, we conclude that robots' attempts to comply with a group norm affect human decision-making. INDEX TERMS Social robotics, human-robot interaction, human-robot group, social influence, group norm.
Herein, we proposed a robot model that obeys the norm of a group by interacting with the group members. Using this model, a robot system learns the group's norm as a group member itself. People with individual differences form a group, and with it a characteristic norm that reflects the group members' personalities. When robots join a group that includes humans, they need to obey this characteristic norm: a group norm. We investigated whether the robot system generates a decision-making criterion for obeying group norms by learning from interactions through reinforcement learning. In this experiment, the human group members and the robot system answered the same easy quizzes, each of which admitted several vague answers. When the group members initially answered differently from one another, we investigated whether they subsequently answered the quizzes while considering the group norm. To avoid bias toward the system's answers, one participant in each group only relayed the system's decisions, whereas the other participants were unaware of the system. Our experiments revealed that a group comprising the participants and the robot system forms group norms. The proposed model enables a social robot to make decisions socially, adjusting its behavior to common sense not only in a large human society but also in smaller human groups, e.g., local communities. We therefore presume that such robots can join human groups by interacting with their members and adapt to these groups by adjusting their own behaviors. However, further studies are required to reveal whether the robots' answers affect people, and whether participants can form a group norm based on a robot's answers even when they know they are interacting in a group that includes a real robot. Moreover, in our setup some participants did not know that another participant only obeyed the system's decisions and merely pretended to answer the questions, a design intended to prevent biased answers.
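The norm-learning mechanism described above can be illustrated with a minimal reinforcement-learning sketch. This is not the authors' implementation; the class name, learning rate, and reward scheme (treating the fraction of group members choosing an answer as the reward signal) are illustrative assumptions.

```python
import random
from collections import Counter, defaultdict


class NormLearningAgent:
    """Hypothetical sketch of an agent that drifts toward a group norm
    by reinforcing answers that other group members also chose."""

    def __init__(self, options, alpha=0.3):
        self.options = list(options)      # candidate quiz answers
        self.alpha = alpha                # learning rate
        self.value = defaultdict(float)   # learned preference per answer

    def answer(self):
        # Pick the currently highest-valued answer; break ties randomly.
        best = max(self.value[o] for o in self.options)
        candidates = [o for o in self.options if self.value[o] == best]
        return random.choice(candidates)

    def observe(self, group_answers):
        # Reward each answer by the fraction of group members who chose
        # it, nudging the agent's preferences toward the group norm.
        counts = Counter(group_answers)
        for o in self.options:
            reward = counts[o] / len(group_answers)
            self.value[o] += self.alpha * (reward - self.value[o])
```

For example, if the agent repeatedly observes a group in which most members answer "red" to a vague quiz question, its preference for "red" grows until it answers "red" itself, mirroring how the robot system comes to comply with the emerging norm.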
At present, a number of studies examine social robots' ways of responding, gesturing, and displaying emotion. However, sociality implies that robots not only exhibit human-like behaviors, but also display the tendency to adapt to a group of individuals. For robots to exhibit sociality, they must adapt to group norms without being told by the group members how to behave. Group norms refer to the unwritten, unspoken, and informal rules that are present in a group of individuals. In a previous study, we demonstrated that a robot model learned group norms in human groups [1]. In the present study, we investigate whether group norms occur in human–robot groups. To this end, we prepared quizzes with unclear and vague answers, and instructed participants to take the quizzes with the robot. The results of the quiz experiments demonstrated that the robot considered group norms in human–robot groups when making decisions; thus, group norms occurred in human–robot groups.