Little is known about public opinion of autonomous robots. Trust in these robots is a pertinent topic, as this construct relates to one's willingness to be vulnerable to such systems. The current research examined gender-based effects on trust in the context of an autonomous security robot. Participants (N = 200; 63% male), recruited via Amazon's Mechanical Turk, viewed a video depicting an autonomous guard robot interacting with humans. The robot was equipped with a non-lethal device to deter unauthorized visitors, and the video depicted the robot using this device on one of the three humans in the video. However, the scenario was designed to create uncertainty about who was at fault: the robot or the human. Following the video, participants rated their trust in the robot, the perceived trustworthiness of the robot, and their desire to use similar autonomous robots in several contexts ranging from military to commercial to home use. The results demonstrated that females reported higher trust in, and perceived trustworthiness of, the robot relative to males. Implications for the role of individual differences in trust of robots are discussed.
Objective: This research examined the effects of reliability and stated social intent on trust, trustworthiness, and one's willingness to endorse use of an autonomous security robot (ASR). Background: Human–robot interaction in the domain of security is plausible, yet we know very little about what drives acceptance of ASRs. Past research has relied on static images and game-based simulations to depict robots rather than actual humans interacting with actual robots. Method: A video depicted an ASR interacting with humans. The ASR reviewed access credentials and allowed entrance once they were verified; if it could not verify a visitor's credentials, it instructed the visitor to return to the security checkpoint. The ASR was equipped with a nonlethal device and used this device on one of the three visitors (a research confederate). Reliability and stated social intent of the ASR were manipulated in a 2 × 4 between-subjects design (N = 320). Results: Reliability influenced trust and trustworthiness. Stated social intent influenced trustworthiness. Participants reported being more favorable toward use of the ASR in military contexts than in public contexts. Conclusion: The study demonstrated that the reliability of the ASR and statements regarding its social intent are important considerations influencing the trust process (inclusive of intentions to be vulnerable and trustworthiness perceptions). Application: If robotic systems are authorized to use force against humans, public acceptance may be increased by making the robot's intent-based programming available and by whether or not the robot's decision was reliable.