We consider the pairing of audio chatbot technologies with voice-based deep fakes, that is, voice clones, examining the potential of this combination to induce consumer trust. We report on a set of controlled experiments based on the investment game, evaluating how voice cloning and chatbot disclosure jointly affect participants’ trust, reflected by their willingness to play with an autonomous, AI-enabled partner. We observe evidence that voice-based agents garner significantly greater trust from subjects when imbued with a clone of the subject’s voice. Recognizing that these technologies present not only opportunities but also the potential for misuse, we further consider the moderating impact of AI disclosure, a recent regulatory proposal advocated by some policymakers. We find no evidence that AI disclosure attenuates the trust-inducing effect of voice clones. Finally, we explore underlying mechanisms and contextual moderators for the trust-inducing effects, with an eye toward informing future efforts to manage and regulate voice-cloning applications. We find that a voice clone’s effects operate, at least in part, by inducing a perception of homophily and that the effects are increasing in the clarity and quality of generated audio. Implications of these results for consumers, policymakers, and society are discussed.

This paper was accepted by D. J. Wu for the Special Issue on the Human-Algorithm Connection.

Funding: This work was supported by funding from the University of Wisconsin-Milwaukee Research Assistance Fund.

Supplemental Material: The online appendix and data files are available at https://doi.org/10.1287/mnsc.2022.03316.