2023
DOI: 10.1093/jcr/ucad014
Machine Talk: How Verbal Embodiment in Conversational AI Shapes Consumer–Brand Relationships

Abstract: This research shows that AI-based conversational interfaces can have a profound impact on consumer–brand relationships. We develop a conceptual model of verbal embodiment in technology-mediated communication that integrates three key properties of human-to-human dialogue – (1) turn-taking (i.e., alternating contributions by the two parties), (2) turn-initiation (i.e., the act of initiating the next turn in a sequence), and (3) grounding between turns (i.e., acknowledging the other party’s contribution by resta…

Cited by 37 publications
(10 citation statements)
References 79 publications
“…Second, zooming in on mind perception, our findings also provide a more nuanced understanding of how the attribution of a mind toward nonhuman agents goes beyond mere visual anthropomorphic appearance (Bergner et al., 2023; Schmitt, 2019; Yang et al., 2020). While mind perception can be considered an aspect of anthropomorphism, recent theoretical work suggests that anthropomorphism may exist on a continuum (Yang et al., 2020) ranging from shallow forms of anthropomorphism that are predominantly based on visual similarities to humans (e.g., facial features; Aggarwal & McGill, 2007) to deeper forms of anthropomorphism based on similar human mental capacities, such as the ability to think and have a conscious experience (Gray et al., 2007).…”
Section: Conceptual and Substantive Contributions
confidence: 83%
“…How to cite this article: Hartmann, J., Bergner, A., & Hildebrand, C. (2023). MindMiner: Uncovering linguistic markers of mind perception as a new lens to understand consumer-smart object relationships.…”
Section: Conclusion
confidence: 99%
“…In multiturn interactions, interlocutors engage in back-and-forth turn-taking, taking information received into account for the following turn (e.g., Burggräf et al 2022; Ke et al 2022). For instance, conversations with text or voice-based chatbots constitute instances of multiturn interactions (Bergner, Hildebrand, and Häubl 2019; Dellaert et al 2020; Luo et al 2019). In these types of interactions, technology acts as a conversational agent.…”
Section: Verbal Disclosure in Oral versus Manual Interactions with Te…
confidence: 99%
“…Relatedly, presenting product choices through a voice-based (vs. text-based) interaction can increase cognitive difficulty in information processing, leading to detrimental effects for consumers (Munz 2020; for a more detailed review of the effects of interacting with different modalities, see King, Auschaitrakul, and Lin [2022]). However, enhancing the verbal abilities of conversational agents, such as increasing the extent of signaling mutual understanding or grounding, can lead to more intimate consumer–brand interactions (Bergner, Hildebrand, and Häubl 2023).…”
Section: Theoretical Background
confidence: 99%