2021
DOI: 10.3389/fpsyg.2021.580955
Revisiting Human-Agent Communication: The Importance of Joint Co-construction and Understanding Mental States

Abstract: The study of human-human communication and the development of computational models for human-agent communication have diverged significantly throughout the last decade. Yet, despite frequently made claims of “super-human performance” in, e.g., speech recognition or image processing, so far, no system is able to lead a half-decent coherent conversation with a human. In this paper, we argue that we must start to re-consider the hallmarks of cooperative communication and the core capabilities that we have develop…



Cited by 33 publications (24 citation statements)
References 84 publications (102 reference statements)
“…In general, however, one has to consider whether this sort of proactive or cooperative system behavior is possible under the paradigm of one-shot request-response interactions that can be seen in commercially available voice assistants today (Porcheron et al, 2018). We thus conjecture that future speech-based agents will require additional capabilities that would allow understanding the current interaction context and the mental states and knowledge level of the user, through some sort of joint co-construction and mentalizing occurring incrementally over the course of the interaction (Kopp and Krämer, 2021).…”
Section: Discussion (mentioning, confidence: 99%)
“…Several self-report measures have been created to assess users' experiences of presence and immersive tendencies (e.g., Witmer & Singer, 1998; for review, see Oh et al, 2018). Kopp and Krämer (2021) have considered whether these signals, which originated in analyses of human-human interaction, apply well to human-agent communication.…”
Section: Detection and Measurement of Engagement (mentioning, confidence: 99%)
“…As voice-based assistants fail in dialogues beyond one-shot interactions, there is a growing need and motivation to adapt aspects of the ToM concept for conversational assistants (Wang et al 2021;Kopp and Krämer 2021). Existing neural models for question answering do not succeed at false-belief tasks, such as the classic Sally-Anne-Experiment (Baron-Cohen, Leslie, and Frith 1985), as was shown in an article by Nematzadeh and colleagues (Nematzadeh et al 2018), where the researchers created a dataset of tasks that can be used for the evaluation of question answering neural models (such as memory networks, the examples of which were shown in chapter 3.1) with regards to belief reasoning.…”
Section: Modelling Knowledge About Beliefs for ToM in Human-Agent Int... (mentioning, confidence: 99%)
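The citation statement above refers to evaluating question-answering models on false-belief tasks such as the classic Sally-Anne experiment. The sketch below illustrates what one such evaluation item might look like; the field names, story wording, and scoring logic are hypothetical illustrations in the spirit of the Nematzadeh et al. (2018) dataset, not its actual format.

```python
# Hypothetical sketch of a Sally-Anne-style false-belief QA item.
# A model that only tracks world state (not agents' beliefs) will
# answer the reality question correctly but fail the belief question.

story = [
    "Sally puts the marble in the basket.",
    "Sally leaves the room.",
    "Anne moves the marble to the box.",
    "Sally returns.",
]

item = {
    "story": story,
    # Reality question: where is the marble actually?
    "reality_question": ("Where is the marble?", "box"),
    # First-order false-belief question: Sally did not observe the
    # move, so her belief about the marble's location is outdated.
    "belief_question": ("Where will Sally look for the marble?", "basket"),
}

def evaluate(model_answer: str, expected: str) -> bool:
    """Score a model's answer against the expected location."""
    return model_answer.strip().lower() == expected.lower()

# A purely world-state-tracking model answers "box" to both questions:
world_state_answer = "box"
print(evaluate(world_state_answer, item["reality_question"][1]))  # True
print(evaluate(world_state_answer, item["belief_question"][1]))   # False
```

The point of such items is that correct belief reasoning requires representing what each agent has observed, not just the latest state of the world.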
“…However, when it comes to social interaction, it might not be enough to update mental states based on explicit actions of others in the world. People can change their mental state because of dialogues they have with others and it is important for conversational agents to be able to capture that as well (Kopp and Krämer 2021).…”
Section: Modelling Knowledge About Beliefs for ToM in Human-Agent Int... (mentioning, confidence: 99%)