2010
DOI: 10.1075/nlp.8.19bee
The use of affective and attentive cues in an empathic computer-based Companion

Abstract: Recently, a number of research projects have been started to create virtual agents that do not just serve as assistants to which tasks may be delegated, but that may even take on the role of a companion. Such agents require a great deal of social intelligence, such as the ability to detect the user's affective state and to respond to it in an empathic manner. The objective of our work is to create an empathetic listener that is capable of reacting to affective and attentive input cues from the user. In particular, …

Cited by 9 publications (6 citation statements) | References 15 publications
“…In our scenario we used two of them: feeling 'sad' for the visitor's loss-which triggers the robot's prompt reaction to help recover the bag-and feeling 'happy' once the bag is found. Hence, the robot's response strategy is to mimic the user's affective state (parallel empathy) and to immediately offer help (reactive empathy) [60].…”
Section: Designing Empathetic Reactions: I Can Feel What You (mentioning)
confidence: 99%
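The two empathy modes quoted above (mirror the detected affect, then attach a helping action) can be sketched as a simple response-selection function. This is an illustrative reconstruction, not code from the cited work; the function name, affect labels, and action strings are assumptions.

```python
# Hypothetical sketch of the parallel/reactive empathy strategy described
# in the citation statement: parallel empathy mirrors the user's affect,
# reactive empathy adds a situation-appropriate helping behaviour.

def empathic_response(user_affect: str) -> dict:
    """Map a detected user affect to a mirrored expression plus an action."""
    # Parallel empathy: display the same affective state as the user.
    mirrored = user_affect
    # Reactive empathy: a helping action tied to the scenario (illustrative).
    actions = {
        "sad": "offer to help recover the lost bag",
        "happy": "share in the visitor's relief",
    }
    return {"express": mirrored, "act": actions.get(user_affect, "acknowledge")}

print(empathic_response("sad"))
```

The design point is that the two components are independent: the mirrored expression comes straight from the recognizer, while the action table encodes scenario knowledge.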
“…For example, if a user is distressed because he or she was not able to solve a task in a tutoring system, an artificial agent would simply imitate the user's facial expression without knowing why the user is distressed. This is one of the behaviors realized by Bee et al. [6] with the attentive listener agent Alfred. They used EmoVoice [45] to detect emotional cues in the speaker's voice, from which they calculated the user's current mood tendency via the ALMA model of affect [16].…”
Section: Modeling and Simulating Empathy (mentioning)
confidence: 78%
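The pipeline described in that statement (per-utterance emotional cues from the voice, folded into a slowly drifting mood tendency) can be sketched with a one-dimensional smoothing update. ALMA actually models mood in PAD (pleasure-arousal-dominance) space; this single valence axis, the function name, and the pull constant are simplifying assumptions for illustration.

```python
# Minimal sketch, assuming a scalar valence in [-1, 1] per voice cue:
# each detected emotion pulls the current mood a fraction of the way
# toward its valence, so mood reflects a tendency rather than one cue.

def update_mood(mood: float, emotion_valence: float, pull: float = 0.2) -> float:
    """Drift the current mood toward the valence of the latest emotion cue."""
    return mood + pull * (emotion_valence - mood)

mood = 0.0  # neutral starting mood
for valence in [-0.8, -0.6, -0.7]:  # a run of negative voice cues
    mood = update_mood(mood, valence)
print(round(mood, 3))  # mood has drifted clearly negative
```

A small pull keeps the mood stable against single misclassified utterances, which matches the idea of a "mood tendency" rather than an instantaneous emotion.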
“…An example is the work by Bee et al. [6], who implemented affective empathy for the virtual Alfred agent by appraising the emotional state inferred from the user's tone of voice. Based on the OCC model [33], the agent perceived negative emotions as "bad event for good other" and positive emotions as "good event for good other".…”
Section: Modeling and Simulating Empathy (mentioning)
confidence: 99%
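The OCC-style appraisal quoted above maps the sign of the user's emotion, together with the agent's liking of the user, onto an appraisal tag. The tag strings follow the quote; the function signature and valence threshold are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of the appraisal: the observed user emotion is classified
# as a good or bad event happening to a liked ("good") other, yielding
# the OCC event-for-other categories named in the citation statement.

def appraise(user_emotion_valence: float, likes_user: bool = True) -> str:
    """Return an OCC appraisal tag for an observed user emotion."""
    event = "good event" if user_emotion_valence > 0 else "bad event"
    other = "good other" if likes_user else "bad other"
    return f"{event} for {other}"

print(appraise(-0.5))  # a detected negative emotion in the user's voice
```

In OCC terms, "bad event for good other" elicits pity-like emotions in the observer, which is what drives the agent's empathic display.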
“…But at the human/sex-robot level, parallels to unconscious emotional empathy may be all that is sought. The exception is with speech recognition and conversation (Bee et al. 2010). In this case, it is not only linguistic exchange that is involved; body movements and gestures are important as well (Novikova and Watts 2015).…”
Section: The Uncanny Valley (mentioning)
confidence: 99%