2013
DOI: 10.1080/00140139.2013.822566

Similarities and differences of emotions in human–machine and human–human interactions: what kind of emotions are relevant for future companion systems?

Abstract: Cognitive-technical intelligence is envisioned to be constantly available and capable of adapting to the user's emotions. However, the question is: what specific emotions should be reliably recognised by intelligent systems? Hence, in this study, we have attempted to identify similarities and differences of emotions between human-human (HHI) and human-machine interactions (HMI). We focused on what emotions in the experienced scenarios of HMI are retroactively reflected as compared with HHI. The sample consiste…

Cited by 26 publications (13 citation statements)
References 30 publications
“…However, fine-tuned trust regulation mechanisms are useful in overcoming such trust violations (de Visser et al., 2016), allowing for feasible adjustments like trust calibration in contrast to costly solutions like technology disregard. Another novel challenge for trust evaluations comes from computers often being endowed with human-like capabilities that have been shown to bring forth natural human responses (Sidner et al., 2005; Walter et al., 2014; Krämer et al., 2015). In numerous studies, people repeatedly responded to computers as if they were social actors and applied social scripts, norms, and attributions to them (Nass et al., 1994, 1995).…”
Section: Theoretical Background (mentioning)
confidence: 99%
“…The adjectives used for our investigation were justified by related work and deliberately chosen based on recent research. Our choice is based on the work of Walter et al. [27], who explicitly focus on scenarios involving human-machine interactions and human-human interactions. Table 3 lists the pairs of opposites that make up the semantic differential.…”
Section: Design of the Questionnaire (mentioning)
confidence: 99%
“…For instance, emotion recognition by facial expression, which aims to model visually distinguishable facial movements [3]; by speech, for which researchers utilize acoustic features such as pitch, intensity, duration, and spectral data [4]; and by physiological data, such as heart rate and sweat [5]. In the past two decades, a substantial amount of research on affective computing has been conducted in the field of Human-Computer Interaction (HCI) [1, 6–22], and it has also been recognized in applied settings (e.g., in tutoring system research [23–35]).…”
Section: Introduction (mentioning)
confidence: 99%