2018 15th International Conference on Ubiquitous Robots (UR)
DOI: 10.1109/urai.2018.8441775

Adaptive Framework for Emotional Engagement in Child-Robot Interactions for Autism Interventions

Cited by 10 publications (23 citation statements)
References 10 publications
“…Rudovic et al. (2018) implemented deep learning in a robot for ASD therapy to automatically estimate children's valence, arousal and engagement levels. Javed et al. (2018) utilized multimodal perception, including the analyses of children's motion, speech, and facial expression, to estimate children's emotional states. Using multiple feedback modalities may overload users with redundant information, increase task completion time, and reduce the efficiency of cognitive training (Taranović et al., 2018c).…”
Section: Results
confidence: 99%
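The two systems quoted above estimate a child's emotional state by combining several perceptual channels. As a rough sketch only, and not the actual pipeline of Rudovic et al. (2018) or Javed et al. (2018), a simple late-fusion step might average per-modality valence/arousal estimates; the modality names and weights below are hypothetical.

import numpy as np

def fuse_emotion_estimates(modality_scores, weights=None):
    # Late fusion of per-modality (valence, arousal) estimates in [-1, 1].
    # modality_scores: dict such as {'face': (v, a), 'speech': (v, a), ...}.
    # weights: optional per-modality reliability (hypothetical); defaults to uniform.
    names = list(modality_scores)
    scores = np.array([modality_scores[m] for m in names], dtype=float)
    if weights is None:
        w = np.full(len(names), 1.0 / len(names))
    else:
        w = np.array([weights[m] for m in names], dtype=float)
        w = w / w.sum()
    return tuple(w @ scores)  # weighted mean over modalities

# Example: fuse face, speech, and motion estimates into one (valence, arousal) pair.
print(fuse_emotion_estimates({'face': (0.4, 0.2), 'speech': (0.1, 0.6), 'motion': (-0.2, 0.5)}))

In practice the per-modality estimates would come from learned models rather than hand-set numbers; the fusion step itself can also be learned.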
“…For example, Tsiakas et al. (2016) used interactive reinforcement learning methods to facilitate adaptive robot-assisted therapy, that is, to adapt the task difficulty level and task duration to users with different skill levels (e.g., expert or novice user), in the context that users need to perform a set of cognitive or physical training tasks. Javed et al. (2018) developed a Hidden Markov model (HMM) in their adaptive framework for child-robot interaction, aiming to enable a child with ASD to engage in robot-assisted ASD therapy over the long term. In their HMM, the states took into consideration a child's emotional state or mood, and the actions were the robot's behaviors or other audiovisual feedback.…”
Section: Results
confidence: 99%
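The HMM-based adaptation described in this excerpt pairs an inferred child state with a robot behavior. The sketch below is a minimal illustration of that idea, not the model from Javed et al. (2018): the state names, observation cues, transition/emission probabilities, and action mapping are all invented for the example. It forward-filters one observation at a time and returns the action associated with the most likely state.

import numpy as np

# Hypothetical hidden states (child's mood) and robot actions.
STATES  = ['disengaged', 'neutral', 'engaged']
ACTIONS = {'disengaged': 'play an attention-grabbing sound',
           'neutral': 'prompt with a simple imitation game',
           'engaged': 'continue the current activity'}

# Illustrative transition matrix A and emission matrix B (rows sum to 1).
A = np.array([[0.7, 0.2, 0.1],   # P(next state | current state)
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
B = np.array([[0.6, 0.3, 0.1],   # P(cue | state); cue 0 = looks away,
              [0.3, 0.4, 0.3],   # cue 1 = sits quietly,
              [0.1, 0.3, 0.6]])  # cue 2 = smiles / responds
belief = np.full(3, 1.0 / 3.0)   # uniform prior over states

def update_and_act(obs):
    # One step of the forward algorithm: predict, weight by emission, normalize.
    global belief
    belief = B[:, obs] * (A.T @ belief)
    belief = belief / belief.sum()
    return ACTIONS[STATES[int(np.argmax(belief))]]

for cue in [0, 0, 2, 2]:         # a toy sequence of observed cues
    print(update_and_act(cue))

In a deployed system the transition and emission probabilities would be estimated from annotated interaction data rather than set by hand.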
“…
Children with ASD: [10], [12], [13], [15]-[17], [19], [21]-[43]
Children with autism: [7]-[9], [12], [13], [19], [20], [34], [37], [41], [42], [44]-[47]
Autistic children: [8], [15], [19], [20], [36], [46], [48]-[52]
Children on/with autism spectrum: [22], [53]
ASD children: [54]
Autism children: [51]
TD children: [10], [12], [19], [21], [24], [26], [43], [47], [52]
Normal children: [8], [20], [36], [49], [51]
Neurotypical children: [13], [16], [23], [50]
Typical individuals…”
Section: Wording / Articles
confidence: 99%
“…
Humanoid NAO: [7], [8], [13], [15], [17], [20], [22], [24], [25], [32], [36], [37], [39]-[42], [47]-[49], [51], [55], [56]
ZECA: [12], [33], [50], [52], [54], [57]
Darwin-Mini: [7], [16], [23], [43], [58]
Kaspar: [9], [28], [37], [59]
iRobiQ: [30], [44], [45]
CARO: [30], [44], […]
[34] …manipulator with a basketball hoop attached to its end-effector. RBB displays different affective states by moving the hoop in 3D with different speed levels, accompanied with soft background music [34].…”
Section: Robot Type / Robot
confidence: 99%
“…A method for the automatic classification of engagement with a dynamic Bayesian network using visual and acoustic cues and support vector machine classifiers is described in [24]. Another approach considers the facial expressions of children with ASD to evaluate their engagement [25]. A robot-mediated joint attention intervention system that uses the child's gaze as input is presented in [26].…”
Section: Introduction
confidence: 99%
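The engagement classifier mentioned in [24] combines visual and acoustic cues with support vector machine classifiers. As an assumed, minimal sketch (the feature layout, window segmentation, and labels below are synthetic, not those of the cited work), a scikit-learn pipeline can standardize concatenated visual and acoustic features and train an RBF-kernel SVM.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical feature layout: each row concatenates visual cues (e.g. gaze,
# head pose) with acoustic cues (e.g. speech energy, pitch statistics).
# Real systems extract these from video/audio; here they are synthetic.
X_visual   = rng.normal(size=(200, 6))
X_acoustic = rng.normal(size=(200, 4))
X = np.hstack([X_visual, X_acoustic])
y = (X[:, 0] + X[:, 6] > 0).astype(int)   # toy engaged / not-engaged label

clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0))
clf.fit(X[:150], y[:150])                  # train on the first 150 segments
print('held-out accuracy:', clf.score(X[150:], y[150:]))

Scaling before the SVM matters because the visual and acoustic features typically live on very different numeric ranges; the dynamic Bayesian network stage of [24] is not reproduced here.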