2017
DOI: 10.1007/s11257-017-9188-z

Affective learning: improving engagement and enhancing learning with affect-aware feedback

Abstract: This paper describes the design and ecologically valid evaluation of a learner model that lies at the heart of an intelligent learning environment called iTalk2Learn. A core objective of the learner model is to adapt formative feedback based on students' affective states. Types of adaptation include what type of formative feedback should be provided and how it should be presented. Two Bayesian networks trained with data gathered in a series of Wizard-of-Oz studies are used for the adaptation process. This paper…
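The adaptation described in the abstract, two Bayesian networks that decide which type of formative feedback to give and how to present it, can be illustrated with a minimal Python sketch. All affect states, feedback types, and probability values below are hypothetical stand-ins, not the networks from the paper, which learned their conditional probability tables from Wizard-of-Oz data:

    # Hypothetical sketch of affect-driven feedback adaptation with two tiny
    # conditional probability tables (CPTs). Names and numbers are illustrative
    # assumptions, not values from the iTalk2Learn learner model.

    # First "network": which feedback type to give, conditioned on affect.
    P_FEEDBACK_GIVEN_AFFECT = {
        "bored":    {"instruction": 0.2, "prompt": 0.3, "affect_boost": 0.5},
        "confused": {"instruction": 0.6, "prompt": 0.3, "affect_boost": 0.1},
        "in_flow":  {"instruction": 0.1, "prompt": 0.7, "affect_boost": 0.2},
    }

    # Second "network": how to present it, interrupting the task or not.
    P_INTERRUPT_GIVEN_AFFECT = {"bored": 0.7, "confused": 0.8, "in_flow": 0.1}

    def select_feedback(affect: str) -> tuple[str, bool]:
        """Return the most probable feedback type and presentation mode."""
        cpt = P_FEEDBACK_GIVEN_AFFECT[affect]
        feedback = max(cpt, key=cpt.get)  # MAP choice of feedback type
        interrupt = P_INTERRUPT_GIVEN_AFFECT[affect] > 0.5
        return feedback, interrupt

    print(select_feedback("confused"))  # ('instruction', True)

Each hand-filled table here plays the role of one trained network, and selection is reduced to a maximum a posteriori lookup rather than full Bayesian inference.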

Cited by 71 publications (51 citation statements)
References 48 publications
“…Multimodality has been studied for around three decades in the context of social semiotics, and its potential to help us understand the world around us led AI researchers to try and build models that can process information from multiple modalities through machine learning and social signal processing (Vinciarelli, Pantic, & Bourlard, 2009). The literature on multimodal prediction is rich with examples of audio-visual speech recognition (Zhou & De la Torre, 2012), multimedia content indexing and retrieval (Atrey, Hossain, El Saddik, & Kankanhalli, 2010), and multimodal affect recognition (D'Mello & Kory, 2015; Grawemeyer et al., 2017). Learning from multimodal data provides opportunities to gain an in-depth understanding of complex processes and, for AI research to make progress, it should focus on multimodal AI models that can process and relate information from multiple modalities (Baltrušaitis, Ahuja, & Morency, 2019).…”
mentioning, confidence: 99%
“…However, relatively few of these models have actually been built into running systems, and fewer still have been used to drive affective intervention, as noted in a review by D'Mello and his colleagues (D'Mello et al., 2014). There has been additional work since then (see, for instance, Grawemeyer et al., 2017), but the relatively small number of examples represents an argument that this combination of factors is difficult to bring together.…”
Section: GIFT as a Testbed to Build and Embed Affect Sensitivity (mentioning, confidence: 99%)
“…AutoTutor responded to negative student affect with encouraging and supportive messages; a randomized experiment determined that it led to better learning outcomes for learners with initially low domain knowledge. In related work within a speech-based intelligent tutor, Grawemeyer et al. (2017) found that affective support based on automated detection of student affect reduced boredom and off-task behavior.…”
Section: GIFT as a Testbed to Build and Embed Affect Sensitivity (mentioning, confidence: 99%)
“…Analysis of students' speech and of their interaction with the exploratory learning environment is used to detect their affective states. Adaptive support is then provided based on those states [4].…”
Section: User Study (mentioning, confidence: 99%)
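The pipeline in the statement above, speech and interaction analysis feeding an affect detector whose output drives adaptive support, can be sketched as follows. The feature names, thresholds, and support rules are illustrative assumptions standing in for the system's trained detector:

    # Hypothetical end-to-end flow: speech and interaction features feed an
    # affect detector, whose output selects the adaptive support action.
    # All features, thresholds, and rules below are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        speech_rate_wpm: float   # words per minute from the speech analysis
        idle_seconds: float      # time since the last action in the environment
        errors_in_a_row: int     # consecutive incorrect steps

    def detect_affect(x: Interaction) -> str:
        """Toy rule-based stand-in for the trained affect detector."""
        if x.errors_in_a_row >= 3:
            return "confused"
        if x.idle_seconds > 60 and x.speech_rate_wpm < 20:
            return "bored"
        return "in_flow"

    def adaptive_support(affect: str) -> str:
        """Map the detected affect state onto a support action."""
        return {
            "confused": "offer a worked step and simplify the task",
            "bored": "switch to a more challenging task variant",
            "in_flow": "stay quiet and avoid interrupting",
        }[affect]

    state = detect_affect(Interaction(speech_rate_wpm=12, idle_seconds=90,
                                      errors_in_a_row=0))
    print(state, "->", adaptive_support(state))  # bored -> switch to a more challenging task variant

The rule-based detector is a deliberate simplification; the cited work detects affect from learned models over speech and interaction data rather than hand-written thresholds.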