2016
DOI: 10.1111/bjet.12503

Retrospective cognitive feedback for progress monitoring in serious games

Abstract: Although the importance of cognitive feedback in digital serious games (DSG) is undisputed, we face some major design challenges. First, we do not know to what extent existing research guidelines apply when cognitive feedback risks distorting the delicate balance between learning and playing. Unobtrusive cognitive feedback has to be interspersed with gameplay. Second, many of the effective solutions for providing cognitive feedback that we do know might simply be too costly. To face both chall…

Cited by 19 publications (10 citation statements)
References 37 publications
“…We used 13 items of the validated Intrinsic Motivation Inventory (IMI) questionnaire (Ryan & Deci, 2000) to measure motivation. We used nine items from a validated questionnaire (Nadolski & Hummel, 2017) to measure learnability, used six items from the validated Usability Metric for User eXperience (UMUX) questionnaire (Lewis et al., 2015) to measure usability, and developed seven items to measure attitude towards playing games. In order to calculate internal consistency and to compare average scores across scales, all scaled responses to items were (re)coded in the same direction (from “totally disagree” to “totally agree”), and maximum scores were standardized to 100% (or 100 points).…”
Section: Methods
confidence: 99%
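The reverse-coding and 100-point standardization described in the quoted methods can be sketched as follows; the function name and parameters are illustrative, not the authors' code:

```python
def standardize(responses, reverse_items=(), scale_max=5):
    """Reverse-code the flagged 5-point Likert items so every item runs
    from "totally disagree" to "totally agree", then express the total
    as a percentage of the maximum attainable score (0-100 points)."""
    recoded = [(scale_max + 1 - v) if i in reverse_items else v
               for i, v in enumerate(responses)]
    return 100.0 * sum(recoded) / (len(recoded) * scale_max)
```

With all responses at the scale maximum, the standardized score is 100 points regardless of how many items are reverse-worded.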
“…Questionnaires: The sixteen MC items on psychology practice were developed by members of our own Psychology staff. We used all 23 items of the validated e-flow questionnaire ([14]) to measure perceived flow in online learning; for this study we developed 19 items to measure perceived authenticity, used all 13 items of the validated IMI questionnaire ([15]) to measure motivation, used nine items from a validated questionnaire ([16]) to measure learnability, used two scales (six items) from the validated UMUX questionnaire ([17]) to measure usability, and finally developed seven items to measure attitude towards playing games. Cronbach's alphas were 'good' to 'excellent' for all 5-point Likert scales: α = .935 for flow, α = .950 for authenticity, α = .925 for motivation, α = .816 for learnability, α = .854 for usability, and α = .817 for attitude; based on the variance of scores, all scales appear to have discriminative power.…”
Section: Methods
confidence: 99%
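Cronbach's alpha, the internal-consistency statistic reported in the quoted passage, can be computed directly from raw item responses. A minimal sketch (illustrative only, not the authors' analysis code):

```python
import statistics

def cronbach_alpha(scores):
    """Cronbach's alpha: scores is a list of rows, one row of item
    responses per respondent (all rows the same length).
    alpha = (k / (k - 1)) * (1 - sum(item variances) / total variance)."""
    k = len(scores[0])                                   # number of items
    item_vars = [statistics.pvariance([row[i] for row in scores])
                 for i in range(k)]
    total_var = statistics.pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Because the same variance definition is used in numerator and denominator, population and sample variance give identical alpha values; perfectly correlated items yield alpha = 1.0.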
“…Other scholars have investigated growth in learner understanding of specific problem‐solving situations (Ifenthaler et al, 2009; Nadolski & Hummel, 2017; Schlomske & Pirnay‐Dummer, 2008; Spector & Koszalka, 2004). For example, Ifenthaler et al (2009) investigated how knowledge structure of a problem situation can change through instruction.…”
Section: Design Framework
confidence: 99%
“…For example, Ifenthaler et al (2009) investigated how knowledge structure of a problem situation can change through instruction. Nadolski and Hummel (2017) experimented with a game‐based learning system that assessed a series of problem‐solving behaviors and provided cognitive feedback to guide complex skills acquisition in students. Similarly, we focused on learning progress in a problem‐solving task, a short‐term change that involves movement toward a solution and formative feedback catered to individual students.…”
Section: Design Framework
confidence: 99%