Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts
DOI: 10.1145/2968120.2987750
Assessing Computational Thinking in Students' Game Designs

Cited by 20 publications (11 citation statements)
References 10 publications
“…The evaluation task was given as an essay test of 16 items covering 8 CT skill indicators. According to Hoover et al (2016), CT assessments can potentially be automated to encourage the development of CT skills.…”
Section: Results
confidence: 99%
“…It also provides instant feedback and acts as a tutorial about how one can improve his program, which makes it especially adequate for self-assessment. Hoover et al (2016) believe that automated assessment of CT can potentially encourage CT development. However, Dr. Scratch only considers the complexity of programs, not their meaning.…”
Section: Assessment of Computational Thinking
confidence: 99%
“…al. [200] who found that the quantitative (Dr. Scratch) and qualitative results differed greatly. Students designed games based around climate change, and although Dr. Scratch gave similar scores based on its metrics, more complex game design and more realistic representations of climate change were found in each game (these were taken as indicators of higher levels of CT based on [20]).…”
Section: Introduction
confidence: 99%