2019
DOI: 10.1037/aca0000227
Scoring divergent thinking tests: A review and systematic framework.

Abstract: Divergent thinking tests are often used in creativity research as measures of creative potential. However, measurement approaches across studies vary to a great extent. One facet of divergent thinking measurement that contributes strongly to differences across studies is the scoring of participants' responses. Most commonly, responses are scored for fluency, flexibility, and originality. However, even with respect to only one dimension (e.g., originality), scoring decisions vary extensively. In the current wor…

Cited by 298 publications (357 citation statements)
References 49 publications
“…Although each method has shown some degree of utility for creativity research, each comes with challenges and limitations. Two challenges that are common to most creativity assessments are subjectivity (raters don't always agree on what's creative) and labor cost (raters often have to score thousands of responses by hand)-both of which pose threats to the reliable and valid assessment of creativity (Barbot, 2018;Forthmann, Holling, Zandi, et al, 2017;Reiter-Palmon, Forthmann, & Barbot, 2019). To address these issues, researchers have begun to explore whether the process of scoring responses for their creative quality can be automated and standardized using computational methods, and preliminary evidence suggests that such tools can yield reliable and valid indices of creativity (Acar & Runco, 2014;Dumas, Organisciak, & Doherty, 2020;Heinen & Johnson, 2018;Kenett, 2019;Prabhakaran, Green, & Gray, 2014).…”
Section: An Open Platform For Computing Semantic Distance
confidence: 99%
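The automated scoring approach described in the statement above, quantifying a response's originality by its semantic remoteness from the prompt, can be sketched in a few lines. This is an illustrative toy, not the actual platform or any published pipeline: the vocabulary and count vectors below are invented, and real systems derive vectors from trained word embeddings.

```python
# Toy semantic-distance scoring: originality is approximated as
# 1 - cosine similarity between a prompt vector and a response vector.
# Vectors here are hand-built counts over an invented shared vocabulary;
# real systems use trained word embeddings (an assumption of this sketch).
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def semantic_distance(u, v):
    # Higher distance = more semantically remote = scored as more original.
    return 1.0 - cosine(u, v)

prompt   = [1, 1, 0, 0]  # e.g., the cue word "brick"
common   = [1, 1, 1, 0]  # a mundane use, close to the prompt
original = [0, 0, 1, 1]  # a remote use, far from the prompt

# A remote response receives a higher distance score than a mundane one.
assert semantic_distance(prompt, original) > semantic_distance(prompt, common)
```

The ranking, not the absolute distance, is what matters: responses are ordered by remoteness, which is why such scores can be validated against human originality ratings.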
“…Third, I measured creativity with a divergent thinking task. Although this is a common way to measure creativity, this type of task more accurately measures creative potential (Runco, 2010), or the capacity for idea generation (Reiter-Palmon et al, 2019). Moreover, there are many considerations to be made when using divergent thinking tests (e.g., instructions, time, and scoring, etc.…”
Section: Limitations and Future Research
confidence: 99%
“…Indeed, usually, divergent-exploratory thinking tasks are scored for fluency, flexibility and originality (Reiter-Palmon et al., 2019) and these scores are interrelated (Forthmann et al., 2018a, 2018b; Nusbaum & Silvia, 2011). In this study, our divergent thinking tasks were scored according to the EPoC Manual (Lubart et al., 2011) only for fluency (number of different ideas); the participants' flexibility has not yet been measured, nor has their originality.…”
Section: Discussion
confidence: 97%
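The three scoring dimensions recurring across these statements (fluency, flexibility, originality) can be made concrete with a small sketch. The category labels, the sample-frequency rule for originality, and the 5% rarity cutoff below are illustrative choices for this example only, not the specific scoring rules reviewed in the paper.

```python
# Hedged sketch of the three common divergent-thinking scores.
# Category labels and the rarity-based originality rule are assumptions
# made for illustration; actual scoring schemes vary across studies.

def score_responses(responses, sample_frequencies, rare_cutoff=0.05):
    """responses: list of (idea, category) pairs for one participant.
    sample_frequencies: proportion of the sample giving each idea."""
    fluency = len(responses)                          # number of ideas generated
    flexibility = len({cat for _, cat in responses})  # number of distinct categories
    originality = sum(                                # ideas rare in the sample
        1 for idea, _ in responses
        if sample_frequencies.get(idea, 0.0) <= rare_cutoff
    )
    return {"fluency": fluency, "flexibility": flexibility,
            "originality": originality}

# Hypothetical "uses for a brick" data for one participant.
freqs = {"doorstop": 0.40, "weapon": 0.20, "grind into pigment": 0.01}
person = [("doorstop", "weight"), ("weapon", "weight"),
          ("grind into pigment", "material")]
print(score_responses(person, freqs))
# → {'fluency': 3, 'flexibility': 2, 'originality': 1}
```

Because all three indices are computed from the same response list, fluency mechanically bounds the other two, which is one reason the literature finds them interrelated.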
“…Observations have shown that the more ideas people have (reflecting ideational fluency), the greater the probability they will obtain high-quality ideas. This explains why many psychometric measures of creative potential only derive a score of ideational fluency based on the number of ideas generated (Lubart et al., 2015; Reiter-Palmon, Forthmann, & Barbot, 2019). Divergent-exploratory thinking is often considered an initial phase of the creative process (Csikszentmihalyi, 2006; Runco, 2008).…”
Section: Creativity
confidence: 99%