2017
DOI: 10.1002/hbm.23851

Affective creativity meets classic creativity in the scanner

Abstract: The investigation of neurocognitive processes underlying more real-life creative behavior is among the greatest challenges in creativity research. In this fMRI study, we addressed this issue by investigating functional patterns of brain activity while participants were required to be creative in an affective context. Affective creativity was assessed in terms of individual's inventiveness in generating alternative appraisals for anger-evoking events, which has recently emerged as a new ability concept in cogni…


Cited by 43 publications (44 citation statements).
References 96 publications (167 reference statements).
“…In MTCI, this feature is proposed in the intent to (1) limit stimulus-dependency (range of stimuli offered), and (2), control the number of responses generated (addressing fluency-originality dependency and trade-off; Zarnegar et al, 1988). Although such multi-trial single-response formats showed high reliability, predictive validity (Prabhakaran et al, 2014) and convergent validity with multi-response tasks (Perchtold et al, 2018), it has been criticized for its loss of open-endedness and potential for tracking iterative CI processes (Mouchiroud and Lubart, 2001; Hass, 2017). Yet, while both formats engage DT, observable responses uncover only one’s reported ideas which, as noted above, is insufficient to genuinely track the time-course of CI.…”
Section: MTCI Framework (mentioning)
confidence: 99%
“…Neuroscience studies have often adapted DT tasks in a way that separates CI from response production time generally confounded in DT scores (Benedek et al, in press). Regrettably, it has resulted in overly constrained paradigms, imposing rigid time-structures for different phases of CI (e.g., 15 s “think time”, 10 s response time; Ellamil et al, 2012; Perchtold et al, 2018; Rominger et al, 2018) or requiring subjects to actively “signal” an idea (Heinonen et al, 2016; Boot et al, 2017). Consistent with recent computerized assessments (Hart et al, 2017; Loesche et al, 2018), log-analysis of test-takers’ interactions with MTCI tasks can inform a more realistic chronology of broad, qualitatively distinct phases of CI (Figure 1): (1) Exploration – response formulation, or “thinking” phase, measured by the time between stimulus presentation (timestamp a) and the onset of the response marked by the first interaction with the digital-platform (e.g., screen-touch, or typing; timestamp b) – (2) Production : response production phase, measured by the time between the first (timestamp b) and the last (timestamp c) interaction with the platform in producing the response (e.g., finger-doodling for graphic responses, typing text for verbal responses) – (3) Verification : “control” phase in which the produced response is being validated or discarded, measured by the time between the last interaction to produce the response (timestamp c), and the action (e.g., click) to validate the response/move on to next item (timestamp d).…”
Section: MTCI Framework (mentioning)
confidence: 99%
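The citation statement above describes deriving three qualitatively distinct CI phases from four interaction-log timestamps (a: stimulus onset, b: first interaction, c: last producing interaction, d: validation action). A minimal sketch of that bookkeeping, assuming a simple per-trial log record (the class and function names here are illustrative, not from the cited work):

```python
# Hypothetical sketch of the MTCI log-analysis described above:
# phase durations are simple differences between timestamps a-d.
from dataclasses import dataclass


@dataclass
class TrialLog:
    a: float  # stimulus presentation
    b: float  # first interaction with the platform (response onset)
    c: float  # last interaction producing the response
    d: float  # action validating the response / moving to next item


def phase_durations(log: TrialLog) -> dict:
    """Return the three phase durations for one trial, in seconds."""
    return {
        "exploration": log.b - log.a,    # "thinking" phase
        "production": log.c - log.b,     # response production phase
        "verification": log.d - log.c,   # "control"/validation phase
    }


trial = TrialLog(a=0.0, b=4.2, c=11.8, d=13.5)
print(phase_durations(trial))
```

The point of the sketch is that, unlike fixed time-structures (e.g., a 15 s "think time" window), these durations fall out of the participant's own interaction timing.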
“…As a result of this growing interest, there are an increasing number of cognitive neuroscience studies attempting to unveil potential brain mechanisms associated with creative ideation (see, e.g., Arden, Chavez, Grazioplene, & Jung, 2010; Fink & Benedek, 2014; Jung & Vartanian, 2018; Pidgeon et al, 2016). Though research in this field is still at an early stage, considerable progress has been achieved in unveiling relevant brain mechanisms of divergent thinking by means of event- and task-related (de)synchronization of power in the EEG alpha band (TRP; e.g., Benedek, Bergner, Könen, Fink, & Neubauer, 2011; Benedek, Schickel, Jauk, Fink, & Neubauer, 2014; Fink, Benedek, Grabner, Staudt, & Neubauer, 2007; Fink, Graif, & Neubauer, 2009; Fink, Rominger et al, 2018; Jausovec, 2000; Rominger, Papousek, Perchtold et al, 2018; for an overview see Fink & Benedek, 2014).…”
Section: Introduction (mentioning)
confidence: 99%
“…For the present investigation, a single answer (i.e., best idea) and self-paced version of the AU task was used. This approach was applied in order to more strongly focus on the originality aspect of creativity [26]. The self-paced procedure appropriately captures the spontaneous nature of the creative thinking process [7175]; however, it also implicates high demands on self-regulatory mechanisms.…”
Section: Methods (mentioning)
confidence: 99%
“…Each trial started with a white cross (10 s), followed by a picture of a common object (idea generation phase with a max. response time of 15 s; see Perchtold et al [26] for a similar procedure). After the “idea button” was pressed, the participants rated the originality of their idea on a 6-point Likert-scale (max.…”
Section: Methods (mentioning)
confidence: 99%