2008
DOI: 10.3758/cabn.8.1.41
Affective priming of emotional pictures in parafoveal vision: Left visual field advantage

Cited by 24 publications (21 citation statements)
References 57 publications
“…It is possible that asymmetries were not observed because emotion was incidental to the task, which required participants only to distinguish intact from scrambled stimuli. Many studies that have produced evidence for emotional asymmetries use explicit emotional identification or judgments [6][7][8], or involve tasks in which emotion is relevant to response [53]. We purposefully made emotion itself task-irrelevant so that we could observe the effects of emotion on control processes independent of any effects on motor execution (e.g., approach and avoidance tendencies that might have been activated by positive and negative stimuli, respectively) [7].…”
Section: Discussion
confidence: 99%
“…Our question concerns the effects of lateralisation of the emotional stimuli on response inhibition processes. Importantly, emotional information can be extracted from peripherally-presented complex scenes, even with very brief stimulus presentations [52,53]. We therefore determined whether emotion affected either behavioural or electrophysiological measures of the response inhibition.…”
Section: Introduction
confidence: 99%
“…Evidence of emotional processing of extrafoveally presented visual scenes, i.e., outside the focus of overt attention and prior to eye fixations, has been obtained with studies of electrocortical brain activity, with affective modulations of early and late ERPs (event-related potentials; De Cesarei, Codispoti, & Schupp, 2009; Keil, Moratti, Sabatinelli, Bradley, & Lang, 2005; Rigoulot et al., 2008), and also studies using recognition (Calvo & Lang, 2005; Calvo, Nummenmaa, & Hyönä, 2008) and affective priming (Calvo & Avero, 2008; Calvo & Nummenmaa, 2007) measures. Relatedly, eye-movement studies have revealed (a) greater attentional capture by emotional relative to neutral scenes, when they are presented alone (Kissler & Keil, 2008) or simultaneously (Alpers, 2008; McSorley & van Reekum, 2013; Nummenmaa, Hyönä, & Calvo, 2006) in extrafoveal vision; and (b) selective orienting to extrafoveal scene areas depicting emotional objects relative to non-emotional objects within the same scene (Humphrey, Underwood, & Lambert, 2012; Niu, Todd, & Anderson, 2012; Pilarczyk & Kuniecki, 2014).…”
Section: Introduction
confidence: 99%
“…Research investigating the links between affect and perception most commonly relies on object, face, or scene stimuli that generate strong, well-defined valences (Greenwald et al., 1998; Avero & Calvo, 2006; Calvo & Avero, 2008; Rudrauf et al., 2008; Colibazzi et al., 2010; Weierich et al., 2010). In contrast, few studies have examined how more subtle valences are perceived in common objects (e.g., lamps, clocks, or coffee cups; McManus, 1980; Giner-Sorolla et al., 1999; Rentschler et al., 1999; Duckworth et al., 2002; Bar et al., 2006; Bar & Neta, 2007).…”
Section: Introduction
confidence: 99%