2014
DOI: 10.1080/02699931.2014.933735
Only irrelevant sad but not happy faces are inhibited under high perceptual load

Abstract: Perceptual load plays a critical role in identification and awareness of stimuli. Given the differences in emotion-attention interactions, we investigated the perception of distractor emotional faces in two different load conditions under divided attention with a task based on the inattentional blindness paradigm. Participants performed a low- or high-load task with a string of letters presented against a happy, sad or neutral face (in a circular form) as the background. Participants were asked to identify the…



Cited by 45 publications (58 citation statements)
References 34 publications
“…None of the faces used in the practice trials were used in the experiment trials. The main experiment contained 384 trials, 192 for each perceptual load (equally distributed by emotion type) (see Gupta and Srinivasan, 2015; Gupta et al., 2016; Soares et al., 2015; Wiens and Syrjänen, 2013; for similar procedures). Response times (RT) and accuracy were analysed.…”
Section: Methods (mentioning)
confidence: 99%
“…As is known, emotional stimuli always receive priority attention compared to non-emotional stimuli (Batty and Taylor, 2003; Vimal, 2008; Hodsoll et al., 2011; Barratt and Bundesen, 2012; Ikeda et al., 2013; Schmidt et al., 2015; Pool et al., 2016; Glickman and Lamy, 2017), with the exception of instances of high perceptual load (Yates et al., 2010; Gupta and Srinivasan, 2015). In addition, an early attentional bias for negative emotion (such as fearful faces), whereby people quickly detect negative and threatening stimuli, has yielded relatively consistent empirical results (Hansen and Hansen, 1988; Luo et al., 2010; Pinkham et al., 2010).…”
Section: Introduction (mentioning)
confidence: 92%
“…Recent studies on the relationship between emotion and attention in young adults have revealed that both positive and negative stimuli are automatically processed and rapidly capture attention [20][21][22][23]. Accumulating evidence suggests that the attention-capturing power of positive stimuli may be even stronger than that of negative stimuli when emotional stimuli are presented as distractors while participants engage in tasks that demand attention [22,24,25]. For instance, using positive and negative faces as distractors, Gupta et al. [22] found that only positive distractors compromised participant performance in a high-load letter-search task, although both positive and negative faces distracted participants under low-load conditions.…”
Section: Introduction (mentioning)
confidence: 99%
“…For instance, using positive and negative faces as distractors, Gupta et al. [22] found that only positive distractors compromised participant performance in a high-load letter-search task, although both positive and negative faces distracted participants under low-load conditions. This is explained by the nature of positive stimuli, which readily capture attention and are difficult to ignore even in resource-demanding situations, because the attentional resources required to notice positive stimuli are smaller than those required to recognize negative stimuli [22,25]. This is supported by various studies that have explored the processing of positive stimuli using a range of methods [24,26,27].…”
Section: Introduction (mentioning)
confidence: 99%