2019
DOI: 10.1037/emo0000477

Remembering: Does the emotional content of a photograph affect boundary extension?

Abstract: Observers falsely remember seeing beyond the bounds of a photograph (i.e., boundary extension [BE]). Do observers "zoom in" when viewing negative emotion photographs, resulting in boundary restriction (Safer, Christianson, Autry, & Österlund, 1998)? Studies have yielded inconsistent outcomes, perhaps because emotional valence was compared across photographs of completely different scenes. To control physical scene structure, two contrasting (negative vs. positive) emotional versions of the same scenes were cre…

Cited by 11 publications (14 citation statements) | References: 75 publications (171 reference statements)
“…The scene-based mechanism is sensitive to global scene properties, such as concavity, navigability, openness and mean depth (Bonner & Epstein, 2017; Cheng et al., 2019; Greene & Oliva, 2009; Park & Park, 2020), and is biased to construct a broader view, perhaps via the amodal scene construction originally proposed by Intraub (2012). At the same time, the object-based mechanism exerts pressure to process objects in more detail or with higher fidelity, perhaps via object-based attention (Beighley et al., 2019), resulting in down-weighting or loss of peripheral information.…”
Section: Discussion (mentioning)
confidence: 99%
“…We instead used an external manipulation of arousal along with neutral images rather than relying on negative images to elicit arousal. This procedure meant other aspects of the image were consistent (e.g., content, context, composition), and is thus a more methodologically sound way of manipulating the arousal of an image (Porter et al., 2014; also see Beighley et al., 2018). Indeed, noise consistently elicits an arousal response similar to fear across individuals (Rhudy & Meagher, 2001).…”
Section: Discussion (mentioning)
confidence: 98%
“…Participants were correct more than half the time in recognizing no change to the images. We then calculated the mean reported camera distance for each participant on the −2 to +2 scale across all images (M = −0.09, SD = 0.18), and conducted an exploratory one-sample t test on this variable (Beighley et al., 2018; Intraub et al., 1996; Takarangi et al., 2015). We found that the mean camera distance was significantly less than zero (where zero indicates no error), t(95) = −5.00, p < .001, d = 0.51, 95% CI [0.06, 0.13].…”
Section: Analysis of Boundary Errors (mentioning)
confidence: 99%
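The excerpt above describes a standard one-sample t test against zero on per-participant mean camera-distance ratings. As a rough illustration only (a minimal sketch using simulated, hypothetical data and an assumed N = 96, not the cited study's actual data or code), such a test can be run in Python with SciPy:

```python
# Illustrative sketch (not the authors' analysis code): one-sample t test on
# per-participant mean camera-distance ratings, where ratings lie on a -2 to +2
# scale and 0 means "same camera distance" (no boundary error).
import numpy as np
from scipy import stats

# Hypothetical data: one mean rating per participant (N = 96 gives df = 95).
rng = np.random.default_rng(0)
mean_camera_distance = rng.normal(loc=-0.09, scale=0.18, size=96)

# Test whether the sample mean differs from zero (zero = no boundary error).
t_stat, p_value = stats.ttest_1samp(mean_camera_distance, popmean=0.0)

# Cohen's d for a one-sample design: |mean - 0| / SD.
cohens_d = abs(mean_camera_distance.mean()) / mean_camera_distance.std(ddof=1)

print(f"t({len(mean_camera_distance) - 1}) = {t_stat:.2f}, "
      f"p = {p_value:.3g}, d = {cohens_d:.2f}")
```

A negative t statistic here would indicate that participants, on average, rated the test images as photographed from closer than the original view, consistent with boundary extension.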