Modulation of early perceptual processing by emotional expression and the affective valence of faces was explored in an event-related potential (ERP) study. An associative procedure was used in which different neutral faces changed to happy or angry expressions or, in a control condition, stayed the same. Based on these changes in expression, participants then had to identify each neutral face as belonging to a friendly, hostile, or neutral individual. ERP measures revealed modulations of the P100 and N170 components at occipito-temporal sites by both the emotional expression and the valence of the associated neutral faces. The early posterior negativity (EPN) component, however, was sensitive only to emotional expression. These results are consistent with previous findings showing that emotional expression influences face perception from the earliest stages of visual processing, and they provide new evidence that this influence can also be transferred to neutral faces through associative learning.
Current accounts of spatial cognition and human-object interaction suggest that the representation of peripersonal space depends on an action-specific system that remaps that representation according to action requirements. Here we demonstrate that this mechanism is sensitive to knowledge about the properties of objects. In two experiments we explored the interaction between physical distance and object attributes (functionality, desirability, graspability, etc.) through a reaching-estimation task in which participants indicated whether objects were near enough to be reached. Using both a real and a digital (virtual) scenario, we demonstrate that perceived reaching distance is influenced by ease of grasp and by the affective valence of an object. Objects with a positive affective valence tend to be perceived as reachable at locations at which neutral or negative objects are perceived as non-reachable. In addition, reaction times to distant (non-reachable) positive objects suggest a bias to perceive positive objects as closer than negative and neutral objects (Experiment 2). These results highlight the importance of the affective valence of objects in the action-specific mapping of the peripersonal/extrapersonal space system.
When looking at static visual images, people often exhibit mental animation, anticipating visual events that have not yet happened. But what determines when mental animation occurs? Measuring mental animation using localized brain function (visual motion processing in the middle temporal and medial superior temporal areas, MT+), we demonstrated that animating static pictures of objects depends both on the functionally relevant spatial arrangement that objects have with one another (e.g., a bottle above a glass vs. a glass above a bottle) and on the linguistic judgment to be made about those objects (e.g., “Is the bottle above the glass?” vs. “Is the bottle bigger than the glass?”). Furthermore, we showed that mental animation is driven by functional relations and by language separately in the right hemisphere of the brain but conjointly in the left hemisphere. Mental animation is thus not a unitary construct; the predictions humans make about the visual world are generated flexibly, with hemispheric asymmetry in the routes to MT+ activation.