Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observers' internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception.

Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; that is, we show that internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than after fifty uncomfortable reaches, while the detection threshold for anger was lower after fifty uncomfortable than after fifty comfortable reaches. Action valence thus improved overall sensitivity to subtle variations of congruent facial expressions (happiness after positive, comfortable actions; anger after negative, uncomfortable actions), in the absence of significant response bias shifts. Notably, comfortable and uncomfortable reaches affected sensitivity in an approximately symmetric way relative to a baseline inaction condition. Together, these results constitute compelling evidence of a genuine top-down effect on perception: the perception of facial expressions of emotion is penetrable by action-induced mood. Affective priming by action valence is a candidate mechanism for the influence of observers' internal states on properties experienced as phenomenally objective and yet loaded with meaning.
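For readers unfamiliar with the signal detection distinction drawn above, the following minimal Python sketch illustrates how sensitivity (d') is separated from response bias (criterion c) given the same hit and false alarm counts. This is not the authors' analysis code, and the trial counts are invented for illustration only.

```python
# Minimal sketch (not the authors' code) of the standard signal detection
# computation behind the sensitivity-vs-bias distinction: d' indexes
# perceptual sensitivity, c indexes response bias.
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Return (d', c) from raw trial counts, with a log-linear correction
    to avoid infinite z-scores when a rate is exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa              # sensitivity
    criterion = -0.5 * (z_hit + z_fa)   # response bias
    return d_prime, criterion

# Illustrative (invented) counts: a higher d' with an unchanged criterion
# would mirror a mood-congruent sensitivity change without a cognitive
# set shift of the kind skeptics attribute to post-perceptual processing.
print(dprime_and_criterion(hits=40, misses=10, false_alarms=12, correct_rejections=38))
print(dprime_and_criterion(hits=46, misses=4, false_alarms=12, correct_rejections=38))
```

The key point of the computation is that d' and c are independent summaries of the same data, which is what licenses the abstract's claim that the effect is perceptual rather than decisional.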
Recent findings on emotion comparison show a typical pattern of motor reactivity arising from attentional capture. When pairs of emotional faces are presented simultaneously, the more intense emotional face is recognized faster (the Emotional Semantic Congruency, or ESC, effect). Furthermore, a global response speed advantage is observed for emotional pairs with positive rather than negative average emotion intensity (the emotional size effect), with the choice of the happiest face yielding a faster response than the choice of the angriest face within the pair (the happiness advantage). In two experiments, we asked whether these effects are orientation dependent, and thus linked to whether face processing is holistic or part-based. Participants were asked to choose the angriest/happiest face in emotional pairs displayed in either upright or inverted orientation and either including (Experiment 1) or not including (Experiment 2) a neutral face. Beyond an overall facilitation for upright relative to inverted pairs, results showed orientation-independent ESC and emotional size effects. Furthermore, the happiness advantage was present in the emotional pairs of Experiment 2 but not in those of Experiment 1, independently of face orientation. Together, these results suggest that attentional capture in emotion comparison does not depend on the type of face processing, being orientation invariant.
Ample evidence attests that social intention, elicited through gestures explicitly signaling a communicative request, affects the patterning of hand movement kinematics. The current study goes beyond the effect of social intention and asks whether the same action of reaching to grasp an object and placing it at an end target position, inside or outside a monitoring attendee's peripersonal space, can be moulded by purely social factors in general, and by social facilitation in particular. A motion tracking system (Optotrak Certus) was used to record the motor acts. We carefully avoided any communicative intention by keeping both the visual information and the positional uncertainty of the end target position constant, while systematically varying the social status of the attendee (high or low) in separate blocks. As few as thirty acts performed in the presence of an attendee revealed a significant effect of social status on the kinematic parameterization of hand movement, independently of the attendee's distance. The peak velocity reached by the hand during the reach-to-grasp and lift-to-place phases of the movement was larger in the high than in the low social status condition. By contrast, the deceleration time of the reach-to-grasp phase and the maximum grasp aperture were smaller in the high than in the low social status condition. These results indicate that the hand movement was faster but less carefully shaped in the presence of a high, but not a low, social status attendee. This kinematic patterning suggests that being monitored by a high rather than a low social status attendee might lead participants to experience evaluation apprehension that informs the control of motor execution: motor execution would rely more on feedforward motor control in the presence of a high social status attendee, and more on feedback motor control in the presence of a low social status attendee.
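To make the reported kinematic parameters concrete, here is a minimal Python sketch of how peak velocity, deceleration time, and maximum grasp aperture could be extracted from tracked marker positions. This is an assumed illustration, not the study's actual pipeline; the marker layout, the 200 Hz sampling rate, and the 5%-of-peak velocity threshold for movement end are conventional choices, not details taken from the paper.

```python
# Minimal sketch (assumed, not the study's pipeline) of extracting the
# reported kinematic parameters from motion-tracker samples:
# peak wrist velocity, deceleration time (peak velocity to movement end),
# and maximum grasp aperture (thumb-index marker distance).
import numpy as np

def kinematic_parameters(wrist, thumb, index, fs=200.0):
    """wrist/thumb/index: (n_samples, 3) arrays of marker positions in mm;
    fs: sampling rate in Hz (200 Hz is a typical Optotrak setting)."""
    dt = 1.0 / fs
    # Tangential wrist velocity profile (mm/s) via numerical differentiation.
    velocity = np.linalg.norm(np.gradient(wrist, dt, axis=0), axis=1)
    i_peak = int(np.argmax(velocity))
    peak_velocity = velocity[i_peak]
    # Movement end: first sample after the peak where velocity falls below
    # a conventional threshold (here 5% of peak velocity).
    below = np.nonzero(velocity[i_peak:] < 0.05 * peak_velocity)[0]
    i_end = i_peak + (below[0] if below.size else len(velocity) - 1 - i_peak)
    deceleration_time = (i_end - i_peak) * dt                         # s
    max_grasp_aperture = np.linalg.norm(thumb - index, axis=1).max()  # mm
    return peak_velocity, deceleration_time, max_grasp_aperture
```

On this kind of analysis, a higher peak velocity with a shorter deceleration time and smaller maximum grasp aperture is the pattern the abstract summarizes as "faster but less carefully shaped" movement.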