We aimed to verify the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. To this end, in Experiment 1 we explored the effect of tonic contraction of muscles in the upper or lower half of participants' faces on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, and both manipulations affected recognition of fear; recognition of surprise and sadness was not affected by either blocking manipulation. In Experiment 2, we tested whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that a neutral lower half-face interfered with recognition of happiness and disgust, whereas a neutral upper half-face impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was affected by neither. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in understanding others' emotional facial expressions.
Background: We tested whether the tendency to worry affects psychological responses to quarantine, capitalizing on data collected before the COVID-19 outbreak on measures of worry, anxiety, and trait mindfulness in a group of university students. Methods: Twenty-five participants completed self-report measures assessing worry (Penn State Worry Questionnaire, PSWQ), anxiety (Anxiety Sensitivity Index, ASI-3), and trait mindfulness (Mindful Attention Awareness Scale, MAAS) at T0 (pre-lockdown, 4 November 2019–17 February 2020) and T1 (end of lockdown, 26–30 April 2020). We compared assessments at the two time points in the whole sample and in high and low worriers (defined at T0 by PSWQ scores respectively above and below 1.5 SD from the mean of the Italian normative sample). Outcomes: At T1, high worriers showed a significant increase in anxiety sensitivity and fear of mental health problems compared with low worriers. Moreover, in the whole sample, trait mindfulness at T1 was inversely related to worry and to fear of mental health problems. Interpretation: Mindfulness-based interventions that improve the ability to focus attention and awareness on the present moment could be a valuable approach to supporting individuals experiencing anxiety related to the COVID-19 outbreak.
Embodied cognition theories hold that cognitive processes are grounded in bodily states. Embodied processes in autism spectrum disorder (ASD) have classically been investigated in studies on imitation. Several observations suggest that, unlike typical individuals, who can copy a model's actions from the model's position, individuals with ASD tend to reenact the model's actions from their own egocentric perspective. Here, we performed two behavioral experiments to directly test the ability of individuals with ASD to adopt another person's point of view. In Experiment 1, participants had to explicitly judge the left/right location of a target object in a scene from their own or an actor's point of view (visual perspective taking task). In Experiment 2, participants had to perform left/right judgments on front-facing or back-facing human body images (own body transformation task). Both tasks can be solved by mentally simulating one's own body motion to imagine oneself in the position of the other person (embodied simulation strategy), or by resorting to visual/spatial processes such as mental object rotation (nonembodied strategy). Results of both experiments showed that individuals with ASD solved the tasks mainly by relying on a nonembodied strategy, whereas typical controls adopted an embodied strategy. Moreover, in the visual perspective taking task, participants with ASD had more difficulty than controls in inhibiting the other's perspective when instructed to keep their own point of view. These findings suggest that, in social cognitive tasks, individuals with ASD do not resort to embodied simulation and have difficulty exerting cognitive control over self- and other-perspectives.
In the present paper, we investigated whether observation of bodily cues, namely hand action and eye gaze, can modulate an onlooker's visual perspective taking. Participants were presented with scenes of an actor gazing at an object (or straight ahead) and grasping an object (or not) in a 2 × 2 factorial design, plus a control condition with no actor in the scene. In Experiment 1, two groups of subjects were explicitly required to judge the left/right location of the target from their own (egocentric group) or the actor's (allocentric group) point of view, whereas in Experiment 2 participants received no instruction on the point of view to assume. In both experiments, allocentric coding (i.e., adoption of the actor's point of view) was triggered when the actor grasped the target, but not when he gazed towards it or adopted a neutral posture. In Experiment 3, we demonstrated that the actor's gaze, but not his action, affected participants' attention orienting. The different effects of others' grasping and eye gaze on observers' behaviour demonstrate that specific bodily cues convey distinctive information about other people's intentions.