Social contact often depends initially on ascertaining the direction of another person's gaze. We used functional neuroimaging to determine the brain areas involved in gaze monitoring. Discriminating the direction of gaze significantly activated a region in the left amygdala to the same extent during both eye-contact and no-eye-contact tasks. A region in the right amygdala, however, was activated specifically during the eye-contact task. These results confirm that the left amygdala plays a general role in interpreting gaze direction, and that activity in the right amygdala increases when another individual's gaze is directed at the observer. This suggests that the human amygdala plays a role in reading social signals from the face.
By measuring regional cerebral blood flow with positron emission tomography (PET), we delineated the roles of the occipito-temporal regions activated by faces and scenes. Right-handed normal subjects performed three tasks using facial images as visual stimuli: in the face familiar/unfamiliar discrimination (FF) task, they discriminated the faces of friends and associates from unfamiliar faces; in the face direction discrimination (FD) task, they discriminated the direction in which each unfamiliar face was oriented; and in the dot location discrimination (DL) task, they discriminated the location of a red dot superimposed on a scrambled face. Activity in each task was compared with that in a control fixation (CF) task, in which subjects fixated on the centre of a display without visual stimuli. The DL task activated the occipital cortices and posterior fusiform gyri bilaterally. During the FD task, activation extended anteriorly in the right fusiform gyrus and laterally into the right inferior temporal cortex. The FF task further activated the right temporal pole. To examine whether the activation elicited by faces was face-specific, we also used a scene familiar/unfamiliar discrimination (SF) task, in which subjects discriminated familiar scenes from unfamiliar ones. Our results suggest that (i) the occipital cortices and posterior fusiform gyri respond non-selectively to faces, scrambled faces, and scenes, and are involved mainly in extracting the physical features of complex visual images; (ii) the right inferior temporal/fusiform gyrus responds selectively to faces but not to non-face stimuli and is involved in visual processing related to face perception, whereas the bilateral parahippocampal gyri and parieto-occipital junctions respond selectively to scenes and are involved in processing related to scene perception; and (iii) the right temporal pole is activated during the discrimination of familiar faces and scenes from unfamiliar ones and is probably involved in the recognition of familiar objects.
We measured regional cerebral blood flow (rCBF) using positron emission tomography (PET) to determine which brain regions are involved in the assessment of facial emotion. We asked right-handed normal subjects to assess the emotional state of signalers from their facial gestures, to assess facial attractiveness, and to discriminate the background color of the facial stimuli, and we compared the activity produced by each condition. The right inferior frontal cortex showed significant activation during the assessment of facial emotion relative to the other two tasks; the activated area was located within the triangular part of the inferior frontal cortex of the right hemisphere. These results, together with those of previous imaging and clinical studies, suggest that the right inferior frontal cortex processes emotional communicative signals, whether visual or auditory, and that the inferior frontal cortex shows a hemispheric asymmetry in the processing of such signals.
Regional cerebral blood flow was measured in six healthy volunteers by positron emission tomography during the identification of speaker and of emotion from spoken words. The speaker identification task activated several audio-visual multimodal areas, particularly the temporal poles of both hemispheres, which may be involved in linking vocal attributes with visual representations of speakers. The emotion identification task activated regions in the cerebellum and the frontal lobe, suggesting that these regions are functionally related in the processing of emotion. The results suggest that distinct anatomical structures contribute to the identification of speaker and of emotion from the voice.