The visual recognition of complex movements and actions is crucial for the survival of many species. It is important not only for communication and recognition at a distance, but also for learning complex motor actions by imitation. Movement recognition has been studied in psychophysical, neurophysiological and imaging experiments, and several cortical areas involved in it have been identified. We use a neurophysiologically plausible and quantitative model as a tool for organizing and making sense of the experimental data, despite their growing size and complexity. We review the main experimental findings, discuss possible neural mechanisms, and show that a learning-based, feedforward model provides a consistent summary of many key experimental results.
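The abstract does not spell out the model's internal structure; as a loose illustration of what a learning-based, feedforward recognition scheme can look like, the Python sketch below matches each frame of a movement against stored pose templates (Gaussian radial basis functions) and pools the best matches over time. The function names, template scheme, and toy data are illustrative assumptions, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_response(frame, templates, sigma=1.0):
    """Gaussian radial-basis responses of learned pose templates to one frame."""
    d2 = ((templates - frame) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * sigma ** 2))

def recognize(sequence, action_templates, sigma=1.0):
    """Score each candidate action by pooling template responses:
    max over templates per frame, then summed over frames."""
    scores = {
        action: sum(rbf_response(f, templates, sigma).max() for f in sequence)
        for action, templates in action_templates.items()
    }
    return max(scores, key=scores.get)

# Toy data: 10-dimensional "pose" vectors; each action is learned
# as a set of example snapshots.
walk = np.array([rng.normal(loc=i, scale=0.1, size=10) for i in range(5)])
wave = np.array([rng.normal(loc=-i, scale=0.1, size=10) for i in range(5)])
templates = {"walk": walk, "wave": wave}

clip = walk + rng.normal(scale=0.05, size=walk.shape)  # noisy walking clip
print(recognize(clip, templates))                      # -> walk
```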
The rich and immediate perception of a familiar face, including its identity, expression and even intent, is one of the most impressive shared faculties of human and non-human primate brains. Many visually responsive neurons in the inferotemporal cortex of macaque monkeys respond selectively to faces, sometimes to only one or a few individuals, while showing little sensitivity to scale and other details of the retinal image. Here we show that face-responsive neurons in the macaque monkey anterior inferotemporal cortex are tuned to a fundamental dimension of face perception. Using a norm-based caricaturization framework previously developed for human psychophysics, we varied the identity information present in photo-realistic human faces, and found that neurons of the anterior inferotemporal cortex were most often tuned around the average, identity-ambiguous face. These observations are consistent with face-selective responses in this area being shaped by a figural comparison, reflecting structural differences between an incoming face and an internal reference or norm. As such, these findings link the tuning of neurons in the inferotemporal cortex to psychological models of face identity perception.
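As a rough illustration of the norm-based coding idea (not the study's stimuli or recorded tuning curves), the sketch below morphs a face along the axis from an average (norm) face to an individual face, face(c) = norm + c * (target - norm), and models a neuron whose response grows monotonically with the identity level c, that is, with distance from the norm. The feature vectors and the ramp-shaped response function are assumptions made for illustration.

```python
import numpy as np

# c = 0 gives the identity-ambiguous average face, c = 1 the veridical
# face, and c > 1 a caricature along the same identity axis.
norm = np.zeros(4)                         # hypothetical average-face features
target = np.array([0.8, -0.3, 0.5, 0.1])   # hypothetical individual face

def morph(c):
    """Face at identity level c along the norm-to-target axis."""
    return norm + c * (target - norm)

def ramp_response(face, axis, gain=1.0):
    """Monotonic (ramp) response: projection of the face onto an identity axis."""
    return gain * face @ (axis / np.linalg.norm(axis))

axis = target - norm
for c in (0.0, 0.5, 1.0, 1.5):             # norm ... caricature
    print(c, round(ramp_response(morph(c), axis), 3))  # response rises with c
```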
Human observers readily recognize emotions expressed in body movement. Their perceptual judgments are based on simple movement features, such as overall speed, but also on more intricate posture and dynamic cues. The systematic analysis of such features is complicated by the large number of potentially relevant kinematic and dynamic parameters. To identify emotion-specific features, we motion-captured the neutral and emotionally expressive (anger, happiness, sadness, fear) gaits of 25 individuals. Body posture was characterized by average flexion angles, and a low-dimensional parameterization of the spatio-temporal structure of joint trajectories was obtained by approximation with a nonlinear mixture model. Applying sparse regression, we extracted critical emotion-specific posture and movement features, which typically depended on only a small number of joints. The features extracted from the motor behavior closely resembled the features critical for the perception of emotion from gait, as determined by a statistical analysis of the classification and rating judgments of 21 observers who viewed avatars animated with the recorded movements. The perceptual relevance of these features was further supported by a second experiment showing that artificial walkers containing only the critical features induced high-level after-effects matching those induced by adaptation to natural emotional walkers.
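To make the sparse-regression step concrete, here is a minimal sketch assuming each gait is summarized by a vector of per-joint descriptors (for example, average flexion angles) and an L1-penalized (Lasso) fit selects the few descriptors predictive of an emotion rating. The data are synthetic and Lasso is one possible instance of sparse regression; in the study, the predictors came from motion capture and the nonlinear mixture model described above.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_walkers, n_features = 25, 12             # e.g. one descriptor per joint
X = rng.normal(size=(n_walkers, n_features))

# Synthetic "sadness" ratings driven by only two joints (features 1 and 7):
# the L1 penalty should recover this small set and zero out the rest.
y = -1.5 * X[:, 1] + 0.8 * X[:, 7] + rng.normal(scale=0.1, size=n_walkers)

model = Lasso(alpha=0.05).fit(X, y)
critical = np.flatnonzero(model.coef_)      # indices of the selected features
print(critical, np.round(model.coef_[critical], 2))
```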