Current knowledge regarding the processing of observed manipulative actions (OMAs) (e.g., grasping, dragging, or dropping) is largely limited to grasping, and the underlying neural circuitry remains controversial. Here, we addressed these issues by combining chronic neuronal recordings along the anteroposterior extent of monkeys’ anterior intraparietal (AIP) area with tracer injections into the recorded sites. We found robust neural selectivity for 7 distinct OMAs, particularly in the posterior part of AIP (pAIP), where it was associated with motor coding of grip type and own-hand visual feedback. This cluster of functional properties appears to be specifically grounded in stronger direct connections of pAIP with the temporal regions of the ventral visual stream and the prefrontal cortex, as connections with skeletomotor-related areas and regions of the dorsal visual stream exhibited opposite or no rostrocaudal gradients. Temporal and prefrontal areas may provide visual and contextual information relevant for manipulative action processing. These results revise existing models of the action observation network, suggesting that pAIP constitutes a parietal hub for routing information about OMA identity to the other nodes of the network.
With the novel probes, it is possible to record stable, biologically relevant data over a time span exceeding the time usually needed for epileptic focus localisation in human patients. This is the first time that single units have been recorded along cylindrical polyimide probes chronically implanted 22 mm deep in the brain of a monkey, which suggests the potential usefulness of this probe for human applications.
Grasping relies on a network of parieto-frontal areas lying on the dorsolateral and dorsomedial parts of the hemispheres. However, the initiation and sequencing of voluntary actions also require the contribution of mesial premotor regions, particularly the pre-supplementary motor area F6. We recorded 233 F6 neurons from 2 monkeys with chronic linear multishank neural probes during reaching–grasping visuomotor tasks. We showed that F6 neurons play a role in the control of forelimb movements and that some of them (26%) exhibit visual and/or motor specificity for the target object. Interestingly, area F6 neurons form 2 functionally distinct populations, showing either visually triggered or movement-related bursts of activity, in contrast to the sustained visual-to-motor activity displayed by ventral premotor area F5 neurons recorded in the same animals with the same task in previous studies. These findings suggest that F6 plays a role in object grasping and extend existing models of the cortical grasping network.
Significance

Social animals exploit information about objects for planning actions and for predicting those of others. Here, we show that pre-supplementary motor area F6 hosts different types of neurons responding to visually presented objects when they are targeted by the monkey’s own action (self-type), another agent’s action (other-type), or both (self- and other-type). These findings suggest the existence in area F6 of an “object-mirroring” mechanism, which allows observers to predict others’ impending action by recruiting the same motor representation they would use for planning their own action in the same context, before the activation of classical “action-mirroring” mechanisms.
Others' observed actions cause continuously changing retinal images, making it challenging to build neural representations of action identity. The monkey anterior intraparietal area (AIP) and its putative human homologue (phAIP) host neurons selective for observed manipulative actions (OMAs). The neuronal activity of both AIP and phAIP allows a stable readout of OMA identity across visual formats, but human neurons exhibit greater invariance and generalize from observed actions to action verbs. These properties stem from the convergence in AIP of superior temporal signals concerning: (i) observed body movements; and (ii) the changes in the body-object relationship. We propose that evolutionarily preserved mechanisms underlie the specification of observed-actions identity and the selection of motor responses afforded by them, thereby promoting social behavior.

Combining Observed Body Movements and Objects Changes: The Action's Identity

Manual skills are a hallmark of primates, particularly humans. They have made possible most of our transformational impact on the world, which was driven by an evolutionarily preserved but expanding network of cortical areas in the primate lineage that subserves the neural control of manipulative actions [1][2][3][4]. Interestingly, an equally well-articulated neural machinery is required to resolve the visual complexity of observed manipulative actions (OMAs) (see Glossary) performed by other individuals, because this ability is of critical importance for action planning during social interaction and interindividual coordination [5][6][7]. Indeed, as compared with other complex static visual stimuli, such as objects [8], faces [9,10], others' gaze direction [11], and body posture [12], observed actions of others are inherently dynamic stimuli, and their dynamics are essential for an observer's brain to compute their identity, despite the rapid changes in their retinal image. This is probably the reason why James Gibson claimed that 'animals are by far the most complex objects of perception that the environment presents to an observer' [13].
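To make the "stable readout of OMA identity across visual formats" claim concrete, the sketch below illustrates the general logic of a cross-format decoding analysis: a linear classifier is trained on population activity recorded in one visual format and tested on another, so above-chance test accuracy indicates a format-invariant action code. This is a minimal illustration with synthetic data, not the authors' analysis pipeline; the neuron counts, noise levels, and format-specific gain/offset are assumptions made for the demo.

```
# Minimal sketch of cross-format decoding of observed-action identity.
# All data are synthetic; shared action tuning across formats is assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_neurons, n_trials, n_actions = 100, 40, 7  # 7 OMAs, as in the study

# Shared action tuning; each format adds its own gain, offset, and noise.
tuning = rng.normal(0.0, 1.0, size=(n_actions, n_neurons))

def simulate_format(gain, offset):
    X, y = [], []
    for a in range(n_actions):
        rates = gain * tuning[a] + offset \
            + rng.normal(0.0, 1.5, size=(n_trials, n_neurons))
        X.append(rates)
        y.append(np.full(n_trials, a))
    return np.vstack(X), np.concatenate(y)

X_train, y_train = simulate_format(gain=1.0, offset=0.0)  # format A
X_test, y_test = simulate_format(gain=0.7, offset=0.5)    # format B

# Train on format A, test on format B: generalization above chance (1/7)
# would indicate a format-invariant readout of action identity.
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
decoder.fit(X_train, y_train)
print(f"cross-format decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```

In real analyses of this kind, the same train-on-one-format, test-on-the-other logic is typically applied to trial-averaged or single-trial firing rates from the recorded population; the linear decoder here stands in for whatever classifier the original work used.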