Although we are confronted with an ever-changing environment, we do not have the capacity to analyze all incoming sensory information. Perception is selective and is guided both by salient events occurring in our visual field and by cognitive premises about what needs our attention. Although the lateral intraparietal area (LIP) and frontal eye field (FEF) are known to represent the position of visual attention, their respective contributions to its control are still unclear. Here, we report LIP and FEF neuronal activities recorded while monkeys performed a voluntary attention-orientation target-detection task. We show that both encode behaviorally significant events, but that the FEF plays a specific role in mapping abstract cue instructions onto a spatial priority map to voluntarily guide attention. On the basis of a latency analysis, we show that the coding of stimulus identity and position precedes the emergence of an explicit attentional signal within the FEF. We also describe dynamic temporal hierarchies between LIP and FEF: stimuli carrying the highest intrinsic saliency are signaled by LIP before FEF, whereas stimuli carrying the highest extrinsic saliency are signaled in FEF before LIP. This suggests that whereas the parietofrontal attentional network most probably processes visual information in a recurrent way, exogenous processing predominates in the parietal cortex and the endogenous control of attention takes place in the FEF.
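The latency comparison mentioned in this abstract could, in principle, be carried out as a simple threshold-crossing analysis on trial-averaged firing rates. The Python sketch below is only a minimal illustration of that idea, assuming peristimulus time histograms (PSTHs) aligned on stimulus onset; the threshold rule, parameter values, and function name are illustrative assumptions, not the authors' exact method.

import numpy as np

def response_latency(psth, time, baseline_end=0.0, n_sd=3.0, min_bins=5):
    # psth : trial-averaged firing rate per time bin (1-D NumPy array)
    # time : bin centers in seconds, with stimulus onset at time 0
    # Latency is the first time bin at which the rate stays above
    # baseline mean + n_sd * baseline SD for at least min_bins consecutive bins.
    baseline = psth[time < baseline_end]
    thresh = baseline.mean() + n_sd * baseline.std()
    run = 0
    for i in range(len(psth)):
        if time[i] < baseline_end:
            continue  # skip the baseline period itself
        run = run + 1 if psth[i] > thresh else 0
        if run >= min_bins:
            return time[i - min_bins + 1]
    return None  # no sustained response detected

Applied separately to LIP and FEF responses to the same stimulus class, such an estimate would yield the kind of area-by-area onset ordering the abstract describes.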
Throughout the history of modern neuroscience, the parietal cortex has been associated with a wide array of sensory, motor, and cognitive functions. The use of non-human primates as model organisms has been instrumental in our current understanding of how areas in the posterior parietal cortex (PPC) modulate our perception and influence our behavior. In this Perspective, we highlight a series of influential studies from the last five decades examining the role of the PPC in visual perception and motor planning. We also integrate long-standing views of PPC function with more recent evidence to propose a more general framework for explaining the integrative sensory, motor, and cognitive functions of the PPC.
The primate visual system consists of multiple hierarchically organized cortical areas, each specialized for processing distinct aspects of the visual scene. For example, color and form are encoded in ventral pathway areas such as V4 and inferior temporal cortex, while motion is preferentially processed in dorsal pathway areas such as the middle temporal area. Such representations often need to be integrated perceptually to solve tasks that depend on multiple features. We tested the hypothesis that the lateral intraparietal area (LIP) integrates disparate task-relevant visual features by recording from LIP neurons in monkeys trained to identify target stimuli composed of conjunctions of color and motion features. We show that LIP neurons exhibit integrative representations of both color and motion features when they are task relevant, as well as task-dependent shifts in both direction and color tuning. This suggests that LIP plays a role in flexibly integrating task-relevant sensory signals.
Lateral intraparietal (LIP) neurons encode a vast array of sensory and cognitive variables. Recently, we proposed that the flexibility of feature representations in LIP reflects the bottom-up integration of sensory signals, modulated by feature-based attention (FBA), from upstream feature-selective cortical neurons. Moreover, LIP activity is strongly modulated by the position of space-based attention (SBA). However, the mechanisms by which SBA and FBA interact to facilitate the representation of task-relevant spatial and non-spatial features in LIP remain unclear. We recorded from LIP neurons while monkeys performed a task that required them to detect specific conjunctions of color, motion direction, and stimulus position. Here we show that FBA and SBA potentiate each other's effects in a manner consistent with attention gating the flow of visual information along the cortical visual pathway. Our results suggest that linear bottom-up integrative mechanisms allow LIP neurons to emphasize task-relevant spatial and non-spatial features.
Despite ever-growing knowledge of how parietal and prefrontal neurons encode low-level spatial and color information or higher-level information such as spatial attention, an understanding of how these cortical regions process neuronal information at the population level is still missing. A simple assumption would be that the function and temporal response profiles of these neuronal populations match those of their constituent cells. However, several recent studies suggest that this is not necessarily the case and that the single-cell approach overlooks dynamic changes in how information is distributed across the neuronal population. Here, we use a time-resolved population pattern analysis to explore how spatial position, spatial attention, and color information are differentially encoded and maintained in the macaque monkey prefrontal (frontal eye fields) and parietal (lateral intraparietal area) cortex. Overall, our work makes three novel observations. First, we show that parietal and prefrontal populations operate in two distinct regimes for the encoding of sensory and cognitive information: a stationary mode and a dynamic mode. Second, we show that the temporal dynamics of a heterogeneous neuronal population carry information complementary to that of its functional subpopulations; both therefore need to be investigated in parallel. Last, we show that identifying the neuronal configuration in which a population encodes given information can serve to reveal that same information in a different context. Altogether, this work challenges common views on neural coding in the parietofrontal network.
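A time-resolved population pattern analysis of this kind is commonly implemented as cross-temporal decoding: a classifier trained on the population activity pattern at one time bin is tested at every other time bin, so a sharp diagonal with poor off-diagonal generalization indicates a dynamic code, while a broad block of high accuracy indicates a stationary one. The Python sketch below illustrates this logic under assumed inputs (binned firing rates and per-trial condition labels); the data layout, classifier choice, and function name are assumptions for illustration, not the authors' actual pipeline.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def cross_temporal_decoding(rates, labels, n_splits=5):
    # rates  : (n_trials, n_neurons, n_timebins) binned firing rates (NumPy array)
    # labels : (n_trials,) condition labels, e.g. attended position or cue color
    # Returns an (n_timebins, n_timebins) matrix of cross-validated accuracies:
    # entry [t_train, t_test] is the accuracy of a decoder trained at t_train
    # and tested at t_test.
    n_trials, _, n_bins = rates.shape
    acc = np.zeros((n_bins, n_bins))
    cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    for train_idx, test_idx in cv.split(rates[:, :, 0], labels):
        for t_train in range(n_bins):
            clf = make_pipeline(StandardScaler(), LinearSVC())
            clf.fit(rates[train_idx, :, t_train], labels[train_idx])
            for t_test in range(n_bins):
                acc[t_train, t_test] += clf.score(
                    rates[test_idx, :, t_test], labels[test_idx])
    return acc / n_splits

In such a scheme, the generalization pattern of the accuracy matrix, rather than any single-cell response profile, is what distinguishes the stationary and dynamic population regimes described in the abstract.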