The binocular disparity between the views of the world registered by the left and right eyes provides a powerful signal about the depth structure of the environment. Despite increasing knowledge from animal models about the cortical areas that process disparity, comparatively little is known about the local architecture of stereoscopic processing in the human brain. Here, we take advantage of the high spatial specificity and image contrast offered by 7 tesla fMRI to test for systematic organization of disparity representations in the human brain. Participants viewed random dot stereogram stimuli depicting different depth positions while we recorded fMRI responses from dorsomedial visual cortex. We repeated measurements across three separate imaging sessions. Using a series of computational modeling approaches, we report three main advances in understanding disparity organization in the human brain. First, we show that disparity preferences are clustered and that this organization persists across imaging sessions, particularly in area V3A. Second, we observe differences between the local distribution of voxel responses in early and dorsomedial visual areas, suggesting different cortical organization. Third, using modeling of voxel responses, we show that higher dorsal areas (V3A, V3B/KO) have properties that are characteristic of human depth judgments: a simple model that uses tuning parameters estimated from fMRI data captures known variations in human psychophysical performance. Together, these findings indicate that human dorsal visual cortex contains selective cortical structures for disparity that may support the neural computations that underlie depth perception.
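To make the voxel-response modeling concrete, the sketch below fits a Gaussian disparity tuning curve to a single voxel's responses and reads off a preferred disparity and tuning width. This is a minimal illustration under the assumption of Gaussian tuning; the disparity values, response values, and function names are hypothetical and are not the study's data or analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_tuning(disparity, amplitude, preferred, width, baseline):
    """Gaussian tuning curve: the response peaks at the preferred disparity."""
    return baseline + amplitude * np.exp(-0.5 * ((disparity - preferred) / width) ** 2)

# Hypothetical stimulus disparities (degrees) and the mean fMRI response of one
# voxel to each depth position (arbitrary units) -- illustrative values only.
disparities = np.array([-0.3, -0.15, -0.05, 0.0, 0.05, 0.15, 0.3])
voxel_response = np.array([0.2, 0.5, 1.1, 1.3, 1.0, 0.4, 0.1])

# Estimate the tuning parameters; p0 gives rough initial guesses for the fit.
params, _ = curve_fit(gaussian_tuning, disparities, voxel_response,
                      p0=[1.0, 0.0, 0.1, 0.1])
amplitude, preferred, width, baseline = params
print(f"preferred disparity ~ {preferred:.3f} deg, tuning width ~ {width:.3f} deg")
```

Repeating such a fit over voxels would yield maps of preferred disparity, the kind of quantity whose clustering and session-to-session stability the abstract describes.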
Purpose: To study the correspondence of anatomically and functionally defined visual areas (primary visual cortex, V1, and motion-selective area V5/human MT+) using structural magnetic resonance imaging (MRI) and functional MRI (fMRI) in vivo at 7 T. Materials and Methods: Four subjects participated in this study. High-resolution (~0.4 mm isotropic) anatomical MRI was used to identify cortical regions based on their distinct cortical lamination. The optimal contrast for identifying heavily myelinated layers within gray matter was quantitatively assessed by comparing T1-weighted magnetization-prepared rapid gradient echo (MPRAGE) and T2*-weighted 3D fast low-angle shot (FLASH) imaging. Retinotopic mapping was performed using gradient-echo fMRI at 1.5 mm isotropic resolution to identify functional areas. Results: T2*-weighted FLASH imaging was found to provide a significantly higher contrast-to-noise ratio, allowing visualization of the stria of Gennari in every slice of a volume covering the occipital cortex in each of the four subjects in this study. The independently derived boundary of V1, identified in the same subjects using retinotopic fMRI mapping, closely matched the border of anatomically defined striate cortex in the human brain. Evidence of banding was also found within the functionally defined V5 area; however, neither this area nor the functionally identified subregion (MT) correlated well with the banded region. Conclusion: High-resolution T2*-weighted images acquired at 7 T can be used to identify myelinated bands within cortical gray matter in reasonable measurement times. Regions where a myelinated band was identified show a high degree of overlap with the functionally defined V1 area.
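As an illustration of the contrast comparison described above, the sketch below computes a simple contrast-to-noise ratio (CNR) between the heavily myelinated stria of Gennari and surrounding gray matter, the quantity on which MPRAGE and FLASH were compared. All voxel intensities, the noise estimate, and the helper function are hypothetical and show only the arithmetic, not the paper's actual analysis pipeline.

```python
import numpy as np

def contrast_to_noise(signal_a, signal_b, noise):
    """CNR between two tissue compartments, given samples from a noise-only region."""
    return abs(np.mean(signal_a) - np.mean(signal_b)) / np.std(noise)

# Hypothetical voxel intensities sampled from a T2*-weighted FLASH volume:
stria_voxels = np.array([310.0, 295.0, 305.0, 320.0])   # heavily myelinated band
gray_voxels = np.array([420.0, 410.0, 430.0, 415.0])    # surrounding gray matter
background = np.random.default_rng(0).normal(0.0, 12.0, 500)  # noise-only voxels

print(f"CNR ~ {contrast_to_noise(stria_voxels, gray_voxels, background):.1f}")
```

Computing the same ratio from matched regions in T1-weighted and T2*-weighted volumes is one way the two contrasts could be compared quantitatively, consistent with the comparison the abstract reports.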