Deep networks provide a potentially rich interconnection between neuroscientific and artificial approaches to understanding visual intelligence, but the relationship between artificial and neural representations of complex visual form has not been elucidated at the level of single-unit selectivity. Taking the approach of an electrophysiologist to characterizing single CNN units, we found that many units exhibit translation-invariant boundary curvature selectivity approaching that of exemplar neurons in the primate mid-level visual area V4. For some V4-like units, particularly in middle layers, the natural images that drove them best were qualitatively consistent with selectivity for object boundaries. Our results identify a novel image-computable model for V4 boundary curvature selectivity and suggest that such a representation may begin to emerge within an artificial network trained for image categorization, even though boundary information was not provided during training. This raises the possibility that single-unit selectivity in CNNs will become a guide for understanding sensory cortex.
Summary: The current study demonstrates, for the first time, a specific enhancement of auditory spatial cue discrimination due to eye gaze. Whereas the region of sharpest visual acuity, called the fovea, can be directed at will by moving one's eyes, auditory spatial information is primarily derived from head-related acoustic cues. Past auditory studies have found better discrimination in front of the head [1–3], but have not manipulated subjects' gaze, thus overlooking potential oculomotor influences. Electrophysiological studies have shown that the inferior colliculus (IC), a critical auditory midbrain nucleus, shows visual and oculomotor responses [4–6] and modulations of auditory activity [7–9], and auditory neurons in the superior colliculus (SC) show shifting receptive fields [10–13]. How the auditory system leverages this crossmodal information at the behavioral level remains unknown. Here, we directed subjects' gaze (with an eccentric dot) or auditory attention (with lateralized noise) while they performed an auditory spatial cue discrimination task. We found that directing gaze toward a sound significantly enhances discrimination of both interaural level and time differences, whereas directing auditory spatial attention does not. These results show that oculomotor information can enhance auditory spatial resolution even when the head remains stationary, revealing a distinct behavioral benefit possibly arising from auditory-oculomotor interactions at an earlier level of processing than previously demonstrated.
The correlation coefficient squared, $r^2$, is commonly used to validate quantitative models on neural data, yet it is biased by trial-to-trial variability: as trial-to-trial variability increases, measured correlation to a model's predictions decreases. As a result, models that perfectly explain neural tuning can appear to perform poorly. Many solutions to this problem have been proposed, but no consensus has been reached on which is the least biased estimator. Some currently used methods substantially overestimate model fit, and the utility of even the best performing methods is limited by the lack of confidence intervals and asymptotic analysis. We provide a new estimator, $\hat{r}^2_{\text{ER}}$, that outperforms all prior estimators in our testing, and we provide confidence intervals and asymptotic guarantees. We apply our estimator to a variety of neural data to validate its utility. We find that neural noise is often so great that confidence intervals of the estimator cover the entire possible range of values ($[0, 1]$), preventing meaningful evaluation of the quality of a model's predictions. This leads us to propose the use of the signal-to-noise ratio (SNR) as a quality metric for making quantitative comparisons across neural recordings. Analyzing a variety of neural data sets, we find that up to $\sim 40\%$ of some state-of-the-art neural recordings do not pass even a liberal SNR criterion. Moving toward more reliable estimates of correlation, and quantitatively comparing quality across recording modalities and data sets, will be critical to accelerating progress in modeling biological phenomena.
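The downward bias that motivates the estimator can be seen in a minimal simulation (an illustration of the problem the abstract describes, not the paper's $\hat{r}^2_{\text{ER}}$ method; the tuning curve, noise levels, and trial counts below are arbitrary choices for demonstration). Even when the model's predictions equal the true tuning exactly, the measured $r^2$ against trial-averaged responses falls as trial-to-trial variability grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def measured_r2(noise_sd, n_stim=50, n_trials=4):
    """r^2 between a 'perfect' model (the true tuning curve itself) and
    trial-averaged responses corrupted by Gaussian trial-to-trial noise."""
    tuning = np.sin(np.linspace(0, np.pi, n_stim))            # true tuning curve
    trials = tuning + rng.normal(0, noise_sd, (n_trials, n_stim))
    mean_resp = trials.mean(axis=0)                           # trial average
    r = np.corrcoef(tuning, mean_resp)[0, 1]
    return r ** 2

# Average over repeated simulated experiments at low vs. high noise.
r2_low_noise  = np.mean([measured_r2(0.05) for _ in range(200)])
r2_high_noise = np.mean([measured_r2(1.0) for _ in range(200)])

# The model is exactly correct in both cases, yet measured r^2 is
# substantially lower when trial-to-trial variability is high.
print(r2_low_noise > r2_high_noise)  # True
```

The same effect is why an unbiased estimator, together with an SNR-based screening criterion for recordings, is needed before $r^2$-style comparisons across data sets become meaningful.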