Jumping spiders (Salticidae) are famous for their visually driven behaviors [1]. Here, however, we present behavioral and neurophysiological evidence that these animals also perceive and respond to airborne acoustic stimuli, even when the distance between the animal and the sound source is relatively large (~3 m) and the stimulus amplitude at the position of the spider is ~65 dB SPL. Behavioral experiments with the jumping spider Phidippus audax reveal that these animals respond to low-frequency sounds (80 Hz; 65 dB SPL) by freezing, a common anti-predatory behavior characteristic of an acoustic startle response. Neurophysiological recordings from auditory-sensitive neural units in the brains of these jumping spiders showed responses to low-frequency tones (80 Hz at ~65 dB SPL); these are the first reported recordings of acoustically responsive neural units in the jumping spider brain. Responses persisted even when the distance between spider and stimulus source exceeded 3 m and under anechoic conditions. Thus, these spiders appear able to detect airborne sound at distances in the acoustic far field, beyond the near-field range often thought to bound acoustic perception in arthropods that lack tympanic ears (e.g., spiders) [2]. Further, direct mechanical stimulation of hairs on the patella of the foreleg was sufficient to generate responses in neural units that also responded to airborne acoustic stimuli, evidence that these hairs likely play a role in the detection of acoustic cues. We suggest that these auditory responses enable the detection of predators and facilitate an acoustic startle response.
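As a rough sanity check on the far-field claim, one can compare the 3 m distance with the acoustic wavelength at 80 Hz. The sketch below uses the common kr = 1 (i.e., wavelength divided by 2π) convention for the near-field/far-field transition; that convention and the round speed-of-sound figure are assumptions for illustration, not values taken from the study.

```python
# Back-of-the-envelope check of the far-field claim (illustrative only: the
# lambda/(2*pi) transition convention and c = 343 m/s are assumptions, not
# values from the study).
import math

c = 343.0   # approximate speed of sound in air at ~20 C, m/s
f = 80.0    # stimulus frequency, Hz
r = 3.0     # spider-to-source distance, m

wavelength = c / f                              # ~4.3 m
near_field_limit = wavelength / (2 * math.pi)   # kr = 1 boundary, ~0.68 m

print(f"wavelength at 80 Hz: {wavelength:.2f} m")
print(f"near-field boundary (lambda / 2*pi): {near_field_limit:.2f} m")
print(f"3 m distance lies in the far field: {r > near_field_limit}")
```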
Edges in natural scenes can result from a number of different causes. In this study, we investigated the statistical differences between edges arising from occlusions and nonocclusions (reflectance differences, surface changes, and cast shadows). In the first experiment, edges in natural scenes were identified using the Canny edge detection algorithm. Observers then classified these edges as either an occlusion edge (one region of an image occluding another) or a nonocclusion edge. The nonocclusion edges were further subclassified as due to a reflectance difference, a surface change, or a cast shadow. We found that edges were equally likely to be classified as occlusion or nonocclusion edges. Of the nonocclusion edges, approximately 33% were classified as reflectance changes, 9% as cast shadows, and 58% as surface changes. We also analyzed local statistical properties of the edges, such as contrast, average edge profile, and slope. We found significant differences in contrast across the categories. Based on the local contrast statistics, we developed a maximum likelihood classifier to label occlusion and nonocclusion edges. An 80%-20% cross-validation demonstrated that the human classification could be predicted with 83% accuracy. Overall, our results suggest that for many edges in natural scenes, there exists local statistical information regarding the cause of the edge. We believe that this information can potentially be used by the early visual system to begin the process of segregating objects from their backgrounds.
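As an illustration of the kind of maximum-likelihood classification described above, the sketch below fits class-conditional Gaussians to a single simulated local-contrast feature on an 80% training split and evaluates on the held-out 20%. The contrast distributions, the Gaussian form, and the single-feature design are assumptions for illustration, not the study's actual features or data.

```python
# Hypothetical sketch of a maximum-likelihood edge classifier: class-conditional
# Gaussians over one local-contrast feature, fit on 80% of labeled edges and
# tested on the held-out 20%. The simulated data and Gaussian assumption are
# illustrative, not the study's pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Simulated local RMS contrast for nonocclusion (0) and occlusion (1) edges.
contrast = np.concatenate([rng.normal(0.15, 0.05, 500),
                           rng.normal(0.35, 0.10, 500)])
labels = np.concatenate([np.zeros(500), np.ones(500)])

# 80%-20% split.
idx = rng.permutation(len(labels))
split = int(0.8 * len(labels))
train, test = idx[:split], idx[split:]

# Fit a Gaussian and an empirical prior for each class on the training set.
params = {}
for c in (0, 1):
    x = contrast[train][labels[train] == c]
    params[c] = (x.mean(), x.std(), len(x) / len(train))

def log_score(x, c):
    """Log of prior-weighted Gaussian likelihood for class c."""
    mu, sigma, prior = params[c]
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) + np.log(prior)

# Classify held-out edges by the higher-scoring class.
pred = (log_score(contrast[test], 1) > log_score(contrast[test], 0)).astype(float)
print("held-out accuracy:", (pred == labels[test]).mean())
```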
Jumping spiders (Salticidae) are renowned for a behavioral repertoire that can seem more vertebrate, or even mammalian, than spider-like in character. This is made possible by a unique visual system that supports their stalking hunting style and elaborate mating rituals in which the bizarrely marked and colored appendages of males highlight their song-and-dance displays. Salticids perform these tasks with information from four pairs of functionally specialized eyes, providing a near 360° field of view and forward-looking spatial resolution surpassing that of all insects and even some mammals, processed by a brain roughly the size of a poppy seed. Salticid behavior, evolution, and ecology are well documented, but attempts to study the neurophysiological basis of their behavior have been thwarted by the pressurized nature of their internal body fluids, which makes typical physiological techniques infeasible and has restricted all previous neural work in salticids to a few recordings from the eyes. We report the first survey of neurophysiological recordings from the brain of a jumping spider, Phidippus audax (Salticidae). The data include single-unit recordings in response to artificial and naturalistic visual stimuli. The salticid visual system is unique in that high-acuity and motion vision are processed by different pairs of eyes. We found nonlinear interactions between the principal and secondary eyes, which can be inferred from the emergence of spatiotemporal receptive fields. Ecologically relevant images, including prey-like objects such as flies, elicited bursts of excitation from single units.
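One standard way to estimate a spatiotemporal receptive field from single-unit responses is spike-triggered averaging; the sketch below applies it to a synthetic white-noise stimulus and a toy model neuron. The stimulus, neuron model, and analysis choices here are assumptions for illustration and are not the recording or analysis methods used in the study.

```python
# Illustrative sketch (not the authors' analysis): recovering a spatiotemporal
# receptive field by spike-triggered averaging of a white-noise stimulus.
import numpy as np

rng = np.random.default_rng(1)

n_frames, n_pixels, n_lags = 20000, 16, 10
stimulus = rng.standard_normal((n_frames, n_pixels))   # white noise, time x 1-D "space"

# Hypothetical ground-truth filter, used only to synthesize spikes for the demo.
true_strf = np.outer(np.hanning(n_lags), np.sin(np.linspace(0, np.pi, n_pixels)))

drive = np.array([np.sum(stimulus[t - n_lags:t] * true_strf)
                  for t in range(n_lags, n_frames)])
rate = np.clip(np.exp(0.5 * drive - 1.0), 0, 10)        # simple exponential nonlinearity
spikes = rng.poisson(rate)

# Spike-triggered average: mean of the stimulus segments preceding each spike.
sta = np.zeros((n_lags, n_pixels))
for t, k in enumerate(spikes, start=n_lags):
    if k:
        sta += k * stimulus[t - n_lags:t]
sta /= spikes.sum()

print("correlation with the true filter:", np.corrcoef(sta.ravel(), true_strf.ravel())[0, 1])
```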
From the earliest stages of sensory processing, neurons show inherent non-linearities: the response to a complex stimulus is not a sum of the responses to a set of constituent basis stimuli. These non-linearities come in a number of forms and have been explained in terms of a number of functional goals. The family of spatial non-linearities includes interactions that occur both within and outside of the classical receptive field: saturation, cross-orientation inhibition, contrast normalization, end-stopping, and a variety of non-classical effects. In addition, neurons show a number of facilitatory and invariance-related effects such as those exhibited by complex cells (integration across position). Here, we describe an approach that attempts to explain many of these non-linearities under a single geometric framework. In line with Zetzsche and colleagues (e.g., Zetzsche et al., 1999), we propose that many of the principal non-linearities can be described by a geometry in which the neural response space has a simple curvature. In this paper, we focus on the geometry that produces both increased selectivity (curving outward) and increased tolerance (curving inward). We demonstrate that overcomplete sparse coding, with both low-dimensional synthetic data and high-dimensional natural scene data, can result in curvature that is responsible for a variety of known non-classical effects, including end-stopping and gain control. We believe that this approach provides a more fundamental explanation of these non-linearities and does not require that one postulate a variety of separate explanations (e.g., that gain must be controlled or that the ends of lines must be detected). In its standard form, however, sparse coding does not produce the invariance/tolerance represented by inward curvature. We speculate on some of the requirements needed to produce such curvature.
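As a minimal, hedged sketch of overcomplete sparse coding of the sort referred to above, the code below runs an ISTA loop to infer an L1-sparse code for 2-D synthetic data under a 4x overcomplete random dictionary. The dictionary size, sparsity penalty, and solver are illustrative assumptions, not the model used in the paper.

```python
# Minimal sketch of overcomplete sparse coding (illustrative assumptions:
# random 4x overcomplete dictionary, L1 penalty, ISTA solver).
import numpy as np

rng = np.random.default_rng(2)

n_dim, n_atoms = 2, 8                       # 4x overcomplete for 2-D synthetic data
D = rng.standard_normal((n_dim, n_atoms))
D /= np.linalg.norm(D, axis=0)              # unit-norm dictionary atoms

def sparse_code(x, D, lam=0.1, n_iter=200):
    """Infer coefficients a minimizing 0.5*||x - D a||^2 + lam*||a||_1 via ISTA."""
    a = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        a = a - step * grad
        a = np.sign(a) * np.maximum(np.abs(a) - step * lam, 0.0)  # soft threshold
    return a

x = rng.standard_normal(n_dim)
a = sparse_code(x, D)
print("active atoms:", int(np.count_nonzero(np.abs(a) > 1e-6)), "of", n_atoms)
print("reconstruction error:", float(np.linalg.norm(x - D @ a)))
```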
The nature of artificial vision with a retinal prosthesis, and the degree to which the brain can adapt to the unnatural input from such a device, are poorly understood. Therefore, the development of current and future devices may be aided by theory and simulations that help to infer and understand what patients see. A novel computational framework was developed to predict visual perception and the effect of learning with a subretinal prosthesis. The framework is based on the idea that the central visual system efficiently reconstructs the incident image from the retinal output. To implement this idea, a simulation of the normal responses of the major retinal ganglion cell types was used to deduce the optimal linear reconstruction of the visual stimulus from retinal activity. The result was then used to make inferences about visual experience with simulated retinal activation by a subretinal prosthesis. The inferred visual perception obtained with prosthesis activation was substantially degraded compared to that obtained with normal retinal responses, as expected given the limited resolution and lack of cell-type specificity of the prosthesis. Consistent with the importance of cell-type specificity, reconstruction using only ON cells, and not OFF cells, was substantially more accurate. Finally, when reconstruction was re-optimized for electrical stimulation, simulating learning by the patient, the accuracy of inferred perception with prosthesis stimulation was closer to that of natural vision. The reconstruction approach provides a framework for interpreting patient data in clinical trials, and may be useful for improving prosthesis design.
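The core decoding idea, optimal linear reconstruction of the stimulus from retinal activity, can be sketched as a least-squares regression from simulated population responses back to pixels. The toy rectified-linear encoding model below is an assumption standing in for the paper's simulation of the major retinal ganglion cell types.

```python
# Hedged sketch of optimal linear reconstruction: least-squares decoding weights
# that map simulated retinal responses back to the stimulus. The encoding model
# here is a toy stand-in, not the paper's simulation of RGC types.
import numpy as np

rng = np.random.default_rng(3)

n_pixels, n_cells, n_samples = 64, 100, 5000
stimuli = rng.standard_normal((n_samples, n_pixels))

# Toy encoding: random linear receptive fields, rectification, and additive noise.
rf = rng.standard_normal((n_pixels, n_cells)) / np.sqrt(n_pixels)
responses = np.maximum(stimuli @ rf, 0) + 0.1 * rng.standard_normal((n_samples, n_cells))

# Optimal linear reconstruction: solve R @ W ~= S for the decoding weights W.
W, *_ = np.linalg.lstsq(responses, stimuli, rcond=None)

# Evaluate the decoder on fresh stimuli.
test = rng.standard_normal((500, n_pixels))
test_resp = np.maximum(test @ rf, 0) + 0.1 * rng.standard_normal((500, n_cells))
recon = test_resp @ W
print("reconstruction correlation:", np.corrcoef(recon.ravel(), test.ravel())[0, 1])
```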