Sensing is often implicitly assumed to be the passive acquisition of information. However, part of the sensory information is generated actively when animals move. For instance, humans shift their gaze actively in a sequence of saccades towards interesting locations in a scene. Likewise, many insects shift their gaze by saccadic turns of body and head, keeping their gaze fixed between saccades. Here we employ a novel panoramic virtual-reality stimulator and show that motion computation in a blowfly visual interneuron is tuned to make efficient use of the characteristic dynamics of retinal image flow. The neuron is able to extract information about the spatial layout of the environment by exploiting the intervals of stable vision that result from the saccadic viewing strategy. This extraction is possible because the retinal image flow evoked by translation, which contains information about object distances, is confined to low frequencies. This flow component can be derived from the total optic flow between saccades because the residual intersaccadic head rotations are small and encoded at higher frequencies. Information about the spatial layout of the environment can thus be extracted by the neuron in a computationally parsimonious way. These results on neuronal function, obtained with naturalistic, behaviourally generated optic flow, stand in stark contrast to the conclusion drawn from conventional visual stimuli that the neuron primarily represents a detector for yaw rotations of the animal.
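A minimal sketch of the frequency-based separation described above, assuming a sampled intersaccadic optic-flow trace: a simple low-pass filter recovers the translation-dominated component, while the residual high-frequency part reflects the small head rotations. The cutoff frequency, sampling rate, and test signal are illustrative placeholders, not values from the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def split_flow_by_frequency(flow_trace, fs, cutoff_hz=5.0):
    """Separate an intersaccadic optic-flow trace by temporal frequency.

    Assumes, as argued above, that translation-induced flow is confined
    to low frequencies while residual head rotations appear at higher
    frequencies. The 5 Hz cutoff is a placeholder, not a fitted value.
    """
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    translational = filtfilt(b, a, flow_trace)  # low frequencies: translation
    rotational = flow_trace - translational     # high-frequency residue: rotation
    return translational, rotational

# Hypothetical example: 1 s of flow sampled at 500 Hz, composed of a
# slow translational component plus fast rotational jitter.
fs = 500
t = np.arange(0, 1, 1 / fs)
trace = 0.8 * np.sin(2 * np.pi * 1.0 * t) + 0.2 * np.sin(2 * np.pi * 40.0 * t)
translational, rotational = split_flow_by_frequency(trace, fs)
```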
Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish this extraordinary performance, flies and bees have been shown to actively shape, through characteristic behavioral actions, the dynamics of the image flow on their eyes (“optic flow”). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action–perception loop. Model simulations and robotic implementations show that these smart biological mechanisms of motion computation and visually guided flight control may help to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
For many animals, including humans, the optic flow generated on the eyes during locomotion is an important source of information about self-motion and the structure of the environment. The blowfly has been used frequently as a model system for experimental analysis of optic flow processing at the microcircuit level. Here, we describe a model of the computational mechanisms implemented by these circuits in the blowfly motion vision pathway. Although this model was originally proposed on the basis of simple experimenter-designed stimuli, we show that it can also quantitatively predict the responses to the complex dynamic stimuli a blowfly encounters in free flight. In particular, the model visual system exploits the active saccadic gaze and flight strategy of blowflies in a similar way as its neuronal counterpart. The model circuit extracts information about translation velocity in the intersaccadic intervals and thus, indirectly, about the three-dimensional layout of the environment. By stepwise dissection of the model circuit, we determine which of its components are essential for these remarkable features. When accounting for the responses to complex natural stimuli, the model is much more robust against parameter changes than when explaining the neuronal responses to simple experimenter-defined stimuli. In contrast to conclusions drawn from experiments with simple stimuli, optimization of the parameter set for different segments of natural optic flow stimuli does not indicate pronounced adaptational changes of these parameters during long-lasting stimulation.
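The front end of this kind of circuit model is commonly a correlation-type (Hassenstein-Reichardt) elementary motion detector. Below is a minimal, self-contained sketch of one such detector; the low-pass time constant and the test stimulus are illustrative assumptions, not parameters fitted to blowfly data.

```python
import numpy as np

def reichardt_emd(left, right, fs, tau=0.02):
    """Minimal correlation-type (Hassenstein-Reichardt) elementary motion
    detector: each of two neighbouring inputs is delayed by a first-order
    low-pass filter (time constant tau) and multiplied with the undelayed
    signal of the other; the difference of the two half-detector outputs
    is direction selective. tau = 20 ms is an illustrative assumption.
    """
    alpha = (1.0 / fs) / (tau + 1.0 / fs)  # first-order low-pass coefficient

    def lowpass(x):
        y = np.zeros_like(x)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    return lowpass(left) * right - left * lowpass(right)

# A pattern drifting in the preferred direction (right input lags the
# left one) yields a positive mean response; the opposite drift
# direction yields a negative one.
fs = 1000
t = np.arange(0, 1, 1 / fs)
left = np.sin(2 * np.pi * 4 * t)
right = np.sin(2 * np.pi * 4 * t - np.pi / 4)
print(reichardt_emd(left, right, fs).mean())  # > 0, preferred direction
print(reichardt_emd(right, left, fs).mean())  # < 0, null direction
```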
A high-speed panoramic visual stimulation device is introduced that is suitable for analysing visual interneurons during stimulation with rapid image displacements such as those experienced by fast-moving animals. The responses of an identified motion-sensitive neuron in the visual system of the blowfly to behaviourally generated image sequences are very complex and hard to predict from the established input circuitry of the neuron. This finding suggests that the computational significance of visual interneurons can only be assessed if they are characterised not only by the conventional stimuli often used for systems analysis, but also by behaviourally relevant input.
Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, whether searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by the properties of optic flow experienced on a spherical eye during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components through behavior, i.e. by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e. during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Because the responses of these detectors depend not only on velocity but also on the texture and contrast of objects, they do not measure object velocity veridically. Therefore, we initially used geometrically determined optic flow as input to the collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors at its input. Even then, the algorithm successfully produced collision avoidance and, in addition, replicated the characteristics of the collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
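A minimal sketch of the geometric core of such a scheme: during pure translation, the flow magnitude on a spherical eye is |p| = v·sin(θ)/D, where θ is the angle between viewing direction and heading and D the distance, so relative nearness 1/D can be recovered and a nearness-weighted sum of viewing directions points toward the closest obstacles. As in the first test described above, the flow here is geometrically determined; the scene, the horizontal-plane sampling, and the plain nearness weighting are illustrative simplifications, not the exact published algorithm.

```python
import numpy as np

# Viewing directions sampled in the horizontal plane of a spherical eye
az = np.linspace(-np.pi, np.pi, 72, endpoint=False)
view_dirs = np.stack([np.cos(az), np.sin(az), np.zeros_like(az)], axis=1)

# Hypothetical scene: an obstacle 0.5 m away on the left, 3 m elsewhere
dist = np.where(np.abs(az - np.pi / 2) < 0.4, 0.5, 3.0)

v_dir, v_speed = np.array([1.0, 0.0, 0.0]), 1.0  # translation straight ahead

# Geometrically determined translational flow: |p| = v * sin(theta) / D
sin_theta = np.clip(np.linalg.norm(np.cross(view_dirs, v_dir), axis=1), 1e-6, None)
flow_mag = v_speed * sin_theta / dist

# Invert the relation to recover relative nearness (1/D) from the flow
nearness = flow_mag / (v_speed * sin_theta)

# A nearness-weighted sum of viewing directions points at the nearest
# obstacles; steering opposite to it avoids the collision
weighted_sum = (nearness[:, None] * view_dirs).sum(axis=0)
avoid_dir = -weighted_sum / np.linalg.norm(weighted_sum)
print(avoid_dir)  # points away from the near obstacle on the left
```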