Path integration is a widespread navigational strategy in which directional changes and distance covered are continuously integrated on an outward journey, enabling a straight-line return home. Bees use vision for this task: a celestial-cue-based visual compass and an optic-flow-based visual odometer. The underlying neural integration mechanisms, however, are unknown. Using intracellular electrophysiology, we show that polarized-light-based compass neurons and optic-flow-based speed-encoding neurons converge in the central complex of the bee brain, and through block-face electron microscopy we identify potential integrator cells. Based on plausible output targets for these cells, we propose a complete circuit for path integration and steering in the central complex, with anatomically identified neurons suggested for each processing step. The resulting model circuit is thus fully constrained biologically and provides a functional interpretation for many previously unexplained architectural features of the central complex. Moreover, we show that the receptive fields of the newly discovered speed neurons can support path integration for the holonomic motion (i.e., a ground velocity that is not precisely aligned with body orientation) typical of bee flight, a feature not captured in any previously proposed model of path integration. In a broader context, the model circuit provides a general mechanism for producing steering signals by comparing current and desired headings, suggesting a more basic function for central-complex connectivity, from which path integration may have evolved.
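The computational core of path integration under holonomic motion can be illustrated with a minimal sketch: body-frame velocity (which includes a sideways component, since bee ground velocity need not align with heading) is rotated into the world frame and accumulated into a home vector. All names and parameters below are illustrative assumptions, not the paper's neural model or notation.

```python
import math

def integrate_path(samples):
    """Accumulate a home vector from (heading, v_parallel, v_lateral, dt) samples.

    Holonomic motion: ground velocity has a component along the body axis
    (v_parallel) and a sideways component (v_lateral), so the travel direction
    need not coincide with heading. Angles are in radians.
    """
    x = y = 0.0
    for heading, v_par, v_lat, dt in samples:
        # Rotate the body-frame velocity into the world frame.
        vx = v_par * math.cos(heading) - v_lat * math.sin(heading)
        vy = v_par * math.sin(heading) + v_lat * math.cos(heading)
        x += vx * dt
        y += vy * dt
    # The home vector points from the current position back to the start.
    home_dir = math.atan2(-y, -x)
    home_dist = math.hypot(x, y)
    return home_dir, home_dist
```

For example, a purely sideways flight (heading fixed at 0, all velocity lateral) still yields a correct home vector, which a model that equates travel direction with heading would get wrong.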
Oriented behaviour is present in almost all animals, indicating that it is an ancient faculty that emerged in animal brains hundreds of millions of years ago. Although many complex navigation strategies have been described, each can be broken down into a series of elementary navigational decisions. At each moment in time, an animal has to compare its current heading with its desired direction and compensate for any mismatch by producing a steering response either to the right or to the left. Unlike reflex-driven movements, target-directed navigation is not only initiated in response to sensory input but also takes into account previous experience and motivational state. Once a series of elementary decisions is chained together to form one of many coherent navigation strategies, the animal can pursue a navigational target, e.g. a food source, a nest entrance, or a constant flight direction during migration. Insects show a great variety of complex navigation behaviours and, owing to their small brains, research into the neural circuits controlling navigation has made substantial progress in recent years. A brain region as ancient as insects themselves, the central complex, has emerged as the likely navigation centre of the brain. Research across many species has shown that the central complex contains circuitry that might comprise the neural substrate of elementary navigational decisions. While this region is also involved in a wide range of other functions, we hypothesise in this review that its role in mediating the animal's next move during target-directed behaviour is its ancestral function, around which other functions have been layered over the course of evolution.
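The elementary navigational decision described above reduces to a signed comparison of two angles. A minimal sketch, under the assumption that a positive output means "turn left" and a negative output "turn right" (a convention chosen here for illustration, not taken from the review):

```python
import math

def steering_command(current, desired):
    """Return a signed steering response from current and desired headings.

    Positive = turn left (counter-clockwise), negative = turn right.
    Angles in radians; the error is wrapped into (-pi, pi] so the animal
    always turns the shorter way around.
    """
    diff = desired - current
    return math.atan2(math.sin(diff), math.cos(diff))
```

The wrapping step matters: without it, a desired heading just across the +/-pi boundary would produce a long turn in the wrong direction.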
Reliable vision in dim light depends on the efficient capture of photons. Moreover, visually guided behaviour requires reliable signals from the photoreceptors to generate appropriate motor reactions. Here, we show that at the behavioural low-light threshold, cockroach photoreceptors respond to moving gratings with single-photon absorption events known as 'quantum bumps' at rates at or below 0.1 s⁻¹. By performing behavioural experiments and intracellular recordings from photoreceptors under identical stimulus conditions, we demonstrate that continuous modulation of the photoreceptor membrane potential is not necessary to elicit visually guided behaviour. The results indicate that in cockroach motion detection, massive temporal and spatial pooling takes place throughout the eye under dim conditions, involving currently unknown neural processing algorithms. The extremely high night-vision capability of the cockroach visual system provides a roadmap for bio-mimetic imaging design.
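The pooling argument can be made concrete with a back-of-the-envelope expectation. At ~0.1 bumps per second, a single photoreceptor sees on the order of one photon per 10-second window, far too sparse to track motion on its own; summing over many ommatidia and over time multiplies the available events. The cell count and window length below are illustrative assumptions, not measured parameters from the study.

```python
def expected_bumps(rate_per_cell, n_cells, window_s):
    """Expected number of quantum bumps available when pooling signals
    across n_cells photoreceptors over a window of window_s seconds,
    assuming independent Poisson photon arrivals at rate_per_cell (s^-1)."""
    return rate_per_cell * n_cells * window_s

# One cell at 0.1 s^-1 over 10 s: ~1 bump, hopelessly sparse.
single = expected_bumps(0.1, 1, 10.0)
# Pooling over, say, 1000 ommatidia over the same window: ~1000 bumps.
pooled = expected_bumps(0.1, 1000, 10.0)
```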
Ectoedemia argyropeza (Zeller, 1839) possesses a compound eye that exhibits features of both apposition and superposition type eyes. Like apposition eyes, the eye of E. argyropeza lacks a clear zone, which in superposition eyes separates the distal dioptric from the proximal light-perceiving structures. On the other hand, a tracheal layer around the proximal ends of the rhabdom as well as a well-developed corneal nipple array on the corneal surfaces are features that E. argyropeza shares with the larger moths. Unique, and so far not seen to this extreme degree in any other insect, is the hourglass shape of E. argyropeza's rhabdom, in which two almost equally voluminous regions (one distal, one proximal, each formed by seven rhabdomeres) are connected by a narrow, waist-like region of the retinula. An eighth retinula cell, not participating in rhabdom formation, is developed as a basal cell just above the basement membrane. The eye responds with photomechanical changes to dark/light adaptation, but while the proximal rhabdom moiety slightly expands (as expected) in the dark, the distal rhabdom increases its diameter only upon light adaptation. Owing to the tandem position of the two rhabdom moieties, the distally placed rhabdom is favoured in the light-adapted state, while the proximal rhabdom plays a more important role at low ambient light levels. With screening pigments withdrawn, the tracheal tapetum exposed, and distal rhabdom diameters reduced, the proximal rhabdom, enlarged in the dark, is then in a position to capture photons that have entered the eye not only through the ommatidial window above but through other facets as well, even in the absence of a clear zone and superposition optics.
Ideally, neuronal functions would be studied by performing experiments with unconstrained animals whilst they behave in their natural environment. Although this is currently not feasible for most animal models, one can mimic the natural environment in the laboratory by using a virtual-reality (VR) environment. Here we present a novel VR system based upon a spherical projection of computer-generated images using a modified commercial data projector with an add-on fish-eye lens. This system provides equidistant visual stimulation with extensive coverage of the visual field, high spatio-temporal resolution and flexible stimulus generation using a standard computer. It also includes a track-ball system for closed-loop behavioural experiments with walking animals. We present a detailed description of the system and characterise it thoroughly. Finally, we demonstrate the VR system's performance in closed-loop conditions by showing the movement trajectories of cockroaches during exploratory behaviour in a VR forest.
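The closed-loop principle behind such a track-ball setup can be sketched in a few lines: each ball reading (forward displacement and yaw change) advances a virtual camera pose, and the renderer then draws the scene from that pose. The interface below is hypothetical, chosen for illustration; it is not the paper's actual software.

```python
import math

def update_pose(x, y, theta, d_forward, d_turn):
    """Advance the virtual camera pose from one track-ball reading.

    d_forward: ball displacement along the animal's body axis (virtual metres);
    d_turn: yaw change (radians). Returns the new (x, y, theta) pose, which
    would be handed to the renderer each frame to close the loop.
    """
    theta = (theta + d_turn) % (2.0 * math.pi)
    x += d_forward * math.cos(theta)
    y += d_forward * math.sin(theta)
    return x, y, theta
```

In a real system this update would run once per frame, with the displacement values derived from the optical sensors tracking the ball's rotation.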