When tasting food, animals rely on chemical and tactile cues, which inform the decision of whether to eat it. Because the nutritional composition of food has enormous consequences for an animal's survival, food items should generally be tasted before they are eaten or collected for later consumption. Although recent studies have confirmed the importance of gustatory cues, little is known, compared with olfaction, about how chemotactile stimuli are represented at the receptor level (let alone in higher brain centers) in animals other than vertebrates. To better understand how invertebrates may process chemotactile cues, we used bumblebees as a model species and combined electroantennographic (EAG) recordings with a novel technique for chemotactile antennal stimulation in bees. The recorded EAG responses to chemotactile stimulation clearly separated volatile compounds by both identity and concentration, and the method could be applied successfully to test the receptor activity evoked by different types of pollen. We found that two pollen types (apple and almond), which bumblebees readily distinguished in a classical conditioning task, evoked significantly distinct neural activity as early as the antennal receptor level. Our novel stimulation technique therefore enables the investigation of chemotactile sensing, which is crucial for assessing the nutritional quality of food during foraging. It can further be applied to other chemosensory behaviors, such as mate or nestmate recognition, or to investigate whether toxic substances, e.g., in pollen, affect the neuronal separation of different food types.
Multisensory integration plays a central role in perception, as most behaviors require input from several sensory channels. For instance, a foraging honeybee must combine olfactory and visual cues to categorize a food source as a flower. Homing after successful foraging, by contrast, relies on celestial cues and the panoramic scenery and may be dominated by visual input. Hence, depending on the context, one modality may take the lead and influence the processing of the others. To unravel the complex neural mechanisms behind this process, we studied honeybee mushroom body output neurons (MBONs), which represent the first processing level after olfactory-visual convergence in the honeybee brain. We physiologically confirmed this in a previous study by characterizing a subpopulation of multisensory MBONs that categorize incoming sensory inputs into olfactory, visual, and olfactory-visual information. In addition to these multisensory units, however, a prominent population of MBONs was sensitive to visual cues only. We therefore asked which visual features are represented at this high-order integration level. Using extracellular multi-unit recordings combined with visual and olfactory stimulation, we separated MBONs with multisensory responses from purely visually driven MBONs. Further analysis revealed, for the first time, that visually driven MBONs of both groups encode detailed aspects of this single modality, such as light intensity and light identity. Moreover, we show that these features are encoded by distinct MBON subpopulations, which extract, for example, information about brightness or wavelength. Most interestingly, the wavelength-tuned population separated UV light from other light stimuli, which were themselves only poorly differentiated from each other. A third MBON subpopulation was tuned to neither brightness nor wavelength and encoded the general presence of light. Taken together, our results support the view that the mushroom body, a high-order center for sensory integration, learning, and memory in the insect brain, categorizes sensory information by separating behaviorally relevant aspects of the multisensory scenery, and that these categories are channeled into distinct MBON subpopulations.