Arrays of up to six broadband suction cup hydrophones were placed on the foreheads of two bottlenose dolphins to determine the location where the beam axis emerges and to examine how signals in the acoustic near-field relate to signals in the far-field. Four array geometries were used: a linear array with hydrophones arranged along the midline of the forehead; two arrays around the front of the melon, at 1.4 and 4.2 cm above the rostrum insertion; and one array across the melon covering locations not sampled by the other configurations. The beam axis was found to lie close to the midline of the melon, approximately 5.4 cm above the rostrum insertion, for both animals. The signal path coincided with the low-density, low-velocity core of the melon; however, the data suggest that the signals are focused mainly by the air sacs. Slight asymmetry was found in the signals, with higher amplitudes on the right side of the forehead. Although the signal waveforms measured on the melon appeared distorted, when they were mathematically summed in the far-field, taking into account their relative times of arrival, the resultant waveform matched that measured by a hydrophone located at 1 m.
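The summation step described above is essentially delay-and-sum beamforming: each near-field channel is time-shifted by its relative arrival delay before the channels are added. The sketch below illustrates the idea on synthetic pulses; the sample rate, delays, and pulse parameters are assumptions for illustration, not values from the study.

```python
import numpy as np

# Synthetic stand-ins for near-field hydrophone recordings: short pulses
# with different relative arrival times (assumed values, not study data).
fs = 500_000                      # sample rate in Hz (assumed)
t = np.arange(0, 200e-6, 1 / fs)  # 200 microseconds of samples

rng = np.random.default_rng(0)
n_hydrophones = 4
delays_s = rng.uniform(0, 20e-6, n_hydrophones)  # relative arrival times
signals = [
    np.exp(-((t - 60e-6 - d) ** 2) / (2 * (8e-6) ** 2))
    * np.cos(2 * np.pi * 110e3 * (t - 60e-6 - d))
    for d in delays_s
]

def delay_and_sum(signals, delays_s, fs):
    """Advance each channel by its relative delay, then sum.

    Aligning the channels before summation approximates the waveform
    that the constructively interfering wavefronts produce in the far field.
    """
    aligned = []
    for sig, d in zip(signals, delays_s):
        shift = int(round(d * fs))
        aligned.append(np.roll(sig, -shift))  # advance by the delay
    return np.sum(aligned, axis=0)

far_field = delay_and_sum(signals, delays_s, fs)
```

Because the shifted pulses line up, the summed peak exceeds any single channel's peak, mirroring the paper's observation that individually distorted near-field signals combine into a clean far-field click.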
Animals rely on sensory feedback from their environment to guide locomotion. For instance, visually guided animals use patterns of optic flow to control their velocity and to estimate their distance to objects (e.g., Srinivasan et al., 1991, 1996). In this study, we investigated how acoustic information guides locomotion of animals that use hearing as a primary sensory modality to orient and navigate in the dark, where visual information is unavailable. We studied flight and echolocation behaviors of big brown bats as they flew under infrared illumination through a corridor with walls constructed from a series of individual vertical wooden poles. The spacing between poles on opposite walls of the corridor was experimentally manipulated to create dense/sparse and balanced/imbalanced spatial structure. The bats’ flight trajectories and echolocation signals were recorded with high-speed infrared motion-capture cameras and ultrasound microphones, respectively. As bats flew through the corridor, successive biosonar emissions returned cascades of echoes from the walls of the corridor. The bats flew through the center of the corridor when the pole spacing on opposite walls was balanced and closer to the side with wider pole spacing when opposite walls had an imbalanced density. Moreover, bats produced shorter duration echolocation calls when they flew through corridors with smaller spacing between poles, suggesting that clutter density influences features of the bat’s sonar signals. Flight speed and echolocation call rate did not, however, vary with dense and sparse spacing between the poles forming the corridor walls. Overall, these data demonstrate that bats adapt their flight and echolocation behavior dynamically when flying through acoustically complex environments.
Animals enhance sensory acquisition from a specific direction by movements of head, ears, or eyes. As active sensing animals, echolocating bats also aim their directional sonar beam to selectively “illuminate” a confined volume of space, facilitating efficient information processing by reducing echo interference and clutter. Such sonar beam control is generally achieved by head movements or shape changes of the sound-emitting mouth or nose. However, lingual-echolocating Egyptian fruit bats, Rousettus aegyptiacus, which produce sound by clicking their tongue, can dramatically change beam direction at very short temporal intervals without visible morphological changes. The mechanism supporting this capability has remained a mystery. Here, we measured signals from free-flying Egyptian fruit bats and discovered a systematic angular sweep of beam focus across increasing frequency. This unusual signal structure has not been observed in other animals and cannot be explained by the conventional and widely-used “piston model” that describes the emission pattern of other bat species. Through modeling, we show that the observed beam features can be captured by an array of tongue-driven sound sources located along the side of the mouth, and that the sonar beam direction can be steered parsimoniously by inducing changes to the pattern of phase differences through moving tongue location. The effects are broadly similar to those found in a phased array—an engineering design widely found in human-made sonar systems that enables beam direction changes without changes in the physical transducer assembly. Our study reveals an intriguing parallel between biology and human engineering in solving problems in fundamentally similar ways.
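The steering mechanism proposed above can be illustrated with a toy phased array: several point sources along a line, where imposing a linear phase gradient across the sources redirects the main lobe without moving any source. All parameters below (element count, spacing, frequency) are assumptions for illustration, not measurements from the study.

```python
import numpy as np

def array_factor(theta, n=8, d=0.004, freq=30e3, c=343.0, steer_deg=0.0):
    """Normalized far-field magnitude of an n-element line array.

    theta: array of observation angles in radians.
    d: element spacing in meters (kept below half a wavelength here,
       so no grating lobes appear).
    steer_deg: desired main-lobe direction, achieved purely by the
       per-element phase offsets -- the phased-array principle.
    """
    k = 2 * np.pi * freq / c          # wavenumber
    steer = np.deg2rad(steer_deg)
    elems = np.arange(n)
    # Each element's phase: propagation term minus the steering offset.
    phase = k * d * elems[:, None] * (np.sin(theta)[None, :] - np.sin(steer))
    return np.abs(np.exp(1j * phase).sum(axis=0)) / n

theta = np.deg2rad(np.linspace(-90, 90, 721))
broadside = array_factor(theta, steer_deg=0.0)   # no phase gradient
steered = array_factor(theta, steer_deg=20.0)    # phase gradient applied

peak_broadside = np.rad2deg(theta[np.argmax(broadside)])
peak_steered = np.rad2deg(theta[np.argmax(steered)])
```

With no phase gradient the beam points broadside (0°); with the gradient it points near 20°, even though the element positions never change. This is the engineering analogue of the tongue-driven source array the study proposes.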