Many species of fish rely on their visual systems to interact with conspecifics, and these interactions can lead to collective behavior. Individual-based models have been used to predict collective interactions; however, these models generally make simplistic assumptions about sensory systems, which are applied to different species without proper empirical testing. This could limit our ability to predict (and test empirically) collective behavior in species with very different sensory requirements. In this study, we characterized components of the visual system in two species of cyprinid fish known to engage in visually dependent collective interactions (zebrafish Danio rerio and golden shiner Notemigonus crysoleucas) and derived quantitative predictions about the positioning of individuals within schools. We found that both species had relatively narrow binocular and blind fields and wide visual coverage. However, golden shiners had more visual coverage in the vertical plane (binocular field extending behind the head) and higher visual acuity than zebrafish. The centers of acute vision (areae) of both species projected into the fronto-dorsal region of the visual field, but those of the zebrafish projected more dorsally than those of the golden shiner. Based on this visual sensory information, we predicted that: (a) predator detection time could increase by >1,000% in zebrafish and >100% in golden shiners with an increase in nearest neighbor distance, (b) zebrafish schools would have a higher roughness value (surface area/volume ratio) than those of golden shiners, and (c) nearest neighbor distance would need to vary from 8 to 20 cm for individuals to visually resolve conspecific striping patterns in both species. Overall, accounting for between-species differences in the sensory systems of species exhibiting collective behavior could change predictions about the positioning of individuals in the group as well as the shape of the school, which has implications for group cohesion. We suggest that more effort should be invested in assessing the role of the sensory system in shaping the local interactions that drive collective behavior.
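The link between visual acuity and the predicted 8-20 cm nearest neighbor range follows from basic visual-angle geometry: a stripe pattern remains resolvable only while its viewed spatial frequency stays at or below the eye's acuity. The Python sketch below illustrates that calculation; the acuity and stripe-period values are illustrative assumptions, not measurements reported in the study.

```python
import math

def max_resolving_distance_cm(acuity_cpd: float, stripe_period_cm: float) -> float:
    """Largest distance (cm) at which a repeating stripe pattern of the given
    spatial period can still be resolved by an eye with the given acuity.

    Under the small-angle approximation, one stripe cycle of period p at
    distance d subtends (180 / pi) * (p / d) degrees, so the viewed spatial
    frequency is d * pi / (180 * p) cycles/degree. The pattern is resolvable
    while that frequency does not exceed the acuity (cycles/degree).
    """
    return acuity_cpd * stripe_period_cm * 180.0 / math.pi

# Illustrative values only (not the study's measurements): an acuity of
# ~1 cycle/degree and a ~0.25 cm stripe period give a maximum resolving
# distance of roughly 14 cm, within the 8-20 cm range discussed above.
print(round(max_resolving_distance_cm(acuity_cpd=1.0, stripe_period_cm=0.25), 1))
```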
Avian species vary in their visual system configuration, but previous studies have often compared single visual traits between two or three distantly related species. However, birds rely on multiple visual dimensions that cannot all be maximized simultaneously to meet different perceptual demands, potentially leading to trade-offs between visual traits. We studied the degree of inter-specific variation in multiple visual traits related to foraging and anti-predator behaviors in nine species of closely related emberizid sparrows, controlling for phylogenetic effects.
Collective behaviour models can predict the behaviours of schools, flocks, and herds. However, in many cases these models make biologically unrealistic assumptions about the sensory capabilities of the organism, which are applied across different species. We explored how sensitive collective behaviour models are to these sensory assumptions. Specifically, we used parameters reflecting the visual coverage and visual acuity that determine the spatial range over which an individual can detect and interact with conspecifics. Using metric and topological collective behaviour models, we compared the classic sensory parameters typically used to model birds and fish with a set of realistic sensory parameters obtained through physiological measurements. Compared with the classic sensory assumptions, the realistic assumptions increased perceptual ranges, which led to fewer groups and larger group sizes in all species, and to higher polarity values and slightly shorter neighbour distances in the fish species. Overall, classic visual sensory assumptions are not representative of many species showing collective behaviour and unrealistically constrain their perceptual ranges. More importantly, caution must be exercised when empirically testing the predictions of these models in terms of choosing the model species, making realistic predictions, and interpreting the results.
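For readers unfamiliar with how visual coverage enters metric and topological models, below is a minimal sketch of the neighbour-selection step: neighbours in a rear blind sector are discarded, and the remainder are filtered either by a fixed interaction radius (metric) or by rank (topological). All parameter names and default values are assumptions for illustration, not the parameters used in the study.

```python
import numpy as np

def visible_neighbors(positions, headings, focal, *,
                      mode="metric", radius=5.0, k=7, blind_angle_deg=25.0):
    """Return indices of conspecifics the focal individual can interact with.

    positions : (N, 2) array of x, y coordinates
    headings  : (N,) array of heading angles in radians
    mode      : "metric" keeps neighbours within `radius`;
                "topological" keeps the `k` nearest visible neighbours
    Neighbours inside the rear blind sector (total width `blind_angle_deg`,
    centred directly behind the focal individual) are discarded, which is
    how visual coverage constrains the interaction rule.
    """
    offsets = np.delete(positions, focal, axis=0) - positions[focal]
    others = np.delete(np.arange(len(positions)), focal)
    dists = np.linalg.norm(offsets, axis=1)

    # Bearing of each neighbour relative to the focal heading, wrapped to [-pi, pi].
    bearings = np.arctan2(offsets[:, 1], offsets[:, 0]) - headings[focal]
    bearings = (bearings + np.pi) % (2 * np.pi) - np.pi

    # Visual coverage: drop anything inside the rear blind sector.
    half_blind = np.radians(blind_angle_deg) / 2.0
    seen = np.abs(np.pi - np.abs(bearings)) > half_blind

    if mode == "metric":
        return others[seen & (dists <= radius)]
    # Topological: the k nearest among the visible neighbours.
    visible = others[seen]
    order = np.argsort(dists[seen])
    return visible[order[:k]]
```

Widening visual coverage (a smaller `blind_angle_deg`) or the interaction radius enlarges the perceptual range, which is the mechanism behind the fewer, larger groups reported above.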
Animals move their heads and eyes to compensate for movements of the body and background, and to search for, fixate, and visually track objects. Avian saccadic head/eye movements have been shown to vary considerably between species. We tested the hypothesis that the configuration of the retina (i.e., changes in retinal ganglion cell density from the retinal periphery to the center of acute vision, the fovea) would account for the inter-specific variation in avian head/eye movement behavior. We characterized retinal configuration, head movement rate, and degree of eye movement in 29 bird species with a single fovea, controlling for the effects of phylogenetic relatedness. First, we found that the avian fovea is displaced from the retinal center toward the dorso-temporal region of the retina. Second, species with a more pronounced rate of change in ganglion cell density across the retina generally showed a higher degree of eye movement and a higher head movement rate, likely because a smaller retinal area with relatively high visual acuity creates a greater need to move the head/eye to align this foveal area with objects of interest. Our findings have implications for anti-predator behavior, as many predator-prey interaction models assume that the sensory systems of prey (and hence their behavior) vary little between species.
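One simple way to quantify a "rate of change in ganglion cell density" is the slope of (log) density against distance from the fovea along a retinal transect. The sketch below illustrates that idea with made-up sample counts; it is an assumed metric for illustration, not necessarily the exact gradient measure used in the study.

```python
import numpy as np

# Hypothetical RGC densities (cells/mm^2) sampled along a transect from the
# retinal periphery toward the fovea; distances are in mm from the fovea.
distance_mm = np.array([4.0, 3.0, 2.0, 1.0, 0.5, 0.0])
density = np.array([4_000, 6_500, 11_000, 22_000, 35_000, 60_000])

# A simple gradient metric: the slope of log-density vs. distance, so that
# species are compared on proportional rather than absolute change.
slope, intercept = np.polyfit(distance_mm, np.log(density), 1)
print(f"log-density increase per mm toward the fovea: {-slope:.2f}")
```

A steeper slope corresponds to a smaller retinal region of high acuity, which under the hypothesis above predicts more frequent head/eye movements.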