The need for automated and efficient systems for tracking full animal pose has increased with the complexity of behavioral data and analyses. Here we introduce LEAP (LEAP estimates animal pose), a deep-learning-based method for predicting the positions of animal body parts. This framework consists of a graphical interface for labeling body parts and training the network. LEAP offers fast prediction on new data, and training with as few as 100 frames yields 95% of peak performance. We validated LEAP using videos of freely behaving fruit flies and tracked 32 distinct points to describe the pose of the head, body, wings and legs, with an error rate of <3% of body length. We recapitulated reported findings on insect gait dynamics and demonstrated LEAP’s applicability for unsupervised behavioral classification. Finally, we extended the method to more challenging imaging situations and videos of freely moving mice.
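The error metric reported above (landmark prediction error as a percentage of body length) can be sketched as follows. This is an illustrative computation only, not LEAP's actual evaluation code; the function name and array layout are assumptions.

```python
import numpy as np

def pose_error_percent_body_length(pred, true, body_length):
    """Mean Euclidean landmark error as a percentage of body length.

    pred, true: arrays of shape (n_frames, n_parts, 2) holding (x, y)
    coordinates in pixels. body_length: animal body length in pixels.
    """
    # Per-frame, per-landmark Euclidean distance between prediction and label.
    err = np.linalg.norm(pred - true, axis=-1)
    # Average over all frames and landmarks, then normalize by body length.
    return 100.0 * err.mean() / body_length
```

For example, if every predicted landmark is off by 5 pixels and the fly's body length is 250 pixels, the reported error would be 2% of body length.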
The E-Cadherin-catenin complex plays a critical role in epithelial cell-cell adhesion, polarization, and morphogenesis. Here, we have analyzed the mechanism of Drosophila E-Cadherin (DE-Cad) localization. Loss of function of the Drosophila exocyst components sec5, sec6, and sec15 in epithelial cells results in DE-Cad accumulation in an enlarged Rab11 recycling endosomal compartment and inhibits DE-Cad delivery to the membrane. Furthermore, Rab11 and Armadillo interact with the exocyst components Sec15 and Sec10, respectively. Our results support a model whereby the exocyst regulates DE-Cadherin trafficking, from recycling endosomes to sites on the epithelial cell membrane where Armadillo is located.
The generation of acoustic communication signals is widespread across the animal kingdom, and males of many species, including Drosophilidae, produce patterned courtship songs to increase their chance of success with a female. For some animals, song structure can vary considerably from one rendition to the next; neural noise within pattern generating circuits is widely assumed to be the primary source of such variability, and statistical models that incorporate neural noise are successful at reproducing the full variation present in natural songs. In direct contrast, here we demonstrate that much of the pattern variability in Drosophila courtship song can be explained by taking into account the dynamic sensory experience of the male. In particular, using a quantitative behavioural assay combined with computational modelling, we find that males use fast modulations in visual and self-motion signals to pattern their songs, a relationship that we show is evolutionarily conserved. Using neural circuit manipulations, we also identify the pathways involved in song patterning choices and show that females are sensitive to song features. Our data not only demonstrate that Drosophila song production is not a fixed action pattern, but establish Drosophila as a valuable new model for studies of rapid decision-making under both social and naturalistic conditions.
The desire to understand how the brain generates and patterns behavior has driven rapid methodological innovation in tools to quantify natural animal behavior. While advances in deep learning and computer vision have enabled markerless pose estimation in individual animals, extending these to multiple animals presents unique challenges for studies of social behaviors or animals in their natural environments. Here we present Social LEAP Estimates Animal Poses (SLEAP), a machine learning system for multi-animal pose tracking. This system enables versatile workflows for data labeling, model training and inference on previously unseen data. SLEAP features an accessible graphical user interface, a standardized data model, a reproducible configuration system, over 30 model architectures, two approaches to part grouping and two approaches to identity tracking. We applied SLEAP to seven datasets across flies, bees, mice and gerbils to systematically evaluate each approach and architecture, and we compare it with other existing approaches. SLEAP achieves greater accuracy and speeds of more than 800 frames per second, with latencies of less than 3.5 ms at full 1,024 × 1,024 image resolution. This makes SLEAP usable for real-time applications, which we demonstrate by controlling the behavior of one animal on the basis of the tracking and detection of social interactions with another animal.