2021
DOI: 10.1101/2021.04.30.442096
Preprint

Multi-animal pose estimation and tracking with DeepLabCut

Abstract: Estimating the pose of multiple animals is a challenging computer vision problem: frequent interactions cause occlusions and complicate the association of detected keypoints with the correct individuals, and the animals often look extremely similar and interact more closely than in typical multi-human scenarios. To take up this challenge, we build on DeepLabCut, a popular open-source pose estimation toolbox, and provide high-performance animal assembly and tracking, features required for robust multi-animal scenarios. […]


Cited by 69 publications (71 citation statements); references 47 publications.
“…Identities are determined by gender-specific actions and propagated to adjacent frames based on behavioral continuity. Compared with the latest version of DeepLabCut, 31 our approach achieves more than two times better performance (see Figure S1F, percentage of correct keypoints [PCK]: 86.85% versus 36.75%), more than four times shorter training iterations to maximum performance (6,400 versus 25,600), and more than four times smaller mean error (137.5 versus 539.1 mm) in our behavior assay.…”
Section: Unbiased Social Behavior Quantification via Deep Learning (mentioning)
confidence: 93%
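The comparison above is reported in terms of the percentage of correct keypoints (PCK): a predicted keypoint counts as correct if it lies within some distance threshold of its ground-truth location. As a minimal sketch of how this metric is computed (not the cited authors' evaluation code), the Python snippet below uses a fixed pixel threshold and toy coordinates; both are assumptions, since PCK variants normalize the threshold differently (e.g., by animal length or bounding-box size).

```python
import numpy as np

def pck(pred, gt, threshold):
    """Percentage of Correct Keypoints.

    pred, gt: (n_keypoints, 2) arrays of (x, y) coordinates; NaNs mark
    unlabeled keypoints. A prediction is correct if its Euclidean error
    is at most `threshold` (here taken to be in pixels).
    """
    errors = np.linalg.norm(pred - gt, axis=-1)   # per-keypoint Euclidean error
    labeled = ~np.isnan(errors)                   # ignore missing ground-truth labels
    return float(np.mean(errors[labeled] <= threshold))

# Toy example: 3 of 4 keypoints fall within a 5-pixel threshold -> PCK = 0.75
gt   = np.array([[10.0, 10.0], [20.0, 20.0], [30.0, 30.0], [40.0, 40.0]])
pred = np.array([[11.0, 10.0], [22.0, 21.0], [36.0, 34.0], [40.0, 41.0]])
print(pck(pred, gt, threshold=5.0))
```

A higher PCK at the same threshold means more keypoints are localized accurately, which is why the quoted gap of 86.85% versus 36.75% is substantial.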
“…[24][25][26] Recently, several tools have further enabled keypoint detection using machine learning. [27][28][29][30][31] However, there is substantial performance decrease in multianimal scenarios, especially for overlapping objects commonly seen in social behavior. In this study, we developed a deep-learning-based pipeline together with a labeled dataset for high precision detection of interacting flies.…”
Section: Introduction (mentioning)
confidence: 99%
“…Compared to DeepLabCut, another open-source environment, which is capable of tracking multiple individuals (Mathis et al., 2018; Nath et al., 2019), our choice was based on the comparative ease to install SLEAP on local PCs and the stability of the GUI on our PC (Windows 10, processor: Intel® Xeon® CPU E3-1270 v5 @ 3.60 GHz, working memory: 32 GB RAM). However, our approach could also be implemented in DeepLabCut or similar constantly evolving tracking toolboxes (Lauer et al., 2021). To enhance the efficiency of model training and tracking, we decided to move to Colab.…”
Section: Discussion (mentioning)
confidence: 99%
“…For instance, DeepLabCut can be trained to recognize animal body parts in a wide range of scenarios as long as the video recordings are of sufficient quality. Moreover, DeepLabCut even allows the tracking of multiple animals at the same time (Lauer et al, 2021). In its current version, however, Visual Field Analysis allows the tracking of only one animal at a time, within an orthogonal arena.…”
Section: Discussion (mentioning)
confidence: 99%
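Several of the statements above cite the multi-animal tracking that this preprint added to DeepLabCut. For readers unfamiliar with that workflow, here is a minimal sketch of the documented multi-animal (maDLC) steps, assuming a multi-animal project and trained network already exist; the paths are placeholders, and exact function signatures and defaults vary across DeepLabCut versions, so treat this as illustrative rather than as the authors' exact pipeline.

```python
import deeplabcut

# Placeholders: an existing multi-animal DeepLabCut project and a video to analyze.
config = "/path/to/maDLC_project/config.yaml"
videos = ["/path/to/social_behavior_video.mp4"]

# Detect keypoints and assemble them into individual animals in each frame.
deeplabcut.analyze_videos(config, videos)

# Link per-frame assemblies across neighboring frames into short tracklets.
deeplabcut.convert_detections2tracklets(config, videos)

# Stitch tracklets into one continuous trajectory per animal.
deeplabcut.stitch_tracklets(config, videos)
```

The assembly, tracklet, and stitching stages correspond to the "animal assembly and tracking" features named in the abstract; recent DeepLabCut releases can also chain these steps automatically from analyze_videos.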