2019
DOI: 10.1152/jn.00301.2019

A passive, camera-based head-tracking system for real-time, three-dimensional estimation of head position and orientation in rodents

Abstract: Tracking head position and orientation in small mammals is crucial for many applications in the field of behavioral neurophysiology, from the study of spatial navigation to the investigation of active sensing and perceptual representations. Many approaches to head tracking exist, but most of them only estimate the 2D coordinates of the head over the plane where the animal navigates. Full reconstruction of the pose of the head in 3D is much more challenging and has been achieved only in a handful of studies.
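The core geometric problem the abstract describes, recovering a rigid 3D pose (rotation plus translation) of the head from tracked points, can be illustrated with the standard Kabsch least-squares alignment. This is a generic sketch, not the paper's own algorithm: the marker geometry, units, and the assumption that 3D marker coordinates are already triangulated are all hypothetical.

```python
import numpy as np

def rigid_pose(model_pts, observed_pts):
    """Estimate rotation R and translation t that map model_pts onto
    observed_pts in the least-squares sense (Kabsch algorithm)."""
    mu_m = model_pts.mean(axis=0)
    mu_o = observed_pts.mean(axis=0)
    # Cross-covariance of centroid-subtracted point sets
    H = (model_pts - mu_m).T @ (observed_pts - mu_o)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_o - R @ mu_m
    return R, t

# Hypothetical head-mounted marker geometry (mm), four non-coplanar points
markers = np.array([[0, 0, 0], [10, 0, 0], [0, 8, 0], [0, 0, 6]], float)

# Simulate a 30-degree yaw (head turn) plus a small displacement
yaw = np.deg2rad(30)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0],
                   [np.sin(yaw),  np.cos(yaw), 0],
                   [0,            0,           1]])
t_true = np.array([5.0, -2.0, 1.0])
observed = markers @ R_true.T + t_true

R_est, t_est = rigid_pose(markers, observed)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

In a real pipeline the observed 3D marker positions would come from multi-camera triangulation, and head yaw/pitch/roll would be read off the estimated rotation matrix.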

Cited by 13 publications (16 citation statements). References 96 publications.
“…They should have equal complexity as the category stimuli; however, they should not resemble the category exemplars because this may cause interference or generalization. Controlling for changes in motor behavior can also be obtained by cameras that track head and eye movements 87,88 …”
Section: Experimental Design
confidence: 99%
“…In the last 20 years, multiple research groups have reported increasingly sophisticated, easy-to-use tools for automated annotation of video data of animal behavior ( Knutsen et al, 2005 ; Voigts et al, 2008 ; Perkon et al, 2011 ; Clack et al, 2012 ; Ohayon et al, 2013 ; Giovannucci et al, 2018 ; Dominiak et al, 2019 ; Vanzella et al, 2019 ; Betting et al, 2020 ; Petersen et al, 2020 ). One natural extension of this ability has been to apply these algorithms for on-line, closed-loop paradigms, where changes in behavior of the animal are detected as rapidly as possible, and the behavior is used to modify or manipulate the brain, the virtual environment or the context of behavior.…”
Section: Discussion
confidence: 99%
“…Traditionally, video data have been analyzed manually. More recently, various algorithms have been developed for automating movement detection ( Knutsen et al, 2005 ; Voigts et al, 2008 ; Perkon et al, 2011 ; Clack et al, 2012 ; Ohayon et al, 2013 ; Giovannucci et al, 2018 ; Dominiak et al, 2019 ; Vanzella et al, 2019 ; Betting et al, 2020 ; Petersen et al, 2020 ). With the development of DeepLabCut, a marker-less pose-estimation toolkit based on deep learning ( Mathis et al, 2018 ), computer vision approaches are being used for monitoring poses of animals and for tracking the movement of virtually any part of the body.…”
Section: Introduction
confidence: 99%
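The citation statements above describe tools that output per-frame 2D keypoint coordinates for tracked body parts. A common downstream step is deriving a planar head-direction angle from two such keypoints. This minimal sketch assumes hypothetical "nose" and "neck" pixel coordinates; it is not taken from any of the cited toolkits.

```python
import numpy as np

def head_direction(nose_xy, neck_xy):
    """2D heading (radians) of the neck->nose vector, as might be
    computed from keypoints produced by a pose-estimation tool."""
    d = np.asarray(nose_xy, float) - np.asarray(neck_xy, float)
    return np.arctan2(d[1], d[0])

# Hypothetical tracked coordinates for one video frame (pixels)
theta = head_direction((120.0, 80.0), (100.0, 80.0))
print(np.degrees(theta))  # 0.0: the animal faces along the +x image axis
```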
“…Rats were tested using the high-throughput behavioral rig described in [15] and previously employed in several investigations of rat object recognition by our group [34, 36–40]. Briefly, the rig consists of six independent operant boxes, each equipped with a computer monitor for stimulus presentation and an array of three response ports for collection of behavioral responses.…”
Section: A Visual Priming Paradigm to Probe Spontaneous Perception of Global Motion Direction
confidence: 99%
“…The behavioral rig was the same previously used in several studies of rat visual perception carried out by our group [34, 36–40]. It consisted of two racks, each equipped with three operant boxes to allow training a batch of six rats simultaneously [15, 35].…”
Section: Behavioral Apparatus and Visual Stimuli
confidence: 99%