2017
DOI: 10.1523/eneuro.0245-16.2017
Pixying Behavior: A Versatile Real-Time and Post Hoc Automated Optical Tracking Method for Freely Moving and Head-Fixed Animals

Abstract: Here, we describe an automated optical method for tracking animal behavior in both head-fixed and freely moving animals, in real time and offline. It takes advantage of an off-the-shelf camera system, the Pixy camera, designed as a fast vision sensor for robotics that uses a color-based filtering algorithm at 50 Hz to track objects. Using customized software, we demonstrate the versatility of our approach by first tracking the rostro-caudal motion of individual adjacent row (D1, D2) or arc whiskers (β, γ), or …
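The color-based filtering the abstract describes can be illustrated with a minimal sketch. This is hypothetical code, not the authors' software or the Pixy firmware: it thresholds a frame against a target color and returns the centroid of the matching pixels, which is the essence of color-signature blob tracking. Function and parameter names are illustrative.

```python
import numpy as np

def track_colored_marker(frame_rgb, target_rgb, tol=40):
    """Return the (row, col) centroid of pixels within `tol` of `target_rgb`,
    or None if nothing matches. A toy stand-in for color-signature tracking;
    the tolerance and RGB matching are illustrative simplifications."""
    diff = np.abs(frame_rgb.astype(int) - np.asarray(target_rgb, dtype=int))
    mask = (diff < tol).all(axis=-1)               # per-pixel color match
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)                  # coordinates of matching pixels
    return float(rows.mean()), float(cols.mean())  # blob centroid

# Example: a 100x100 black frame with a red 10x10 patch at rows/cols 40-49
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 40:50] = (255, 0, 0)
print(track_colored_marker(frame, (255, 0, 0)))  # -> (44.5, 44.5)
```

Running this per frame at the camera's rate (50 Hz for the Pixy) yields a real-time position trace of a painted whisker or body part.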

Cited by 35 publications (38 citation statements)
References 41 publications
“…Measuring all of the body part positions from raw images is a challenging computer vision problem. Previous attempts at automated body-part tracking in insects and mammals have relied on either physically constraining the animal and having it walk on a spherical treadmill [17] or linear track [18], applying physical markers to the animal [17,19], or utilizing specialized equipment such as depth cameras [20–22], frustrated total internal reflection imaging [23,24] or multiple cameras [25]. Meanwhile, approaches designed to operate without constraining the natural space of behaviors make use of image processing techniques that are sensitive to imaging conditions and require manual correction even after full training [26].…”
mentioning
confidence: 99%
“…Compared to existing open source software for animal tracking, PolyTouch is the only software that enables rapid (i.e. ~5.7 ms) closed-loop feedback for behavior (see Table 1 for a detailed comparison [23,26,60]). A direct comparison of PolyTouch with a customized video system revealed that the accuracy of PolyTouch is similar to manual offline tracking by an experienced human observer (Figure 2E-F). In terms of processing speed, modern closed-loop systems trigger feedback within 3-40 ms for neural data [11,13] and 12-50 ms for behavioral data [15,16,42,59,60]. PolyTouch can be used to control neural activity based on animal behavior at a temporal scale faster than synaptic communication along sensory pathways [61–66], and thus could be used to create artificial sensory and motor feedback in the context of rapidly evolving behavioral computations [43,67].…”
Section: Comparison To Existing Tracking Systems
mentioning
confidence: 93%
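The closed-loop principle discussed in the statement above, triggering feedback as soon as a tracked position meets a behavioral criterion while keeping per-iteration latency low, can be sketched generically. This is a hypothetical skeleton, not PolyTouch's implementation; all callables (`read_position`, `in_target_zone`, `trigger_feedback`) are placeholders for real tracker and stimulator interfaces.

```python
import time

def closed_loop(read_position, in_target_zone, trigger_feedback, n_steps=1000):
    """Generic closed-loop skeleton: poll the tracker, fire feedback when the
    position meets the criterion, and record each iteration's latency."""
    latencies = []
    for _ in range(n_steps):
        t0 = time.perf_counter()
        pos = read_position()          # e.g. blob centroid from the tracker
        if in_target_zone(pos):
            trigger_feedback(pos)      # e.g. TTL pulse to a stimulator
        latencies.append(time.perf_counter() - t0)
    return latencies

# Toy run with stub callables: position is fixed, criterion always met
events = []
lat = closed_loop(lambda: (5.0, 5.0),
                  lambda p: p[0] > 1.0,
                  events.append,
                  n_steps=10)
print(len(events), len(lat))  # -> 10 10
```

In a real system the loop latency is dominated by sensor readout and feedback hardware, which is why the cited systems report millisecond-scale figures rather than the microseconds of the loop body itself.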
“…Although these systems are fast and reliable, their spatial resolution is heavily limited by the number of sensors deployed. Despite the availability of various other sensors, including microwave-based motion detectors [29], ultrasonic microphones [30,31], radiofrequency detectors [32], global positioning systems [33], and heat (infrared) sensors [34], the majority of existing tracking systems rely on video cameras, as they provide detailed images of whole bodies [4,35–39], individual limbs [40–42], face and whisker motion [43], and eye movements [44,45]. A major drawback, however, is that behavioral classification requires several image processing steps to detect and identify the object of interest, which take several tens of milliseconds using current state-of-the-art algorithms and standard computing infrastructure.…”
Section: Introduction
mentioning
confidence: 99%
“…Data used here were from animals that could move the maze smoothly. Trials were selected for analysis if the whiskers were visible, the paint was glowing uniformly, and if, in the course of the trial, the view of the whiskers was not obstructed by the motion of the animal (Nashaat et al., 2017).…”
Section: Data Selection and Image Analysis
mentioning
confidence: 99%