2017
DOI: 10.1101/161232
Preprint

Virtual Reality system for freely-moving rodents

Abstract: Spatial navigation, active sensing, and most cognitive functions rely on a tight link between motor output and sensory input. Virtual reality (VR) systems simulate the sensorimotor loop, allowing flexible manipulation of enriched sensory input. Conventional rodent VR systems provide 3D visual cues linked to restrained locomotion on a treadmill, leading to a mismatch between visual and most other sensory inputs, sensory-motor conflicts, as well as restricted naturalistic behavior. To rectify these limitations, …
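To make the abstract's notion of "simulating the sensorimotor loop" concrete, the sketch below shows the per-frame closed loop that a freely-moving VR system of this kind has to run: read the animal's tracked head pose, move the virtual camera accordingly, and redraw the scene. The Tracker and Renderer classes and the run_closed_loop helper are hypothetical placeholders for illustration only, not the preprint's actual software or API.

# Minimal sketch of the sensorimotor loop behind a freely-moving VR system:
# track the animal's head pose, move the virtual camera to match, redraw.
# Tracker and Renderer are hypothetical placeholders, not the preprint's code.

import time
from dataclasses import dataclass


@dataclass
class HeadPose:
    x: float    # arena coordinates, metres
    y: float
    z: float
    yaw: float  # heading, radians


class Tracker:
    """Stand-in for a real-time 3D head-tracking source (e.g. multi-camera)."""

    def latest_pose(self) -> HeadPose:
        # A real tracker would return the most recently measured pose here.
        return HeadPose(0.0, 0.0, 0.05, 0.0)


class Renderer:
    """Stand-in for a perspective-corrected projector/display renderer."""

    def draw(self, pose: HeadPose) -> None:
        # A real renderer would place the virtual camera at the animal's
        # eye position and redraw the 3D scene onto the arena surface.
        print(f"render frame from ({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f})")


def run_closed_loop(tracker: Tracker, renderer: Renderer,
                    n_frames: int = 300, hz: float = 60.0) -> None:
    """Couple motor output (the animal's own movement) back to visual input."""
    period = 1.0 / hz
    for _ in range(n_frames):
        pose = tracker.latest_pose()  # consequence of the animal's self-motion
        renderer.draw(pose)           # visual input updated from the new viewpoint
        time.sleep(period)            # crude frame pacing for the sketch


if __name__ == "__main__":
    run_closed_loop(Tracker(), Renderer())

In a restrained treadmill setup only the locomotion signal feeds this loop, whereas tracking a freely moving animal closes the loop for head position and orientation as well, which is the mismatch the abstract refers to.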

Cited by 22 publications (32 citation statements)
References 56 publications
“…Finally, the superior colliculus, a key structure for controlling head and eye movements in non-human primates (Freedman et al, 1996), likely plays a major role in controlling these movements in rodents (Wang et al, 2015, Wilson et al, 2018, Masullo et al, 2019). Advanced techniques for detailed tracking of head and eye movement (Meyer et al, 2018, Voigts and Harnett, 2019) and virtual reality for visual stimulus control in freely behaving mice (Stowers et al, 2017, Del Grosso et al, 2017) can now be combined with powerful tools to measure and manipulate neural activity (Luo et al, 2008). This provides a unique opportunity to establish the neural circuits that underlie the different types of eye-head coupling.…”
Section: Brain Mechanisms
Citation type: mentioning, confidence: 99%
“…saccadic suppression, Duffy and Burchfiel, 1975), for the computation of the mismatch between sensory input and expected input (Keller et al, 2012), or for the integration of sensory inputs with signals related to spatial navigation (Saleem et al, 2013). We anticipate that important progress can be made by combining our method with new tools for virtual reality in freely moving animals (Stowers et al, 2017; Del Grosso et al, 2017) to provide both detailed behavioral and stimulus control.…”
Section: Discussion
Citation type: mentioning, confidence: 99%
“…[flattened table row from the citing review, listing arena/projection VR setups with optical tracking or an active treadmill, by species: fruit fly (real-time 3D tracking [32], vision-induced motion [72], flight pattern [73]), spider (navigation [64,72]), ant (foraging [18]), fish (social behavior [73]), rodent [20,73] and rodent [80], plus reviews on neuroscience [8,27] and VR for animals [72]; listed features: freely moving animals, real-time perspective correction, underwater projection, arbitrarily surfaced arena, support for multiple species, configurable software.] The authors claimed that they were able to study depth perception in fruit flies with the system, which was not possible in previously designed open-loop methods.…”
Section: Digital
Citation type: mentioning, confidence: 99%