2020
DOI: 10.1371/journal.pcbi.1007631
Opponent processes in visual memories: A model of attraction and repulsion in navigating insects’ mushroom bodies

Abstract: Solitary foraging insects display stunning navigational behaviours in visually complex natural environments. Current literature assumes that these insects are mostly driven by attractive visual memories, which are learnt when the insect's gaze is precisely oriented toward the goal direction, typically along its familiar route or towards its nest. That way, an insect could return home by simply moving in the direction that appears most familiar. Here we show using virtual reconstructions of natural environments…
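The homing scheme the abstract describes — steer in whatever direction looks most familiar, with opponent attractive (goal-facing) and repulsive (anti-goal) memory banks — can be sketched minimally. This is an illustrative assumption-laden toy, not the paper's implementation: the view representation (flat arrays), the RMS familiarity metric, and the bank contents are all placeholders.

```python
import numpy as np

def familiarity(view, memory_bank):
    # Best (smallest) root-mean-square difference between the current
    # view and any stored snapshot, negated so higher = more familiar.
    # (Illustrative metric; the paper's model is not reproduced here.)
    return -min(float(np.sqrt(np.mean((view - m) ** 2))) for m in memory_bank)

def opponent_drive(view, attractive_bank, repulsive_bank):
    # Opponent signal: familiarity of goal-facing (attractive) memories
    # minus familiarity of anti-goal (repulsive) memories. A positive
    # drive means "keep this heading"; a negative one means "turn away".
    return familiarity(view, attractive_bank) - familiarity(view, repulsive_bank)
```

A heading whose view matches the attractive bank yields a positive drive, while one matching the repulsive bank yields a negative drive, so the sign alone can steer the agent.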

Cited by 58 publications (70 citation statements)
References 106 publications (210 reference statements)
“…Views facing the nest may as well be included during learning and categorised as left, right or both, explaining why most ants facing their goal usually choose to turn in one particular direction. Revisiting current questions in insect and robot navigation, such as early exploration, route following and homing [20,46-49]; the integration of aversive memories [8,24,50], of path integration and views [51-54], or of other sensory modalities [55-58]; as well as the search for underlying neural correlates [5-7] — with such a lateralised design as a framework — promises an interesting research agenda.…”
Section: Discussion
confidence: 99%
See 1 more Smart Citation
“…Views facing the nest may as well be included during learning and categorised as left, right or both, explaining why most ants facing their goal usually choose to turn in one particular Revisiting current questions in insect and robot navigation such as early exploration, route following and homing 20,[46][47][48][49] ; the integration of aversive memories 8,24,50 , path integration and views ( [51][52][53][54] or other sensory modalities ( 55-58 as well as seeking for underlying neural correlates [5][6][7] -with such a lateralised design as a framework promises an interesting research agenda.…”
Section: Discussionmentioning
confidence: 99%
“…Previous studies assumed that ants memorise views while facing the goal [20-22] and anti-goal [23-25] directions, and that they must consequently align their body in these same directions to recognise a learnt view as familiar [26-28]. On the contrary, our modelling effort suggests that ants should rather recognise views based on whether the route direction stands on their 'left or right' rather than 'in front or behind'.…”
Section: The Recognition Of Familiar Views Triggers Compensatory Left…
confidence: 99%
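The left-or-right recognition scheme in the statement above can be illustrated with a toy classifier: views learnt with the route direction on the ant's left are stored in one bank, those with it on the right in another, and recognition triggers a turn toward the better-matching side. The banks, the RMS matching, and the two-way output below are illustrative assumptions, not the cited model.

```python
import numpy as np

def compensatory_turn(view, left_bank, right_bank):
    # Match the current view against snapshots learnt with the route
    # on the ant's left vs. on its right, then turn toward whichever
    # bank matches best (toy rule for illustration only).
    def best(bank):
        return min(float(np.sqrt(np.mean((view - m) ** 2))) for m in bank)
    return "left" if best(left_bank) < best(right_bank) else "right"
```

Note that, unlike goal-facing snapshot matching, this rule never requires the agent to align its body with the learnt direction before recognition.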
“…Numerous variants of homing models have been described and tested in the last decades, be they using frequency-based [41], rotation-invariant [26], brightness [30], skyline [27], or optic-flow representations [42]; a single snapshot [30]; multiple snapshots [36,43]; or attractive and repulsive views [38]. We first sought parsimonious models to predict the bumblebees' search location in visual conflict situations.…”
Section: Homing Models
confidence: 99%
“…Of all the listed models, only the model proposed by Le Möel et al. 2020 [38] uses several snapshots to guide the agent home. Hence, the other mentioned models [26,27,37,41] are likely to suffer from object occlusion, as the B1 and CwN1 models do.…”
Section: Perspective On Other Models Enabling Homing Without View Rotation
confidence: 99%
“…Equally, investigations of the real-life computational properties of navigation-relevant neural circuits are currently hampered by limitations in the way visual information can be presented in electrophysiology rigs (see, e.g., Table 1). There are currently no projection devices that can convey the full information content of the spatial, spectral, and polarization signal patterns that characterize natural navigation environments. Lastly, the navigational competence of insects is based on active learning processes (e.g., Collett and Zeil, 2018; Jayatilaka et al, 2018; Zeil and Fleischmann, 2019) and relies on the active comparison between remembered and currently experienced input patterns (e.g., Zeil, 2012; Le Möel and Wystrach, 2020; Murray et al, 2020). It is thus likely that the neural machinery underlying navigation is heavily state-, context- and activity-dependent, requiring closed-loop control of the visual scene by the insect and control by the experimenter over the experience (What has been learned?…”
Section: Introduction
confidence: 99%