Solitary foraging insects display stunning navigational behaviours in visually complex natural environments. Current literature assumes that these insects are mostly driven by attractive visual memories, which are learnt when the insect's gaze is precisely oriented toward the goal direction, typically along its familiar route or towards its nest. That way, an insect could return

This principle can be directly applied to recapitulate idiosyncratic routes, given a set of visual memories stored along them. Agent-based simulations have shown that the 'walking forward in the most familiar direction' rule enables agents to recapitulate routes (8,24,29,38) in a way that closely resembles what is observed in solitary foraging ants (18,39,40). These models are further supported by the well-choreographed 'learning walks' displayed by ants, as well as the equivalent 'learning flights' displayed by bees and wasps (41-45), around their nest (or a food source). During learning walks, ants appear to systematically pause while facing towards the nest (or a food source), presumably to memorise 'correctly aligned' views from multiple locations around the goal (12,14,46-48). Given such a visual memory bank, models show that the exact same principle used for route following is able to produce a search centred on the goal location (24,27). There again, the models' emergent search patterns capture several aspects of the ants' visually driven search (38,46,49,50).
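The 'walking forward in the most familiar direction' rule can be illustrated with a minimal toy sketch. Everything below (the 1-D panorama world, the sum-of-squared-differences familiarity measure, the 10-degree scan step) is a hypothetical simplification for illustration, not the implementation used by any of the cited models, which differ in their view encoding and comparison functions.

```python
import numpy as np

def familiarity(view, memory_bank):
    """Familiarity = negative of the smallest pixel-wise squared
    difference to any stored view (one common proxy; models vary)."""
    diffs = [np.sum((view - m) ** 2) for m in memory_bank]
    return -min(diffs)

def most_familiar_heading(sample_view, memory_bank, headings):
    """Scan candidate headings and return the one whose sampled
    view best matches the memory bank."""
    scores = [familiarity(sample_view(h), memory_bank) for h in headings]
    return headings[int(np.argmax(scores))]

# Toy world: a "view" is a 90-degree slice of a 1-D brightness panorama.
panorama = np.sin(np.linspace(0, 2 * np.pi, 360, endpoint=False))

def sample_view(heading, width=90):
    idx = (np.arange(width) + heading) % 360
    return panorama[idx]

# One 'correctly aligned' view, memorised while facing heading 40.
memory_bank = [sample_view(40)]

headings = list(range(0, 360, 10))
best = most_familiar_heading(sample_view, memory_bank, headings)
print(best)  # 40: the agent recovers the memorised goal-facing direction
```

An agent following a route would repeat this scan-and-step loop at each position, with many views stored along the route rather than one; the same loop, applied to views stored around the nest, produces the goal-centred search the models report.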
The greatest appeal of familiarity-based models is that the comparison of the current and memorised view needs only to output a single familiarity value. To obtain this familiarity value, there is no need to find a correspondence between parts of the visual field and a direction in the world. This contrasts with so-called 'correspondence models', where features must be extracted from panoramas and matched locally. Familiarity-based models do not need retinotopic mapping to be preserved at the comparison stage. This fits well with the connectivity of the insect Mushroom Bodies (51), where memories of visual scenes are likely encoded (8,9). Indeed, in visually navigating insects, the visual input, as perceived through their low-resolution eyes and processed by early optic neural relays (52-54), is then projected to more than 100,000 Mushroom Body cells (Kenyon Cells: KCs) through an apparently random pattern of connections, likely disrupting retinotopic organisation (55-57). Remarkably, given the synaptic plasticity observed at the output of the MB during learning (58-60), the resulting activity of the MB output neurons naturally provides a measure of the familiarity of the current sensory input (8,51).

A simplified model of the ant and bee circuitry shows that the Mushroom Bodies provide enough 'memory space' for an agent to store the visual information required to recapitulate a 10-metre-long route in a naturalistic environment, here again by simply...
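The MB-style familiarity computation described above can be sketched in a few lines. All numbers and mechanisms here (input size, 2,000 KCs, 5% connection probability, top-5% winner-take-all sparsening, depression of KC-to-output synapses for learnt views) are illustrative placeholders chosen for the sketch, not parameters taken from the cited circuit models; the point is only that a random, non-retinotopic fan-out plus output-synapse plasticity yields a scalar familiarity signal.

```python
import numpy as np

rng = np.random.default_rng(0)
N_INPUT, N_KC, N_ACTIVE = 100, 2000, 100  # toy sizes, not biological counts

# Random, retinotopy-destroying fan-out from visual input to Kenyon cells:
# each KC samples ~5% of the input pixels at random.
W = (rng.random((N_KC, N_INPUT)) < 0.05).astype(float)

def kc_code(view):
    """Sparse KC representation: only the most-driven cells fire."""
    drive = W @ view
    active = np.zeros(N_KC, dtype=bool)
    active[np.argsort(drive)[-N_ACTIVE:]] = True
    return active

# Learning = depressing the output synapses of KCs active for stored views,
# standing in for the plasticity observed at the MB output.
out_weights = np.ones(N_KC)

def learn(view):
    out_weights[kc_code(view)] = 0.0

def mb_output(view):
    """Summed drive onto the MB output neuron; low output = familiar."""
    return out_weights[kc_code(view)].sum()

stored, novel = rng.random(N_INPUT), rng.random(N_INPUT)
learn(stored)
print(mb_output(stored), mb_output(novel))  # stored view yields 0 (fully familiar)
```

Note that `mb_output` returns a single scalar with no notion of where in the visual field the match occurred, which is exactly the property that lets the circuit dispense with retinotopic organisation at the comparison stage.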