Visual navigation in ants: from visual ecology to brain

Lead Research Organisation: University of Sussex
Department Name: Sch of Life Sciences

Abstract

All animals have a basic sense of direction, but most can also learn cues in their environment that enable them to navigate between familiar locations. For both humans and ants, these learnt guidance cues are primarily visual, and autonomous robots are being developed that use similar cues. Our goal is to understand how the tiny brain of the ant is capable of supporting navigational feats - without GPS - that surpass those of any current robot, and often those of humans.

Ants are under strong evolutionary pressure to be successful navigators: to survive, they need to forage for food and bring it back to their nest. They have been shown to rapidly learn visual cues surrounding nest and food locations, and to develop memories for long routes through complex terrain between these key locations. But what is stored in memory, and how is it used to guide behaviour? Alternative hypotheses include: A) they store 'snapshots' of the surrounding scene from a particular point of view and try to match these when returning to the same place; B) they detect prominent landmarks or features, and use those locations to triangulate their position; C) they process the visual scene into a compact and robust internal representation that they can use flexibly to recognise their current location and to determine the correct course to a goal.
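Hypothesis A is commonly formalised as rotational image matching: a stored panoramic snapshot is compared with the current view at every candidate heading, and the heading giving the smallest pixel difference is taken as the direction to move. The sketch below is purely illustrative of that idea, assuming grayscale panoramic images held as NumPy arrays; the function names are ours and do not refer to any existing codebase.

```python
import numpy as np

def rotational_difference(snapshot, current_view):
    """Root-mean-square pixel difference between a stored panoramic snapshot
    and the current view, for every possible azimuthal rotation of the view.

    Both inputs are assumed to be grayscale panoramas of identical shape
    (rows = elevation, columns = azimuth).
    """
    snap = snapshot.astype(float)
    view = current_view.astype(float)
    n_cols = snap.shape[1]
    diffs = np.empty(n_cols)
    for shift in range(n_cols):
        rotated = np.roll(view, shift, axis=1)  # rotate the view in azimuth
        diffs[shift] = np.sqrt(np.mean((snap - rotated) ** 2))
    return diffs

def best_heading_deg(snapshot, current_view):
    """Azimuthal rotation (degrees) at which the view best matches the snapshot."""
    diffs = rotational_difference(snapshot, current_view)
    return 360.0 * np.argmin(diffs) / len(diffs)
```

Under this scheme, an ant (or robot) that turns until the difference is minimised ends up facing roughly the direction it faced when the snapshot was stored, which is what turns snapshot matching from a recognition test into a guidance mechanism.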

We will use a mixture of experiments and modelling on wood ants to investigate these possibilities. Our approaches will include 'top-down' analysis of what information is actually available for navigation in the natural scenery of the wood ant habitat, and 'bottom-up' investigation of which brain areas seem to be crucial in navigational tasks. The latter approach is motivated by evidence from other insects that memories of patterns (consistent with hypothesis A) are stored in a brain area called the mushroom bodies, whereas abstracted directional information (consistent with hypothesis C) is processed in a different area, the central complex. We will carry out the first ever experiments to test whether brain lesions in these areas affect visual navigation in ants.

We will also develop a novel treadmill system in which ants can be placed on a sphere and walk freely while their forward and rotational motion is compensated to keep them in the same position and orientation. This will allow us to test ants in a virtual world in which we can independently manipulate different parts or properties of the visual scene, and track the immediate effect on navigational decisions. This work will be complemented by an equivalent 'virtual ant' simulation that can be tested with the same experimental stimuli. The brain of our virtual ant will contain computational algorithms corresponding to the hypotheses above, so that we can predict what the real ant should do if each hypothesis is correct. We will also test neural network models corresponding to the brain circuits targeted in the lesioning experiments.
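To illustrate how such a 'virtual ant' could be organised, the sketch below shows a minimal closed-loop agent: at each step the virtual world renders the view at the agent's pose, a navigation model (any of hypotheses A-C behind a common interface) returns a steering command, and the pose is updated. The interfaces `world.render_view(...)` and `model.steer(...)` are hypothetical placeholders, not part of any existing simulation package.

```python
import numpy as np

class Agent:
    """Minimal simulated ant: a 2D position and a heading (radians)."""
    def __init__(self, x=0.0, y=0.0, heading=0.0, speed=0.01):
        self.x, self.y, self.heading, self.speed = x, y, heading, speed

    def step(self, turn):
        """Apply a steering command (radians) and take one step forward."""
        self.heading = (self.heading + turn) % (2 * np.pi)
        self.x += self.speed * np.cos(self.heading)
        self.y += self.speed * np.sin(self.heading)

def run_trial(world, model, agent, n_steps=1000):
    """Closed-loop simulation: render view, query model, update pose.

    `world` must provide render_view(x, y, heading) -> panoramic image,
    `model` must provide steer(view) -> turn angle in radians.
    Both are assumed, duck-typed interfaces standing in for the real components.
    """
    path = []
    for _ in range(n_steps):
        view = world.render_view(agent.x, agent.y, agent.heading)
        turn = model.steer(view)
        agent.step(turn)
        path.append((agent.x, agent.y, agent.heading))
    return np.array(path)
```

Because the treadmill presents the real ant with controlled stimuli in a comparable closed loop, model and animal can in principle be run through matched trials, so that each hypothesis yields directly testable predictions.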

Understanding the ant brain should give us insight into navigational mechanisms used by other animals, including humans, and also suggest new solutions for technology.

Technical Summary

We will use complementary methodological approaches to understand the nature of the visual memory that supports navigation in ants. We will develop procedures for carrying out brain lesions in ants, targeting a range of locations in the central complex and mushroom body, and investigate the consequences for performance in visual navigation tasks. Subsequent histology will allow us to correlate lesion locations with behavioural deficits. In parallel, we will establish a new experimental system using a compensatory treadmill to allow precise control over the visual stimulation provided to freely walking ants. This method will enable extremely rapid and minimally invasive transfer of ants from a conventional arena training paradigm to this controlled testing paradigm, supporting high-throughput experiments. These experimental methods will be coupled with analytical approaches to the information content of natural scenes from the ant habitat, to refine the stimulus paradigms and provide realistic input to computational models. An agent model (a simulated ant moving through a virtual world) will allow us to test specific algorithms for visual navigation under precisely parallel conditions to the animal, and thus allow us to devise crucial paradigms for the experimental system under which alternative models make different predictions. In particular, we will examine which eye regions are critical, what image information content is essential, and which encoding and retrieval schemes most efficiently and effectively account for navigational behaviour. In the same agent model, we will also test more detailed models of the relevant brain circuitry, to understand how it could support such processing, and close the loop with predictions for new trackball and lesion studies and potential extensions towards single-cell electrophysiology of neurons in relevant brain regions.
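As an indication of how the information content of natural scenes might be probed, the sketch below shows two simple ways of degrading a panoramic image before it is used for matching: block-averaging down to roughly ant-eye resolution, and reducing the scene to a skyline profile. The specific parameter values (output resolution, sky threshold) are illustrative assumptions, not measurements from this project.

```python
import numpy as np

def downsample_panorama(image, out_shape=(10, 72)):
    """Block-average a panorama down to a coarse, roughly ant-eye resolution
    (~5 degrees per pixel in azimuth here; an assumed value).
    The input is assumed to be larger than the requested output."""
    rows, cols = image.shape
    r_bin, c_bin = rows // out_shape[0], cols // out_shape[1]
    trimmed = image[:r_bin * out_shape[0], :c_bin * out_shape[1]].astype(float)
    return trimmed.reshape(out_shape[0], r_bin, out_shape[1], c_bin).mean(axis=(1, 3))

def skyline_profile(image, sky_threshold=200):
    """Reduce a panorama to a 1D profile: for each azimuth column, the
    fraction of pixels brighter than a simple 'sky' intensity threshold."""
    return (image.astype(float) > sky_threshold).mean(axis=0)
```

Each degraded encoding can then be passed through the same rotational matching procedure to ask how much heading information it still carries.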

Planned Impact

Insect-inspired visual algorithms have many potential applications in technology, as they offer efficient and economical solutions to real-world problems, including detection and tracking, visual stabilisation, impact prediction, and the focus of our study, navigation. Sensor and computational systems directly derived from research on insect behaviour and neurobiology have been applied to car safety systems, autonomous air vehicles and robotics. In most cases this has involved relatively simple reflex behaviours, where biological knowledge of the mechanisms is better established. Through the current research we plan to make a significant advance in understanding the more complex algorithms that allow insects to navigate robustly and reliably, and we will actively explore the potential for transfer to engineering in two main domains.

1) Supporting human navigation through the development of apps for mobile devices. While satnav has revolutionised human way-finding, there are still a number of scenarios in which it is unavailable or inaccurate, or simply not appropriate to the task, such as remembering routes through corridors in a multi-storey building. This is both an everyday problem and one that is important in specific contexts such as hospitals, where age-related memory loss complicates an already challenging task. An insect-inspired approach has a number of advantages in this context: it does not rely on object recognition; it relies on the global scene and is thus relatively robust to local change; it does not involve heavy computation, so it is viable to implement on small, cheap devices; it depends largely on a single visual input (i.e. without direct depth information), so it can exploit the ubiquity of cameras on hand-held devices. In the Pathways to Impact we describe our specific plans to develop and evaluate such a mobile phone app.

2) Supporting robot and autonomous vehicle navigation. Our intent here is not to challenge the current paradigm in autonomous car and conventional robotics development, which relies on building complete 3D representations of the environment to plan motion, but rather to focus on systems that are suitable for small, cheap robots with relatively limited computational power. A target application is agriculture, which is recognised as an industry with huge potential to benefit from increased automation if the problems of dealing with complex, cluttered natural terrain (very different to road systems) can be solved. Many of the advantages described above for the insect-inspired approach translate equally to this scenario, with the additional advantage that the navigational solutions of insects evolved for precisely such environments. Both Edinburgh and Sussex are already active in robotics research and have excellent contacts and infrastructure to take this work forward, as described in the Pathways to Impact. Solutions in this domain may generalise to other important areas such as environmental monitoring and clean-up.
Both of the technologies described above have obvious potential for societal benefits and link to RCUK national priorities. In addition, ant navigation research is a very effective topic for public communication of science, as the problem faced by the animals is easy to present and imagine, but the abilities exhibited by this 'mere insect' are highly surprising and impressive. We already have extensive involvement in outreach activities that present the 'navigation challenge' faced by ants to humans and use this to generate interest in, and understanding of, biology and computation, with the additional side-effect of providing data on human navigation. As described in the Pathways to Impact, we plan to expand this approach into a web-based citizen science experiment. This will provide better transfer of information than traditional science outreach, through both higher-quality engagement and greater quantity of involvement.

Publications

Buehlmann C (2020) Multimodal interactions in insect navigation. Animal Cognition.

Stankiewicz J (2021) Looking down: a model for visual route following in flying insects. Bioinspiration & Biomimetics.

Vega Vermehren JA (2020) Multimodal influences on learning walks in desert ants (Cataglyphis fortis). Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology.

Buehlmann C (2023) Impact of central complex lesions on innate and learnt visual navigation in ants. Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology.

D Fernandes AS (2018) Visual Classical Conditioning in Wood Ants. Journal of Visualized Experiments (JoVE).

David Fernandes A (2020) Lateralization of short- and long-term visual memories in an insect. Proceedings of the Royal Society B: Biological Sciences.

Niven JE (2018) Insights into the evolution of lateralization from the insects. Progress in Brain Research.

 
Description We have established that it is possible to get ants to walk for extended periods (an hour or more) on a motion compensator designed and built as part of this project. This allows novel experiments to be undertaken.

We have also shown it is possible to make targeted lesions of specific brain areas in ants and observe behavioural effects that are consistent with our models of visual navigation.

We have demonstrated for the first time that the Mushroom Bodies of the ant brain are necessary for navigation using learnt visual information.

We have demonstrated that lateralized lesions to the central complex of ants impact turning behaviour during active visual guidance.

We have developed a novel model of the interaction between the Mushroom Body and Central Complex which can account for innate and learnt visual behaviour, and for the interplay between the two.
Exploitation Route Both the motion compensator and the lesion methodologies should be of use to other researchers in this field. We have also collected sets of natural image statistics that may be useful to other researchers.

Intellectually, our novel results have changed the landscape of the study of insect navigation.
Sectors Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software); Manufacturing, including Industrial Biotechnology

 
Description Emergent embodied cognition in shallow, biological and artificial, neural networks
Amount £200,036 (GBP)
Funding ID BB/X01343X/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 02/2023 
End 08/2024
 
Title Ant's Eye Image Database 
Description The data consist of 200 images taken from an ant's perspective on the ground in Abbot's Wood, Sussex, UK. The images are from a fisheye camera (Kodak PixPro) and represent a hemisphere of the visual world. They will be used to investigate optimal visual encodings that could be implemented by insects as part of their visual navigation. The data were collected as part of projects funded by the Engineering and Physical Sciences Research Council under grant no. EP/P006094/1 (Brains on Board) and the BBSRC grant "Visual navigation in ants: from visual ecology to brain" (BB/R005036/1). 
Type Of Material Database/Collection of data 
Year Produced 2018 
Provided To Others? No  
Impact The database will allow the testing of navigation algorithms 
URL https://sussex.figshare.com/s/84e08b50778830b574d8
 
Title Data for research article "Innate visual attraction in wood ants is a hardwired behaviour seen across different motivational and ecological contexts" 
Description Data for a paper published on bioRxiv, Feb 2021 (pre-print). The data contain paths from individually recorded ants during the experiments (saved as MATLAB files). You will need access to the MATLAB environment to view these files. For details please see the README.txt file and the experiment methods in the paper. Abstract: Ants are expert navigators combining innate and learnt navigational strategies. Whereas we know that the ants' feeding state segregates visual navigational memories in ants navigating along a learnt route, it is an open question whether the motivational state also affects the ants' innate visual preferences. Wood ant foragers show an innate attraction to conspicuous visual cues. These foragers inhabit cluttered woodland habitat and feed on honeydew from aphids on trees; hence, the attraction to 'tree-like' objects might be an ecologically relevant behaviour that is tailored to the wood ants' foraging ecology. Foragers from other ant species with different foraging ecologies show very different innate attractions. We investigated here the innate visual response of wood ant foragers with different motivational states, i.e. unfed or fed, as well as males that have a short life span and show no foraging activity. Our results show that ants from all three groups orient towards a prominent visual cue, i.e. the wood ants' innate visual attraction is not context dependent, but a hardwired behaviour seen across different motivational and ecological contexts. 
Type Of Material Database/Collection of data 
Year Produced 2021 
Provided To Others? Yes  
URL https://sussex.figshare.com/articles/dataset/Data_for_research_article_Innate_visual_attraction_in_w...
 
Title A motion compensation treadmill for untethered wood ants 
Description The natural scale of insect navigation during foraging makes it challenging to study under controlled conditions. Virtual reality and trackball setups have offered experimental control over visual environments while studying tethered insects, but potential limitations and confounds introduced by tethering motivates the development of alternative untethered solutions. In this paper, we validate the use of a motion compensator (or 'treadmill') to study visually driven behaviour of freely moving wood ants (Formica rufa). We show how this setup allows naturalistic walking behaviour and preserves foraging motivation over long time frames. Furthermore, we show that ants are able to transfer associative and navigational memories from classical maze and arena contexts to our treadmill. Thus, we demonstrate the possibility to study navigational behaviour over ecologically relevant durations (and virtual distances) in precisely controlled environments, bridging the gap between natural and highly controlled laboratory experiments. 
Type Of Technology Systems, Materials & Instrumental Engineering 
Year Produced 2019 
Impact This technology has enabled a new generation of behavioural experiments with visually navigating wood ants. 
URL https://jeb.biologists.org/content/223/24/jeb228601