Insect-inspired visually guided autonomous route navigation through natural environments

Lead Research Organisation: University of Sussex
Department Name: Sch of Engineering and Informatics

Abstract

Our overall objective is to develop algorithms for long-distance route-based visual navigation through complex natural environments. Despite recent advances in autonomous navigation, especially in map-based simultaneous localisation and mapping (SLAM), guiding a return to a goal location through unstructured, natural terrain remains an open issue and an active area of research. Despite their small brains and noisy, low-resolution sensors, insects navigate through such environments with a level of performance that outstrips state-of-the-art robot algorithms. It is therefore natural to take inspiration from insects. There is a history of bio-inspired navigation models in robotics, but known components of insect behaviour have yet to be incorporated into engineering solutions. In contrast with most modern robotic methods, insects navigate between two locations using procedural route knowledge rather than mental maps. An important feature of route navigation is that the agent does not need to know where it is at every point (in the sense of localising itself within a cognitive map), but rather what it should do. Insects provide further inspiration for navigation algorithms through their innate behavioural adaptations, which simplify navigation through unstructured, cluttered environments.

One objective is to develop navigation algorithms which capture the elegance and desirable properties of insect homing strategies: robustness (in the face of natural environmental variation), parsimony (of mechanism and visual encoding), speed of learning (insects must learn from their first excursion) and efficacy (the scale over which insects forage). Before this, we will bring together current insights into insect behaviour with novel technologies which allow us to recreate visual input from the perspective of foraging insects. This will lead to new tools for biologists and increase our understanding of insect navigation. To achieve these goals, our Work Packages will be:

WP1: Development of tools for reconstructing large-scale natural environments. We will adapt an existing panoramic camera system to enable reconstruction of the visual input experienced by foraging bees. Similarly, we will adapt new computer vision methods to enable us to build world models of the cluttered habitats of ants.

WP2: Investigation of optimal visual encodings for navigation. Using the world model developed in WP1, we will investigate the stability and performance of different ways of encoding a visual scene.

WP3: Autonomous route navigation algorithms. We will test a recently developed model of route navigation and augment it for robust performance in natural environments.

Our approach in this project is novel and timely. The panoramic camera system has just been developed at Sussex. The methods for building world models have only recently become practical and have not yet been applied in this context. The proposed route navigation methodology is newly developed at Sussex and is based on insights into insect behaviour only recently observed. Increased knowledge of route navigation will be of interest to engineers and biologists. Parsimonious route-following algorithms will be of use in situations where an agent must reliably navigate between two locations, such as a robotic courier or search-and-rescue robot. Our algorithms also have potential broader applications, such as improving guidance aids for the visually impaired.
Biologists and the wider academic community will be able to use the tools developed to gain an understanding of the visual input during behavioural experiments, leading to a deeper understanding of target systems. There is specific current interest from Rothamsted Agricultural Institute in how changes in flight patterns affect the visual input and navigational efficacy of honeybee foragers from colonies affected by factors such as pesticides or at risk of colony collapse disorder.

Planned Impact

The cross-disciplinary work will be of interest to engineers and biologists in academia and industry, as well as the general public and schools. It fits within EPSRC's Control Systems and Robot Engineering area and Cross-Disciplinary programme.

Academia: Robust outdoor route navigation is an active area of research, and our algorithms will be of interest to engineers and computer scientists working on ground-based and airborne guidance. Moreover, tools for reconstructing environments will allow these groups to test their own models, while our adaptation of Dense Scene Reconstruction will interest those working on 3D depth map reconstruction. Similarly, stable visual feature extraction in natural habitats will aid feature selection and data association problems in robotics and computer vision applications more generally. Biologists working on insect visual behaviour will benefit from software tools and data from the project by reconstructing the sensory input experienced during their experiments, thus tying sensation to action. We will also produce testable hypotheses on the visual features and algorithms used for navigation. Specifically, Rothamsted Agricultural Institute will use our tools to interpret the effect of altered flight patterns of honeybees from infected colonies. More generally, the navigational strategies of insects resemble, to a surprising extent, those of animals with much larger brains, a similarity likely to have arisen through convergent evolution of navigational mechanisms; understanding how insects operate, and the cognitive processes involved, is therefore of interest to neuroscientists and psychologists.

Industry: As their interests often align with those of academics, the novel technologies produced will be of interest for industrial robotic and computer vision applications in natural environments. Insect-inspired control systems are widely used in autonomous robotics for navigation and exploration, and in the control of unmanned air vehicles in particular. More generally, route navigation and visual feature selection will impact visual guidance systems (e.g. for the visually impaired). Finally, the games industry is a major driver of 3D mapping, and adaptations of these technologies to large-scale environments will interest them. Through Rothamsted, the application of our work to pollinator flight paths will be of interest to farmers and industry concerned with crop pollination. Agro-chemical industries can assess the impact of altered flight patterns of bees exposed to pesticide (or disease) on the visual information perceived, and whether this means, for instance, that they are unable to learn the hive position, and can plan mitigation strategies accordingly. Beekeepers are always fascinated by how their bees behave and will take great interest in the tools developed and the window they give on the sensory consequences of bee flight.

General Public and Schools: The public is fascinated by insects and their lifestyles, and by the ways in which we use robots and technology to study them. Moreover, the promise of autonomous travel always sparks interest in the benefits technology can bring to everyday life. Communicating our research will ensure we have an engaged generation who can see the benefits of technology in the study of the natural world. In particular, potential students will see the range of careers available to science students and the links between Biological and Computer Science. They will also see that studying computer science need not result in a career in systems administration, but can involve navigating flying robots and helping to understand the brain.

The team: The PDRA will gain useful transferable skills and expertise in 3D depth map reconstruction and insect behaviour.

Summary: By disseminating our work through high-impact publications and international robotics and biology conferences, and by hosting specialist workshops, we will impact both UK and international groups.

Publications

 
Description We have developed a set of methods for reconstructing the visual input experienced by insects as they forage for food. These methods have been used to gather data sets from the natural habitats of ants during behavioural experiments, so we can interpret insect behaviour. In collaboration with Barbara Webb and Michael Mangan (Edinburgh University) and Wolfgang Stürzl (DLR, Munich), we have used these methods to reconstruct the entire visual history of several ants over the whole course of their lives outside the nest.
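
As an illustration of one step in this kind of pipeline, the sketch below degrades a high-resolution panoramic image towards the coarse, low-resolution view available to an insect eye. This is a minimal sketch assuming OpenCV and NumPy; the resolution and blur parameters are illustrative assumptions, not the project's actual values.

```python
import cv2
import numpy as np

def insect_eye_view(panorama_bgr, out_size=(90, 25)):
    """Blur and downsample a panoramic image to a coarse grey-scale view.

    out_size is (width, height); the values here are invented for
    illustration, loosely mimicking the low resolution of an ant eye.
    """
    grey = cv2.cvtColor(panorama_bgr, cv2.COLOR_BGR2GRAY)
    # Low-pass filter before downsampling to avoid aliasing; the kernel
    # size loosely stands in for the acceptance angle of an ommatidium.
    blurred = cv2.GaussianBlur(grey, (31, 31), 10)
    coarse = cv2.resize(blurred, out_size, interpolation=cv2.INTER_AREA)
    return coarse.astype(np.float32) / 255.0
```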



We have developed a novel insect-inspired algorithm for autonomous route navigation. This is the first complete model of visual route navigation in ants and a new approach to modelling visual homing and route navigation. Our algorithm robustly navigates routes through complex environments and shows many characteristics of the behaviour of navigating ants. In particular, it can also perform place search with the same mechanism and can explain paradigmatic experiments previously used as evidence for 'snapshot'-based models. It has been very well received by the insect-inspired navigation community.
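
To make the familiarity idea concrete, the following is a minimal sketch of this style of route following: views stored along a training route act as the route memory, and at each step the agent scans candidate headings and moves in the direction whose current view best matches a stored view. It deliberately simplifies the published model (which, for example, can use a trained network rather than raw view storage), and `world.render` is a hypothetical stand-in for whatever produces a view at a given pose.

```python
import numpy as np

def view_difference(view_a, view_b):
    # Root-mean-square pixel difference between two equal-sized views.
    return np.sqrt(np.mean((view_a - view_b) ** 2))

def familiarity(view, route_memory):
    # Familiarity = negated best match against all views stored along
    # the training route (a raw view store is used here for brevity).
    return -min(view_difference(view, m) for m in route_memory)

def choose_heading(world, position, route_memory, n_headings=36):
    """Scan candidate headings; return the most familiar one (radians)."""
    best_heading, best_fam = 0.0, -np.inf
    for h in np.linspace(0.0, 2 * np.pi, n_headings, endpoint=False):
        # world.render(...) is a hypothetical stand-in for whatever
        # generates the agent's view at a pose (e.g. the WP1 world model).
        view = world.render(position, heading=h)
        fam = familiarity(view, route_memory)
        if fam > best_fam:
            best_heading, best_fam = h, fam
    return best_heading
```

Because the agent only ever asks "which direction looks most familiar?", it never needs to localise itself within a map, which is what keeps the mechanism parsimonious.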



We have analysed a novel innate behaviour: visual scanning in ants. This behaviour prompted our navigation model; in turn, we have been able to use the model and the visual input reconstruction methods to analyse its behavioural consequences in the field.



We have undertaken a detailed analysis of the learning flights of bumblebees. We found that learning flights are composed of nest-centric loops, while return flights are zigzags. We showed that these different manoeuvres are variants of one another and that the key point of similarity between the two is when the bee both faces and flies towards the nest. This ties in with our familiarity-based model of place homing. We also showed how these nest-centric manoeuvres are tied to geocentric information.



We have started to analyse learning walks in ants using both the familiarity model and the visual input reconstruction methods, through a partly EPSRC-funded doctoral student. The main finding is that learning walks should be partly tailored to certain distant features of the world, but should also have some general features. The next stage of this work is to tie our simulation results to the learning walks we have recorded in different species of ants in different visual environments.
Exploitation Route The navigation algorithms are of potential use in autonomous robotic engineering, especially in agriculture and, in particular, in the navigation of UAVs in GPS-denied environments. We are currently exploiting these avenues through collaborations with RAL Space and Harper Adams University, via a joint project funded by the Newton Agritech fund.

There is also potential for the algorithms to be used as navigational aids for visually impaired people.


Our algorithms will be used for schools outreach, widening participation in particular, and public engagement.

The main outcome of our work is to add to the body of knowledge on visual learning and memory:


1) It adds to our knowledge of what is encoded in a visual memory. Our group has a PhD student who will now try to tie this to recordings from the insect eye, to see which visual features are actually encoded.


2) The behaviour is an example of active learning and, in particular, adds to our knowledge of how innate behaviours can scaffold learning.


3) The scanning behaviour will give us a measure of the ant's uncertainty, potentially allowing us to investigate the Bayesian combination of navigation cues (see the sketch after this list).


4) Our image reconstruction methods and navigation algorithms have also been used, both by ourselves and by other groups, to interpret the behaviour of foraging ants, thus giving us a window onto the algorithms they might use to navigate.
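
As a hedged illustration of the cue-combination idea mentioned in point 3, the sketch below applies the standard inverse-variance (Bayesian/maximum-likelihood) weighting rule to two heading estimates. It treats headings as approximately linear quantities, which only holds for small angular differences, and all numbers are invented for illustration; this is not the project's method, just the textbook rule it would build on.

```python
def combine_cues(mu1, var1, mu2, var2):
    """Inverse-variance weighting of two independent heading estimates.

    Returns the combined mean and variance; the more certain cue
    (smaller variance) dominates the combined estimate.
    """
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    mu = w1 * mu1 + (1.0 - w1) * mu2
    var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return mu, var

# Invented numbers: a confident path-integration estimate (0.10 rad)
# and an uncertain visual estimate (0.60 rad). If scanning signals
# high visual uncertainty, var2 grows and the combined heading leans
# towards path integration.
print(combine_cues(0.10, 0.02, 0.60, 0.20))  # -> (approx. 0.145, 0.018)
```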

We have published multiple papers on this work and are planning to use the methods developed as the basis of grant applications (ERC and HFSP applications were submitted, but these schemes are extremely competitive and we were unsuccessful). An EPSRC programme grant application has made it through the first round and is due to be submitted on 18 March 2016. The work has also prompted new collaborations with the University of Oklahoma, RAL Space and Harper Adams University, as well as strengthening ties with Edinburgh University.

The other major exploitation route is in engineering and robotics. Our navigation algorithms can be used for autonomous robotic navigation, and for UAVs in particular. We are testing our algorithm in a variety of natural habitats and contexts, and on autonomous robots. For instance:



(1) We are developing methods by which a flying agent can navigate based on the view of the ground. We are planning follow-up grant applications (EPSRC) in the next 12 months.

(2) We have a prototype of the algorithm that can be run as a mobile phone app, which will allow people to recapitulate a path taken by others.

(3) Our familiarity-based navigation algorithm could also be applied in a more abstract informational context.


Our algorithms are very simple and appealing to the public, and we are using them as the basis of demonstrations for outreach.
Sectors Aerospace, Defence and Marine; Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software); Transport; Other

URL http://www.sussex.ac.uk/lifesci/insectnavigation
 
Description The grant has had significant academic impact, with multiple articles, citations and over 10 (and counting) presentations at international conferences. In 2014 I was contacted by researchers at RAL Space and Harper Adams University interested in using the algorithms developed in these grants in space exploration and agricultural robots respectively. This has resulted in a joint project funded by the Newton Agritech fund. Beneficiaries: biologists and biomimetic engineers. Contribution method: through publications and presentations.

We have given outreach sessions for schools based on this research, aimed at Widening Participation in particular. We have done ~15 sessions in total, reaching 300 students. Beneficiaries: school children. Contribution method: the research was the basis for the demo sessions we have done.
First Year Of Impact 2012
Sector Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software); Other
Impact Types Societal

 
Description Newton Agritech fund
Amount £70,000 (GBP)
Organisation Science and Technology Facilities Council (STFC) 
Sector Public
Country United Kingdom
Start 02/2015 
End 03/2016
 
Description Programme grant
Amount £4,816,675 (GBP)
Funding ID EP/P006094/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 12/2016 
End 12/2021
 
Description Seedcorn Research Funding, University of Sussex
Amount £2,500 (GBP)
Organisation University of Sussex 
Sector Academic/University
Country United Kingdom
Start 06/2011 
End 05/2012
 
Description Collaboration with Edinburgh University 
Organisation University of Edinburgh
Country United Kingdom 
Sector Academic/University 
PI Contribution We have several joint publications and have given talks at Edinburgh. We have provided expertise in navigation algorithms and robotics, as well as simulation code for their publications.
Collaborator Contribution We have several joint publications and they have given talks at Sussex. They have given us access to field sites, enabling us to gather new data, and have trialled our algorithms. We have gained expertise in UV-filter-based vision.
Impact Multi-disciplinary collaboration between Computer Science and Biology. Several publications:

Cheung, A., Collett, M., Collett, T. S., Dewar, A., Dyer, F., Graham, P., Mangan, M., Narendra, A., Philippides, A., Stürzl, W., Webb, B., Wystrach, A. & Zeil, J. (2014). Still no convincing evidence for cognitive map use by honeybees. PNAS, 111(42), E4396-E4397.

Wystrach, A., Philippides, A., Aurejac, A., Cheng, K. & Graham, P. (2014). Visual scanning behaviours and their role in the navigation of the Australian desert ant Melophorus bagoti. J Comp Physiol A, 1-12.

Wystrach, A., Mangan, M., Philippides, A. & Graham, P. (2013). Snapshots in ants? New interpretations of paradigmatic experiments. J Exp Biol, 216, 1766-1770.
Start Year 2011
 
Description Collaboration with Harper Adams on autonomous agricultural robot navigation 
Organisation Harper Adams University
Country United Kingdom 
Sector Academic/University 
PI Contribution We are partners on a Newton Agritech Fund project and have collaborated on other funding bids; in these, my part is the insect-inspired navigation algorithms.
Collaborator Contribution We are partners on a Newton Agritech Fund project and have collaborated on other funding bids, in which they handle GPS-based navigation and Agri-tech applications of our algorithms. They also wrote a letter of support for a RAEng/Leverhulme fellowship (unsuccessful).
Impact Joint Newton Agritech funded project. Multi-disciplinary: engineering, bio-inspired robotics, Agri-tech
Start Year 2014
 
Description Collaboration with Prof Doug Gaffin 
Organisation University of Oklahoma
Country United States 
Sector Academic/University 
PI Contribution We hosted Prof Doug Gaffin as a visiting professor from January to May 2013, providing expertise in insect-inspired navigation algorithms.
Collaborator Contribution Prof Doug Gaffin came to be a visiting professor with us from January to May 2013. We received £2,000 in bench fees, plus four months of his work free of charge (estimated at £2,000/month); more importantly, we have one publication submitted and are planning others. He has also introduced us to engineers at the University of Oklahoma interested in exploiting the algorithms.
Impact One paper submitted; bench fees for Sussex.
Start Year 2013
 
Description Collaboration with RAL Space on autonomous robotic for space exploration and AgriTech 
Organisation Rutherford Appleton Laboratory
Department Space Science and Technology Department
Country United Kingdom 
Sector Academic/University 
PI Contribution We are partners on a Newton Agritech Fund project and have collaborated on other, unsuccessful funding bids (ECHORD++, NSTP, Newton). In all the bids, our part is the insect-inspired visual navigation algorithms.
Collaborator Contribution We are partners on a Newton Agritech Fund project (which they are leading) and have collaborated on other, unsuccessful funding bids. They provide the robotic platforms and have helped me a great deal with robotics. They part-funded a trip to China as part of the Newton project and have loaned me a robotic platform to test algorithms. In addition, they wrote a letter of support for a RAEng/Leverhulme fellowship (unsuccessful).
Impact Newton Agritech funding. Multi-disciplinary collaboration involving Engineering, specifically Robotics, with Computational Biology and Artificial Life. It will be applied in Agri-tech and space exploration
Start Year 2013
 
Description Schools Outreach and science festivals 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Schools
Results and Impact I have undertaken a variety of schools outreach events based on this work, including some talks but mainly hands-on demonstrations ("Can you navigate like an ant?") and robotic demonstrations, showcasing interdisciplinary research to school children and focusing on Widening Participation in particular. I have also participated in STEM events such as Big Bang, as well as three exhibits at the Brighton Science Festival.

People seem to be very interested in the cross-disciplinary nature of the work; we have been asked back to many of the schools (e.g. Dorothy Stringer in Brighton, Portslade Academy) and, as Admissions Tutor, I have seen applications from these schools, though a causal link is difficult to establish.
Year(s) Of Engagement Activity 2011,2012,2013,2014,2015,2016
URL http://www.sussex.ac.uk/lifesci/insectnavigation/