ActiveAI - active learning and selective attention for robust, transparent and efficient AI

Lead Research Organisation: University of Sussex
Department Name: Sch of Engineering and Informatics

Abstract

We will bring together world leaders in insect biology and neuroscience with world leaders in biorobotic modelling and computational neuroscience to create a partnership that will be transformative in understanding active learning and selective attention in insects, robots and autonomous systems in artificial intelligence (AI). By considering how brains, behaviours and the environment interact during natural animal behaviour, we will develop new algorithms and methods for rapid, robust and efficient learning for autonomous robotics and AI for dynamic real world applications.

Recent advances in AI, and notably in deep learning, have proven incredibly successful in creating solutions to specific complex problems (e.g. beating the best human players at Go, and driving cars through cities). But as we learn more about these approaches, their limitations are becoming more apparent. For instance, deep learning solutions typically need a great deal of computing power, extremely long training times and very large amounts of labelled training data, which are simply not available for many tasks. While they are very good at solving specific tasks, they can be quite poor (and unpredictably so) at transferring this knowledge to other, closely related tasks. Finally, scientists and engineers are struggling to understand what their deep learning systems have learned and how well they have learned it.

These limitations are particularly apparent when contrasted to the naturally evolved intelligence of insects. Insects certainly cannot play Go or drive cars, but they are incredibly good at doing what they have evolved to do. For instance, unlike any current AI system, ants learn how to forage effectively with limited computing power provided by their tiny brains and minimal exploration of their world. We argue this difference comes about because natural intelligence is a property of closed loop brain-body-environment interactions. Evolved innate behaviours in concert with specialised sensors and neural circuits extract and encode task-relevant information with maximal efficiency, aided by mechanisms of selective attention that focus learning on task-relevant features. This focus on behaving embodied agents is under-represented in present AI technology but offers solutions to the issues raised above, which can be realised by pursuing research in AI in its original definition: a description and emulation of biological learning and intelligence that both replicates animals' capabilities and sheds light on the biological basis of intelligence.

This endeavour entails studying the workings of the brain in behaving animals, as it is crucial to know how neural activity interacts with, and is shaped by, environment, body and behaviour, and how this interplays with selective attention. These experiments are now possible thanks to recent advances in neural recordings of flies and hoverflies, which can identify neural markers of selective attention, combined with virtual reality experiments for ants; techniques pioneered by the Australian team. Together with verification of emerging hypotheses in large-scale neural models on-board robotic platforms in the real world, an approach pioneered by the UK team, this project represents a unique and timely opportunity to transform our understanding of learning in animals and, through this, learning in robots and AI systems.

We will create an interdisciplinary collaborative research environment with a "virtuous cycle" of experiments, analysis and computational and robotic modelling. New findings feed forward and back around this virtuous cycle, each discipline informing the others to yield a functional understanding of how active learning and selective attention enable small-brained insects to learn a complex world. Through this understanding, we will develop ActiveAI algorithms which are efficient in learning and final network configuration, robust to real-world conditions and learn rapidly.

Planned Impact

We will combine expertise in insect neuroscience with biomimetic robotic control to gain a functional understanding of how active learning and selective attention underpin rapid and efficient visual learning. Through this, we will develop ActiveAI algorithms, reinforcement learning methods and artificial neural network (ANN) architectures for robotics and AI that learn rapidly, are computationally efficient, work with limited training data and are robust to novel and changing scenarios.

Industrial impact
Our novel sensing, learning and processing algorithms offer impact in robotics and autonomous systems (RAS) and in AI generally. AI problems are currently solved by increasing computational and training resources. We take a fundamentally different approach, using insects as inspiration for efficient algorithms. Here we will develop smart movement patterns which combine with attentional mechanisms to aid the identification and extraction of information, reducing computational and training loads. We foresee two immediate problem domains in RAS: those where learning speed is highly constrained (e.g. disaster recovery robots, exploration, agri-robotics); and those where computational load and energy usage are limited (e.g. UAVs, agritech, space robotics). Longer term, we foresee applications in general AI, where a new class of highly efficient and thus scalable ANNs is required to realise grand challenges such as General Intelligence.

We will ensure tight coupling to industrial needs using established industrial members of the Brains on Board advisory board, comprising Dyson, Parrot, NVidia and Google DeepMind, as well as collaborators for robotic applications (Harper Adams University, GMV and RALSpace) and Sheffield Robotics contacts (e.g. Amazon, iniVation, Machine With Vision), and by leveraging new opportunities through both UK and Australian universities' commercialisation operations (Macquarie University's Office of Commercialisation and Innovation Hub; Sussex Research Quality and Impact team, Sussex Innovation Centre; Sheffield Engineering Hub, Sheffield Partnerships and Knowledge Exchange team). Where possible, we will seek to commercialise knowledge through IP licensing and university-supported spin-outs. We already have experience doing so, in particular: optic flow commercialisation through ApisBrain (Marshall); and sensor commercialisation through Skyline Sensors (Mangan).

Impact on the team
PDRAs will receive cross-disciplinary training from the UK team in GPU computing, neural simulations, biorobotics and bio-inspired machine learning - very active and rapidly expanding areas and sought-after skills in AI and robotics - as well as training from the Australian team in cutting-edge neuroscientific methods (electrophysiology and pharmacology combined with virtual reality-enabled behavioural experiments). This will prepare them for careers across academia and industry. In addition, UK co-I Mangan, as an early career researcher, will benefit from support and advice from the senior investigators (UK + Aus), supporting his development as an independent researcher.

Advocacy + general public
We firmly believe in the benefit of ethically aware technology development through responsible innovation. We have already created an ethical code of conduct for the Brains on Board project and engaged with government consultations. We will extend this work and, by promoting and adhering to this philosophy, we will have impact on policy through advocacy and on the general public, through continuation of our extensive public engagement activities, e.g. regular public lectures (Cafe Scientifique, Nerd Nite, U3A etc.), media appearances (BBC, ABC Radio, BBC South East) and large outreach events (e.g. Royal Society Science Exhibition 2010, British Science Festival 2017, Brighton Science Festivals 2010-2018).

Academic Impact
We will impact AI, Robotics and neuroscience (see Academic Beneficiaries).
 
Description Through this award we have established a collaboration between leading insect neurobiologists in Australia and experts in biological modelling and bio-inspired AI in the UK. The collaboration has allowed us to learn more about how insects, despite their tiny brains, learn visual tasks so rapidly and robustly. For instance, we have shown how hoverflies are able to detect targets in cluttered environments, and how tasks previously thought to require cognition, such as counting in bees, can be achieved through simpler mechanisms when the problem is considered as an active task. We have also developed bio-inspired models of the brain which replicate phenomena we observe in insect brains and behaviour, the latter tracked with newly developed tools. Through our spiking neural network tools, these models are being used on-board robots for efficient and accurate visual learning.

In more detail, the outputs break down into three major sections:
1. Insect intelligence: Our aim was to deepen our understanding of insect behavioural experiments through modelling. Notable findings include:

a) In a collaboration between Sussex and Flinders (Descending neurons of the hoverfly respond to pursuits of artificial targets, Current Biology, 2023), we analysed electrophysiological responses of target-selective descending neurons in the hoverfly and demonstrated how the responses to simulated target pursuits could be partially predicted from prior analytical measurements of the neurons' receptive fields and their direction- and speed-selectivity.

b) In computational studies of the fruit fly brain, we found that memory performance could be enhanced through homeostatic mechanisms, while another study showed that a range of experimental findings could be explained by a reward prediction error hypothesis (a minimal sketch of such an update rule follows this list).

c) In a collaboration between Sheffield and Macquarie, we found that non-numerical strategies can be used by bees to solve numerical cognition tasks, and that honeybees solve a multi-comparison ranking task by probability matching.

d) We have mapped the entire foraging history of individual desert ants, which has established how quickly they learn routes, raised questions about how they learn about their environment, and identified physical movement patterns that can underpin their visual search strategies. The dataset has been made available to the community as a resource for active learning (https://cater.cvmls.org). The data are also published with a 2D environmental reconstruction, so that behaviour can be reanalysed with reference to the environment and to other animals, which is not possible with standard data-logging techniques.

e) In collaborations with scientists in the UK (a re-focussing of early experiments due to Covid), we found that hymenopteran visual learning involves multimodal interactions, aversive traces and (in desert ants) even their own body size.
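As context for finding (b) above, the reward prediction error idea can be illustrated with a minimal Rescorla-Wagner-style update rule, in which a value estimate is nudged by the difference between received and predicted reward. This is a generic textbook sketch for illustration only, not the model from our fruit fly study; the function and parameter names are ours.

import numpy as np

def train_rpe(rewards, alpha=0.1):
    """Track expected reward with a reward prediction error (RPE) rule.

    v     : current prediction of reward
    delta : RPE, the mismatch between received and predicted reward
    alpha : learning rate controlling how fast the prediction adapts
    """
    v = 0.0
    history = []
    for r in rewards:
        delta = r - v          # reward prediction error
        v += alpha * delta     # move the prediction towards the reward
        history.append(v)
    return np.array(history)

# A stimulus rewarded on 80% of trials: the prediction converges to ~0.8.
rng = np.random.default_rng(0)
rewards = (rng.random(200) < 0.8).astype(float)
print(train_rpe(rewards)[-1])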

2. Efficient and powerful bio-inspired neural models: We have made significant advances in bio-inspired neural models in two key areas:

a) Multiple high-profile papers have shown that multi-scale echo-state models can produce complex behaviour replicating neural dynamics. These models have been applied to visual navigation, a foundational problem for both insects and robots, demonstrating that temporal information can be exploited for this problem; this makes the approach suitable for robots and hints at cues that may be useful for ant navigation (a minimal echo-state sketch follows this list).

b) GeNN, our toolbox for running spiking neural network models on GPUs, has been demonstrated in workshops to both our collaborators and other groups, and we have used it to embody models of insect visual learning on-board small robots, demonstrating GeNN's utility for testing hypotheses on insect learning (a minimal spiking-neuron sketch also follows this list). In this vein, we are exploring biomimetic algorithms for small target detection in close collaboration with Flinders, where our collaborators test hypotheses in virtual reality experiments with hoverflies.
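To make the echo-state idea in (a) concrete, the following sketch implements a generic echo-state network: a fixed random recurrent reservoir whose leaky state integrates an input stream, with only a linear readout trained (here by ridge regression). This is a textbook formulation written for this report, not the code behind our EchoVPR models, and all parameter values are arbitrary.

import numpy as np

rng = np.random.default_rng(1)
n_in, n_res, leak = 8, 200, 0.3

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))       # fixed random recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

def run_reservoir(inputs):
    """Collect leaky reservoir states for a sequence of input vectors."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout, via ridge regression, on random demo data.
inputs = rng.normal(size=(500, n_in))
targets = rng.normal(size=(500, 2))
X = run_reservoir(inputs)
W_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(n_res), X.T @ targets)
predictions = X @ W_out

Because only W_out is trained, learning reduces to a single linear solve, which is part of what makes such models attractive for rapid learning on robots.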
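The models in (b) are built from spiking neurons of the kind GeNN simulates in parallel on GPUs. As a hedged illustration of what is being accelerated (plain NumPy, not GeNN's API), the sketch below integrates a small population of leaky integrate-and-fire neurons; all constants are arbitrary demonstration values.

import numpy as np

dt, tau = 0.1, 10.0                               # time step and membrane constant (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -70.0   # potentials (mV)
n_neurons, n_steps = 100, 1000

rng = np.random.default_rng(2)
v = np.full(n_neurons, v_rest)
spikes = np.zeros((n_steps, n_neurons), dtype=bool)

for t in range(n_steps):
    i_ext = rng.normal(1.8, 0.5, n_neurons)         # noisy input current
    v += dt / tau * (-(v - v_rest) + 20.0 * i_ext)  # leaky integration
    fired = v >= v_thresh
    v[fired] = v_reset                              # reset after a spike
    spikes[t] = fired

print("mean firing rate (Hz):", spikes.mean() / (dt * 1e-3))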

3. We have developed open-source tools for tracking insects in the field, reconstructing their visual input and passing it through an arbitrary eye model:

a) CATER (Combined Animal Tracking and Environment Reconstruction) is a general software tool that allows moving animals to be tracked in video data even when they are small and occluded (not possible with other methods) (https://cater.cvmls.org). CATER has been used at Sussex to map ant trajectories and is currently being used by researchers at Macquarie (Dr Cody Freas) to capture Australian ant data. We organised a workshop to demonstrate this tool and to bring in others working on similar problems (see the first sketch after this list).

b) The CompoundRay rendering pipeline was developed to model insect vision at high fidelity and high frame rate. The tool has been made open-source and shown capable of reconstructing realistic 3D eye structures and rendering outputs at 5,000 frames per second, enabling a new generation of data-driven investigations not possible with hardware or slower models (https://github.com/BrainsOnBoard/compound-ray; see the second sketch after this list). This work has led to a collaboration with researchers at Lund and Stockholm Universities to replicate real insect eye data. Researchers at Sussex and Sheffield are working to integrate CompoundRay into emerging industry-standard pipelines.
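The first sketch, referenced in (a), illustrates the core intuition behind tracking a small animal against a reconstructed background: a static "environment" image is estimated as the per-pixel median over frames, and the animal is located in each frame as the peak deviation from that background. This is a heavily simplified illustration written for this report, not CATER's actual algorithm; OpenCV is assumed to be installed.

import cv2
import numpy as np

def track_with_background(video_path, sample_every=10):
    """Estimate a static background by per-pixel median, then locate a
    moving animal in each frame as the peak deviation from it."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()
    if not frames:
        raise ValueError("no frames read from " + video_path)

    # Environment reconstruction: median over a subsample of frames.
    background = np.median(np.stack(frames[::sample_every]), axis=0)

    positions = []
    for gray in frames:
        diff = cv2.GaussianBlur(np.abs(gray - background), (9, 9), 0)
        y, x = np.unravel_index(np.argmax(diff), diff.shape)
        positions.append((x, y))  # peak deviation = candidate animal position
    return background, positions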
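The second sketch, referenced in (b), shows the geometry CompoundRay models: each ommatidium has its own viewing direction, and the eye's "image" is produced by sampling the scene once per ommatidium. Here we sample an equirectangular panorama along directions arranged on a sphere; this is an illustrative approximation written for this report, not CompoundRay code (which uses GPU ray tracing against full 3D environments).

import numpy as np

def fibonacci_sphere(n):
    """Roughly even ommatidial viewing directions on a unit sphere."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i        # golden-angle spiral
    z = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def sample_panorama(panorama, directions):
    """One intensity sample per ommatidium from an equirectangular image."""
    h, w = panorama.shape[:2]
    azimuth = np.arctan2(directions[:, 1], directions[:, 0])  # [-pi, pi]
    elevation = np.arcsin(directions[:, 2])                   # [-pi/2, pi/2]
    u = ((azimuth + np.pi) / (2 * np.pi) * (w - 1)).astype(int)
    v = ((np.pi / 2 - elevation) / np.pi * (h - 1)).astype(int)
    return panorama[v, u]

panorama = np.random.default_rng(3).random((180, 360))  # stand-in scene
eye_view = sample_panorama(panorama, fibonacci_sphere(1000))
print(eye_view.shape)  # one sample per ommatidium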
Exploitation Route Neuroscientists and others will use the results on insect learning to better understand small-brained cognition. Likewise, the bio-inspired AI algorithms will be useful to engineers studying learning in problems with a time-varying component or where computing resources are limited. The latter findings will be of use in industrial contexts, particularly in navigation through unstructured or dynamic environments (e.g. in AgriTech, space, security, search and rescue) and for use on edge devices.

The most notable outlet for industrial impact is via Opteran. Opteran is a spin-out of the University of Sheffield using insect brain-derived algorithms to solve challenging problems in the control of robots. The company has already raised ~£12M in private investment, employs around 50 people between its offices in Sheffield and London, and continues to grow. The company is an ideal vehicle for many of the early insights developed in this research to inspire new approaches that can be deployed in commercial robots. Moreover, the company presents a destination for staff looking to move into industry, with examples including James Marshall, the Sheffield PI, who is a founder of Opteran and seconded there full-time; Blayze Millward, who worked part-time for 6 months at Opteran during his PhD studies; and Dr Mike Mangan, who has successfully received a £1.3M Future Leaders Fellowship which will be hosted in the company. Future avenues of collaboration being explored include Opteran supporting research grants and PhD CASE studentships.
Sectors Aerospace, Defence and Marine; Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software); Transport; Other

URL https://github.com/BrainsOnBoard/compound-ray
 
Description Work from the grant has led to a successful FLF bid led by M. Mangan and hosted by Opteran Technologies. Through this grant, Opteran will commercialise brain-inspired models for robotics. Opteran has also funded consultancy for A. Philippides on navigation algorithms, and we are discussing commercialisation opportunities. In addition, we note that the project has had an excellent impact on the research fellows and research team. Two RFs have faculty positions at Sussex, one more is a postdoc and the other is in industry. The research team has seen two senior moves to Opteran Technologies Limited.
First Year Of Impact 2022
Sector Digital/Communication/Information Technologies (including Software), Electronics, Other
Impact Types Economic

 
Description 3B: brains beat brawn
Amount £1,287,730 (GBP)
Funding ID 900305 
Organisation United Kingdom Research and Innovation 
Sector Public
Country United Kingdom
Start 03/2024 
End 02/2028
 
Description Efficient spike-based machine learning on existing HPC hardware
Amount £17,215 (GBP)
Funding ID CPQ-2417168 
Organisation Oracle Corporation 
Sector Private
Country United States
Start 03/2022 
End 04/2023
 
Description Emergent embodied cognition in shallow, biological and artificial, neural networks
Amount £200,036 (GBP)
Funding ID BB/X01343X/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 02/2023 
End 08/2024
 
Description Leverhulme Doctoral Scholarships in "be.AI - biomimetic embodied Artificial Intelligence"
Amount £1,350,000 (GBP)
Funding ID DS-2020-065 
Organisation The Leverhulme Trust 
Sector Charity/Non Profit
Country United Kingdom
Start 08/2021 
End 08/2027
 
Description Using Data Driven Artificial Intelligence to Reveal Pesticide Induced Changes in Pollinator Behaviour
Amount £311,449 (GBP)
Organisation University of Sheffield 
Sector Academic/University
Country United Kingdom
Start 02/2024 
End 12/2026
 
Title CATER: Combined Animal Tracking and Environment Reconstruction 
Description Video analysis software for tracking animals and reconstructing their environment, allowing analysis at high precision and high temporal resolution.
Type Of Material Improvements to research infrastructure 
Year Produced 2023 
Provided To Others? Yes  
Impact The method has already been adopted by a number of other research labs, and featured in at least 1 other paper at the time of writing. 
URL https://www.science.org/doi/10.1126/sciadv.adg2094
 
Title New tool to rapidly and accurately reconstruct compound vision systems 
Description New tool to rapidly and accurately reconstruct compound vision systems. The tool uses modern ray-tracing graphics technologies to produce entirely new levels of accuracy. The tool is open-sourced via GitHub.
Type Of Material Technology assay or reagent 
Year Produced 2022 
Provided To Others? Yes  
Impact tba 
URL https://github.com/BrainsOnBoard/compound-ray
 
Title Data for paper: Wood ants learn the magnetic direction of a route but express uncertainty because of competing directional cues 
Description Data for the paper published in the Journal of Experimental Biology, July 2022. The data for each ant in the experiments described in all but Figure 7 are held in MATLAB files named as follows: AntU_LN22WESTtest_1522_31072019_Published.mat. The data for the experiments with two triangles are held in the zip file TrianglesData.zip, which has individual files in the same format as above. There is a detailed description of the variables in the file ants_magnets_philippides_dataset_description.pdf. Abstract: Wood ants were trained indoors to follow a magnetically specified route that went from the centre of an arena to a drop of sucrose at the edge. The arena, placed in a white cylinder, was in the centre of a 3D coil system generating an inclined Earth-strength magnetic field in any horizontal direction. The specified direction was rotated between each trial. The ants' knowledge of the route was tested in trials without food. Tests given early in the day, before any training, show that ants remember the magnetic route direction overnight. During the first 2 seconds of a test, ants mostly faced in the specified direction, but thereafter were often misdirected, with a tendency to face briefly in the opposite direction. Uncertainty about the correct path to take may stem in part from competing directional cues linked to the room. In addition to facing along the route, there is evidence that ants develop magnetically directed home and food vectors dependent upon path integration. A second experiment asked whether ants can use magnetic information contextually. In contrast to honeybees given a similar task, ants failed this test. Overall, we conclude that magnetic directional cues can be sufficient for route learning.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact the data supported the results in the paper 
URL https://sussex.figshare.com/articles/dataset/Data_for_paper_Wood_ants_learn_the_magnetic_direction_o...
 
Title Dataset for paper "mlGeNN: Accelerating SNN inference using GPU-Enabled Neural Networks" 
Description Dataset for paper accepted in IOP Neuromorphic Computing and Engineering, March 2022. The dataset contains trained weights from TensorFlow 2.4.0 for the following models: vgg16_imagenet_tf_weights.h5 (VGG-16 trained on the ImageNet ILSVRC dataset); vgg16_tf_weights.h5 (VGG-16 trained on the CIFAR-10 dataset); resnet20_cifar10_tf_weights.h5 (ResNet-20 trained on CIFAR-10); resnet34_imagenet_tf_weights.h5 (ResNet-34 trained on ImageNet ILSVRC). Abstract: "In this paper we present mlGeNN - a Python library for the conversion of artificial neural networks (ANNs) specified in Keras to spiking neural networks (SNNs). SNNs are simulated using GeNN with extensions to efficiently support convolutional connectivity and batching. We evaluate converted SNNs on CIFAR-10 and ImageNet classification tasks and compare the performance to both the original ANNs and other SNN simulators. We find that performing inference using a VGG-16 model, trained on the CIFAR-10 dataset, is 2.5x faster than BindsNet and, when using a ResNet-20 model trained on CIFAR-10 with FewSpike ANN to SNN conversion, mlGeNN is only a little over 2x slower than TensorFlow." Funding: Brains on Board grant number EP/P006094/1; ActiveAI grant number EP/S030964/1; Unlocking spiking neural networks for machine learning research grant number EP/V052241/1; European Union's Horizon 2020 research and innovation program under Grant Agreement 945539.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact the data supported the results in the paper 
URL https://sussex.figshare.com/articles/dataset/Dataset_for_paper_mlGeNN_Accelerating_SNN_inference_usi...
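For intuition about the ANN-to-SNN conversion this dataset supports, the following sketch shows the simplest rate-based scheme: a trained ReLU layer is replaced by integrate-and-fire neurons with subtractive reset, whose spike rates over a time window approximate the ReLU activations. This is a generic illustration written for this report, not mlGeNN's implementation (which also supports the more sophisticated FewSpike conversion).

import numpy as np

rng = np.random.default_rng(4)
w = rng.normal(0.0, 0.5, (10, 4))   # weights of a trained dense ReLU layer
x = rng.random(10)                  # layer input

relu_out = np.maximum(0.0, x @ w)   # the ANN activations to approximate

# SNN equivalent: constant input current into integrate-and-fire neurons;
# subtractive reset makes the firing rate proportional to the current.
dt, steps, thresh = 0.01, 10_000, 1.0
i_in = x @ w
v = np.zeros(4)
spike_count = np.zeros(4)
for _ in range(steps):
    v += i_in * dt                  # integrate the input current
    fired = v >= thresh
    spike_count += fired
    v[fired] -= thresh              # subtractive reset preserves the rate code

rate_estimate = spike_count / (steps * dt) * thresh
print(np.round(relu_out, 3))
print(np.round(rate_estimate, 3))   # approaches relu_out as steps grows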
 
Title Desert Ant Ontogeny Dataset 
Description The entire foraging life of a desert ant documented in a series of videos, provided with the tracking software, the tracks, and an environment reconstruction.
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact New insights were made in the publication associated with the dataset that are driving new research questions. 
URL https://cater.cvmls.org/
 
Title Hoverfly (Eristalis tenax) descending neurons respond to pursuits of artificial targets 
Description Many animals use motion vision information to control dynamic behaviors. Predatory animals, for example, show an exquisite ability to detect rapidly moving prey followed by pursuit and capture. Such target detection is not only used by predators but can also play an important role in conspecific interactions. Male hoverflies (Eristalis tenax), for example, vigorously defend their territories against conspecific intruders. Visual target detection is believed to be subserved by specialized target-tuned neurons that are found in a range of species, including vertebrates and arthropods. However, how these target-tuned neurons respond to actual pursuit trajectories is currently not well understood. To redress this, we recorded extracellularly from target selective descending neurons (TSDNs) in male Eristalis tenax hoverflies. We show that the neurons have dorso-frontal receptive fields, with a preferred direction up and away from the visual midline, with a clear division into a TSDNLeft and a TSDNRight cluster. We next reconstructed visual flow-fields as experienced during pursuits of artificial targets (black beads). We recorded TSDN responses to six reconstructed pursuits and found that each neuron responded consistently at remarkably specific time points, but that these time points differed between neurons. We found that the observed spike probability was correlated with the spike probability predicted from each neuron's receptive field and size tuning. Interestingly, however, the overall response rate was low, with individual neurons responding to only a small part of each reconstructed pursuit. In contrast, the TSDNLeft and TSDNRight populations responded to substantially larger proportions of the pursuits, but with lower probability. This large variation between neurons could be useful if different neurons control different parts of the behavioral output. 
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact Current biology paper 
URL https://datadryad.org/stash/dataset/doi:10.5061/dryad.tdz08kq4d
 
Title Neuromorphic sequence learning with an event camera on routes through vegetation 
Description code and dataset for paper 'Neuromorphic sequence learning with an event camera on routes through vegetation'. 
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact Publication in Science Advances, garnering good press coverage and follow-on work 
URL https://zenodo.org/record/8289546
 
Title Research data for paper "Geosmin suppresses defensive behaviour and elicits unusual neural responses in honey bees" 
Description Research data for the paper published in Scientific Reports, March 2023. The data explore the effects of geosmin on honeybees and have three separate parts: 1. Behavioural data on the stinging response of bees towards a dummy in the presence of the alarm pheromone and geosmin; 2. Electro-antennogram (EAG) data of the response to geosmin on bees' antennae; 3. Calcium imaging data illustrating the neural response to geosmin in the honey bee antennal lobe. Data & File Overview File List: Bee_Aggression_Behaviour.csv: bees' stinging responses towards a rotating dummy; Bee_AL_Calcium_imaging.csv: calcium imaging time traces for individual odour stimuli and glomeruli in all bees; Bee_EAG_Data.csv: electroantennography response amplitudes for individual odour stimuli in all bees. Methodological Information: all methods described in: Scarano F, Deivarajan Suresh M, Tiraboschi E, Cabirol A, Nouvian M, Nowotny T, Haase A. Geosmin suppresses defensive behaviour and elicits unusual neural responses in honey bees. Sci Rep 13:3851 (2023). Data-specific Information for 'Bee_Aggression_Behaviour.csv': The data file contains observational data from an aggression assay, where 325 bee dyads were inserted into an arena with a rotating dummy. Data indicate whether a behaviour was observed or not from one or both bees. Further details in the Methods sections of Scarano et al. 2021. Number of variables/columns: 17. Number of cases/rows: 324. Variable List: A) Day: date on which bees were tested, DD/MM/YYYY; B) Weather: describes the weather on the trial date, Sunny / Cloudy; C) Hive: hive recognised by the paint colour from which the bees have been acquired, Orange / White / Green / Yellow; D) Dummy: which of two arenas was used in that behavioural trial, A / B; E) Group: represents the odours - VOCs and their mixtures with the respective concentrations, IAA / Geosmin6 / Geosmin3 / IAAGeo3 / IAAGeo6 / Control, concentrations: 3 (10^-3) & 6 (10^-6); F) Sting.Bee1: whether bee one exhibits stinging behaviour, 0 (no) / 1 (yes), binary; G) Sting.Bee2: whether bee two exhibits stinging behaviour, 0 (no) / 1 (yes), binary; H) Recruit.Bee1: whether bee one exhibits recruiting behaviour, 0 (no) / 1 (yes), binary; I) Recruit.Bee2: whether bee two exhibits recruiting behaviour, 0 (no) / 1 (yes), binary; J) Grooming/Calm: whether the bees exhibit grooming or calm behaviour, 0 (none) / 1 / 2, number of bees; K) Batch: the time batch in which the trials were conducted, 1 (Morning) / 2 (Afternoon); L) number.sting: number of bees stinging in a trial, 0/1/2; M) number.recruit: number of bees recruiting in a trial, 0/1/2; N) sting.first: whether the stinging behaviour was exhibited first in a trial, 0 (no) / 1 (yes), binary; O) recruit.first: whether the recruiting behaviour was exhibited first in a trial, 0 (no) / 1 (yes), binary; P) b.sting: whether any stinging behaviour was exhibited by the two bees during that trial, 0 (no) / 1 (yes), binary; Q) b.recruit: whether any recruiting behaviour was exhibited by the two bees during that trial, 0 (no) / 1 (yes), binary. Data-specific Information for 'Bee_EAG_Data.csv': Data are electroantennography responses for 24 antennae exposed to different odour stimuli. Data points are voltage change amplitudes with respect to the baseline, averaged over the 1s stimulus duration. Further details in the Methods sections of Scarano et al.
2021. Number of variables/columns: 16. Number of cases/rows: 24. Variable List: A) Bee: subject numbers; B-P) responses to different odours and concentrations. Values are average potential change (over 10 repetitions) in response to the presented odour stimuli, in units of Volt. Odours presented: Control = pure mineral oil (solvent); Geo = geosmin in mineral oil at the concentration indicated in brackets; IAA = isoamyl acetate in mineral oil at the concentration indicated in brackets. B) Control mineral oil; C) Geo (10^-6); D) Geo (10^-5); E) Geo (10^-4); F) Geo (10^-3); G) IAA (10^-3); H) IAA (10^-3) + Geo (10^-6); I) IAA (10^-3) + Geo (10^-5); J) IAA (10^-3) + Geo (10^-4); K) IAA (10^-3) + Geo (10^-3); L) IAA (10^-1); M) IAA (10^-1) + Geo (10^-6); N) IAA (10^-1) + Geo (10^-5); O) IAA (10^-1) + Geo (10^-4); P) IAA (10^-1) + Geo (10^-3). Data-specific Information for 'Bee_AL_Calcium_imaging.csv': The data file contains response curves from up to 19 glomeruli in the antennal lobe of 14 bees; data are relative changes in fluorescence averaged over the glomerular area. The fluorescence changes stem from projection neurons that were stained by backfill injection with the calcium-sensitive dye fura-dextran. Further details in the Methods sections of Scarano et al. 2021. Number of variables/columns: 95. Number of cases/rows: 1720. Variable List: A) Bee-id: subject number; B) Glo_id: glomerulus number following the bee antennal lobe standard atlas nomenclature for tract T1; C) Odour: stimulus odour type (Geo6 = geosmin at concentration 10^-6 in mineral oil; Geo3 = geosmin at concentration 10^-3 in mineral oil; IAA = isoamyl acetate at concentration 10^-1 in mineral oil; 3Hex = 3-hexanol at concentration 5x10^-3 in mineral oil; acetoph = acetophenone at concentration 5x10^-3 in mineral oil; non = 1-nonanol at concentration 5x10^-3 in mineral oil; IAA-Geo are mixtures of both odours); D) Response: automated response classification (1 activated, 2 background activity, 3 inhibited); E-CQ) Frame_1-91: glomerular response curves, 91 frames, 10.033 frames/s. Curves cover a 1s prestimulus interval (frames 1-30), 1s stimulus (frames 31-60), and 1s post-stimulus (frames 61-91). Values are fluorescence changes in percent, averaged over the glomerular area, background subtracted and normalised with respect to the background. Article Abstract: Geosmin is an odorant produced by bacteria in moist soil. It has been found to be extraordinarily relevant to some insects, but the reasons for this are not yet fully understood. Here we report the first tests of the effect of geosmin on honey bees. A stinging assay showed that the defensive behaviour elicited by the bee's alarm pheromone component isoamyl acetate (IAA) is strongly suppressed by geosmin. Surprisingly, the suppression is, however, only present at very low geosmin concentrations, and disappears at higher concentrations. We investigated the underlying mechanisms at the level of the olfactory receptor neurons by means of electroantennography, finding the responses to mixtures of geosmin and IAA to be lower than to pure IAA, suggesting an interaction of both compounds at the olfactory receptor level. Calcium imaging of the antennal lobe (AL) revealed that neuronal responses to geosmin decreased with increasing concentration, correlating well with the observed behaviour.
Computational modelling of odour transduction and coding in the AL suggests that a broader activation of olfactory receptor types by geosmin in combination with lateral inhibition could lead to the observed non-monotonic increasing-decreasing responses to geosmin and thus underlie the specificity of the behavioural response to low geosmin concentrations. Links to other resources relating to the data: Computational modelling software that aims to reproduce the Calcium and EAG data included here is available at https://github.com/tnowotny/bee_al_2021
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact the data supported the results in the paper 
URL https://sussex.figshare.com/articles/dataset/Research_data_for_paper_Geosmin_suppresses_defensive_be...
 
Title Stanmer Park outdoor navigational data 
Description This dataset contains omnidirectional 1440×1440 resolution images taken using a Kodak Pixpro SP360 camera, paired with RTK GPS information obtained using a simple RTK2B - 4G NTRIP kit and fused yaw, pitch and roll data recorded from a BNO055 IMU. The data were collected using a 4-wheel ground robot that was manually controlled by a human operator. The robot was driven 15 times along a route at Stanmer Park (shown in map.png). The route consists mostly of open fields and a narrow path through a forest and is approximately 700m long. The recordings took place on various days and times starting in March 2021, with the date and time indicated by the filename. For example, '20210420_135721.zip' corresponds to a route driven on 20/04/2021 starting at 13:57:21 GMT. During the recordings the weather varied from clear skies and sunny days to overcast and low-light conditions. Each recording consists of an mp4 video of the camera footage for the route, and a database_entries.csv file with the following columns: timestamp of video frame (in ms); X, Y and Z coordinates (in mm) and zone representing location in UTM coordinates from GPS; heading, pitch and roll (in degrees) from the IMU (in some early routes the IMU failed, and when this occurs these values are recorded as "NaN"); speed and steering angle commands being sent to the robot at that time; GPS quality (1=GPS, 2=DGNSS, 4=RTK Fixed and 5=RTK Float); X, Y and Z coordinates (in mm) fitted to a degree-one polynomial to smooth out GPS noise; heading (in degrees) derived from the smoothed GPS coordinates; IMU heading (in degrees) with discontinuities resulting from IMU issues fixed. For completeness, each folder also contains a database_entries_original.csv containing the data before pre-processing. The pre-processing is documented in more detail in pre_processing_notes.pdf.
Type Of Material Database/Collection of data 
Year Produced 2024 
Provided To Others? Yes  
Impact Conference paper in preparation 
URL https://sussex.figshare.com/articles/dataset/Stanmer_Park_outdoor_navigational_data/25118383
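As a usage illustration for this dataset (and the companion UoS campus dataset below), the sketch loads one recording's CSV and keeps only the highest-quality fixes. The column names are paraphrased guesses from the description above, not verified headers, so they will likely need adapting to the actual files.

import pandas as pd

# Hypothetical file path and column names based on the description above;
# check the real CSV header before use.
df = pd.read_csv("20210420_135721/database_entries.csv")
df.columns = [c.strip().lower() for c in df.columns]

# Keep only RTK Fixed solutions (GPS quality code 4) with a valid IMU heading.
good = df[(df["gps quality"] == 4) & df["heading"].notna()]
print(f"{len(good)}/{len(df)} rows with RTK-fixed GPS and valid IMU heading")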
 
Title UoS campus and Stanmer park outdoor navigational data 
Description This dataset contains omnidirectional 1440×1440 resolution images taken using a Kodak Pixpro SP360 camera, paired with RTK GPS information obtained using a simple RTK2B - 4G NTRIP kit and fused yaw, pitch and roll data recorded from a BNO055 IMU. The data were collected using a 4-wheel ground robot (SuperDroid IG42-SB4-T) that was manually controlled by a human operator. The robot was driven 10 times along a route on the University of Sussex campus (shown in campus.png) and 10 times at the adjacent Stanmer Park (shown in stanmer.png). The first route is a mix of urban structures (university buildings), small patches of trees and paths populated by people, and is approximately 700m long. The second route consists mostly of open fields and a narrow path through a forest, and is approximately 600m long. The recordings took place on various days and times starting in May 2023, with the date and time indicated by the filename. For example, 'campus_route5_2023_11_22_102925.zip' corresponds to the 5th route recorded on the Sussex campus on 22/11/2023 starting at 10:29:25 GMT. During the recordings the weather varied from clear skies and sunny days to overcast and low-light conditions. Each recording consists of the .jpg files that make up the route, and a .csv file with the following columns: X, Y and Z coordinates (in mm) and zone representing location in UTM coordinates from GPS; heading, pitch and roll (in degrees) from the IMU (in some early routes the IMU failed, and when this occurs these values are recorded as "NaN"); filename of the corresponding camera image; latitude (in decimal degrees north), longitude (in decimal degrees west) and altitude (in m) from GPS; GPS quality (1=GPS, 2=DGNSS, 4=RTK Fixed and 5=RTK Float); horizontal dilution (in mm); timestamp (in ms).
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact Conference paper in preparation 
URL https://sussex.figshare.com/articles/dataset/UoS_campus_and_Stanmer_park_outdoor_navigational_data/2...
 
Description Bruno van Swinderen 
Organisation University of Queensland
Country Australia 
Sector Academic/University 
PI Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and Postdocs when they are allowed. We are helping them with modelling and computational analysis of data.
Collaborator Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and Postdocs when they are allowed. They are providing us with results from insect experiments which will inform our robotic models
Impact the collaboration is multi-disciplinary: we do robotics and computational modelling, they do insect neuroscience
Start Year 2019
 
Description Collaboration project with Dr Cwyn Solvi (Macquarie University) to investigate the impact of brain size on learning speed in temporal learning tasks.
Organisation Macquarie University
Country Australia 
Sector Academic/University 
PI Contribution AO, EV, MM and LM developed computational models that supported data captured by the collaborators from bumblebees and hummingbirds. All contributed to a paper submitted to Science in 2021.
Collaborator Contribution Collaborators provided behavioural data related to learning speeds in two species of bee and hummingbirds. Wrote collaborative paper.
Impact Redrafting the paper following feedback from Science. Multidisciplinary research, with Sheffield providing the machine learning / computational neuroscience expertise and our Australian partners the behavioural and neuroscience expertise.
Start Year 2021
 
Description Collaboration with Dr Nicholas Szczecinski (West Virginia University) 
Organisation West Virginia University
Country United States 
Sector Academic/University 
PI Contribution I collaborated with Dr Szczecinski to propose and run a workshop at the 10th Anniversary Living Machines Conference. INVERTEBRATE ROBOTICS, AKA "NO BACKBONE, NO PROBLEM" - 29 JULY 2021. We then further collaborated on a review paper that summarised the key themes of the workshop.
Collaborator Contribution I collaborated with Dr Szczecinski to propose and run a workshop at the 10th Anniversary Living Machines Conference. INVERTEBRATE ROBOTICS, AKA "NO BACKBONE, NO PROBLEM" - 29 JULY 2021. We then further collaborated on a review paper that summarised the key themes of the workshop.
Impact The workshop that we ran was a great success at the conference which was held during COVID lockdown. The summary outcomes were presented in a review paper published in Bioinspiration & Biomimetics in 2023.
Start Year 2021
 
Description Collaboration with Macquarie University 
Organisation Macquarie University
Country Australia 
Sector Academic/University 
PI Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and Postdocs when they are allowed. We are helping them with modelling and computational analysis of data.
Collaborator Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and Postdocs when they are allowed. They are providing us with results from insect experiments which will inform our robotic models
Impact the collaboration is multi-disciplinary: we do robotics and computational modelling, they do insect neuroscience
Start Year 2019
 
Description Karin Nordstrom 
Organisation Flinders University
Country Australia 
Sector Academic/University 
PI Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and Postdocs when they are allowed. We are helping them with modelling and computational analysis of data.
Collaborator Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and Postdocs when they are allowed. They are providing us with results from insect experiments which will inform our robotic models
Impact the collaboration is multi-disciplinary: we do robotics and computational modelling, they do insect neuroscience
Start Year 2019
 
Description MM and AP new collaboration on novel bee tracking methods with Dr Mike Smith (University of Sheffield, UK) 
Organisation University of Sheffield
Department Department of Computer Science
Country United Kingdom 
Sector Academic/University 
PI Contribution Aiding design of new tracking methods, experimental design, establishing animal tracking network, co-supervision of PhDs
Collaborator Contribution Aiding design of new tracking methods, experimental design, establishing animal tracking network, co-supervision of PhDs
Impact TBA
Start Year 2023
 
Title CATER: combined animal tracking and environment reconstruction 
Description Method to extract animal positions from video data and to embed those tracks in a reconstructed background, allowing high spatiotemporal resolution analysis of animal behaviour in the wild and with reference to the habitat.
Type Of Technology New/Improved Technique/Technology 
Year Produced 2024 
Open Source License? Yes  
Impact Insights raised from the analysis are linked to the methods paper. Multiple lab groups are now using the tool. New studies have been inspired by these outcomes.
URL https://www.science.org/doi/10.1126/sciadv.adg2094
 
Title Code related to paper - EchoVPR: Echo State Networks for Visual Place Recognition 
Description Code to train and apply Echo-State-Networks to the problem of visual place recognition. 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2022 
Open Source License? Yes  
Impact Acceptance to leading robotics conference ICRA 2022, and publication in leading robotics journal Robotics and Automation Letters. 
URL https://anilozdemir.github.io/EchoVPR/
 
Title CompoundRay: An open-source tool for high-speed and high-fidelity rendering of compound eye 
Description CompoundRay is a new open-source renderer that accurately renders the visual perspective of insect eyes at over 5,000 frames per second in a 3D-mapped natural environment. It supports ommatidial arrangements at arbitrary positions with per-ommatidial heterogeneity.
Type Of Technology New/Improved Technique/Technology 
Year Produced 2021 
Impact The basis for investigation of the effect of insect eye shape on visual homing tasks - Blayze Millward. A new collaboration with researchers at Flinders University (Dr Karin Nordstrom). A new collaboration with researchers at DeepMind (Dr Chrisantha Fernando).
URL https://www.biorxiv.org/content/10.1101/2021.09.20.461066v1
 
Description Andy Philippides was interviewed by BBC South East about a 'robotic' grape harvester
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact TV interview with local media to discuss a 'robotic' grape harvester
Year(s) Of Engagement Activity 2020
 
Description Article in The Times following AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The Times published an article on the project, 'Bees help drones to find their bearings', on 17/02/2020, following on from us exhibiting at AAAS 2020 with UKRI on 15/02/2020.
Year(s) Of Engagement Activity 2020
URL https://www.thetimes.co.uk/article/bees-help-drones-to-find-their-bearings-jfnfgs8x2
 
Description Article published by The Telegraph following on from AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The Telegraph published an article on the project, 'Bees are being mapped to help develop driverless cars and drones by scientists glueing tiny antennas to their heads', on 17/02/2020, following on from us exhibiting at AAAS 2020 with UKRI on 15/02/2020.
Year(s) Of Engagement Activity 2020
URL https://www.telegraph.co.uk/news/2020/02/17/bees-mapped-help-develop-driverless-cars-drones-scientis...
 
Description Catalan government AI funding report 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Policymakers/politicians
Results and Impact I contributed to a report on AI funding for digital innovation hubs presented to the Barcelona Chamber of Commerce and the Catalan Regional Government.
Year(s) Of Engagement Activity 2021
 
Description Exhibit at AAAS 2020 International Reception hosted by UKRI 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact The project was invited by UKRI to showcase at their VIP International Reception, hosted as part of AAAS 2020 in Seattle, February 2020. James Marshall, Alex Cope, Jamie Knight and Joe Woodgate exhibited the project, providing an overview of the research and technology emerging from the project, and demonstrations of drone and ground-based robotics. This resulted in significant international media coverage of the project.
Year(s) Of Engagement Activity 2020
URL https://www.ukri.org/aaas/
 
Description Financial Times article following AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The Financial Times published an article on the project, 'Scientists look to bees to develop drone technology', on 17/02/2020, following on from us exhibiting at AAAS 2020 with UKRI on 15/02/2020.
Year(s) Of Engagement Activity 2020
URL https://www.ft.com/content/bf3c83fe-5081-11ea-8841-482eed0038b1
 
Description Interview with Sky News following AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact James Marshall was interviewed about the project by Sky News on 18/02/2020, following on the project exhibiting with UKRI at AAAS 2020, 15/02/2020.
Year(s) Of Engagement Activity 2020
 
Description Invited speaker at IROS2023 workshop: Closing the Loop on Localization: What Are We Localizing For, and How Does That Shape Everything We Should Do 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Workshop at a top robotics conference (audience 100+) that included academic and industrial researchers, robotics companies, funders etc. Significant interest in the alternative brain-based approach to autonomy that was proposed, and follow-up commercial opportunities were initiated.
Year(s) Of Engagement Activity 2023
URL https://oravus.github.io/vpr-workshop/
 
Description MM gave keynote talk at UKRAS 2024 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented the natural intelligence approach as an alternative to AI to 300+ robotics researchers, industry, funders, policymakers etc. at the IEEE UKRAS conference 2024.
Year(s) Of Engagement Activity 2024
URL https://www.sheffield.ac.uk/sheffieldrobotics/7th-ieee-uk-ireland-ras-conference-ras-2024#:~:text=Ta...
 
Description MM gave talk at European Robotics Forum - Bioinspired Robotics Topic Group 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Gave a presentation about the natural intelligence approach at the European Robotics Forum topic group session on bioinspired robotics.
Year(s) Of Engagement Activity 2023
URL https://erf2023.sdu.dk/
 
Description MM organised 1st workshop on small animal tracking. AP and JK gave talks 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Workshop bringing together researchers, students, research software engineers and businesses interested in small animal tracking from across the EU. Multiple follow-on funding and outreach activities were discussed and are to follow.
Year(s) Of Engagement Activity 2024
 
Description NATO Autonomy Workshop 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Policymakers/politicians
Results and Impact Four members of the project team (James Marshall, Alex Cope, Joe Woodgate & Jamie Knight) attended as invited speakers and panellists at a workshop hosted by NATO.
Year(s) Of Engagement Activity 2020
 
Description New Scientist Comment by Prof. James Marshall 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact Comment piece in the 'Life' section of the publication by Prof. James Marshall.
Year(s) Of Engagement Activity 2021
URL https://www.newscientist.com/article/mg24933220-100-insect-brains-will-teach-us-how-to-make-truly-in...
 
Description Organised workshop at Living Machines Conference: INVERTEBRATE ROBOTICS, AKA "NO BACKBONE, NO PROBLEM" 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Workshop on INVERTEBRATE ROBOTICS, AKA "NO BACKBONE, NO PROBLEM" as part of the 10th anniversary Living Machines conference series. Aimed mainly at other academics, it inspired a lively discussion and a follow-on review paper.
Year(s) Of Engagement Activity 2021
URL https://livingmachinesconference.eu/2021/conference/invertebrate-robotics/
 
Description Organiser and Chair of INVITED SYMPOSIUM - NEW TOOLS TO STUDY BEHAVIOUR IN THE FIELD: INSIGHTS FROM INSECT NAVIGATION at ICN 2022 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Proposed, organised and chaired the workshop on NEW TOOLS TO STUDY BEHAVIOUR IN THE FIELD: INSIGHTS FROM INSECT NAVIGATION at the International Congress of Neuroethology, 2022. Very well attended, with lively debate, and discussions afterwards on how biologists and engineers could collaborate more on future research projects. ECRs were also selected to speak, giving them career opportunities.
Year(s) Of Engagement Activity 2022
URL https://www.neuroethology.org/Meetings
 
Description PIP talk on AI 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Public/other audiences
Results and Impact I gave a public engagement talk on the impact of AI to a (mainly local) group.
Year(s) Of Engagement Activity 2021
 
Description RSS2023 Workshop on Rapid and Robust Robotic Active Learning (R3AL): AP/DS organisers, MM invited speaker 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact International researchers attended workshop on rapid learning in robots, with panel discussion and extended debate.
Year(s) Of Engagement Activity 2023
URL https://r3al.sdu.dk/category/uncategorized/
 
Description Speaker at 60 years of Sussex Research Partnership Conference 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other audiences
Results and Impact I was selected to present the research of Brains on Board and ActiveAI as part of the 60 Years of Sussex Research Partnership Conference, highlighting academic partnerships.
Year(s) Of Engagement Activity 2022
URL https://www.sussex.ac.uk/about/60-years-of-sussex/news-and-events?id=57433
 
Description UKRI-BBSRC Expert Working Group on the Use of Models in research 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Policymakers/politicians
Results and Impact I was a member of the UKRI-BBSRC Expert Working Group on the Use of Models in Research. We debated the subject, which was made into a report on which we provided feedback.
Year(s) Of Engagement Activity 2021
 
Description Virtual Insect Navigation Workshop Aug 4th-6th 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presentation at the Virtual Insect Navigation Workshop Aug 4th-6th
Year(s) Of Engagement Activity 2020
 
Description Work featured on YouTube on SciShow
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact My bumblebee research was featured in an episode of SciShow on YouTube.
Year(s) Of Engagement Activity 2021
URL https://www.youtube.com/watch?v=qqIPe3Ya8y0