ActiveAI - active learning and selective attention for robust, transparent and efficient AI

Lead Research Organisation: University of Sussex
Department Name: Sch of Engineering and Informatics

Abstract

We will bring together world leaders in insect biology and neuroscience with world leaders in biorobotic modelling and computational neuroscience to create a partnership that will be transformative in understanding active learning and selective attention in insects, and in robots and autonomous systems in artificial intelligence (AI). By considering how brains, behaviours and the environment interact during natural animal behaviour, we will develop new algorithms and methods for rapid, robust and efficient learning for autonomous robotics and AI in dynamic real-world applications.

Recent advances in AI, and notably in deep learning, have proven incredibly successful in creating solutions to specific complex problems (e.g. beating the best human players at Go, and driving cars through cities). But as we learn more about these approaches, their limitations are becoming more apparent. For instance, deep learning solutions typically need a great deal of computing power, extremely long training times and very large amounts of labelled training data, which are simply not available for many tasks. While they are very good at solving specific tasks, they can be quite poor (and unpredictably so) at transferring this knowledge to other, closely related tasks. Finally, scientists and engineers are struggling to understand what their deep learning systems have learned and how well they have learned it.

These limitations are particularly apparent when contrasted to the naturally evolved intelligence of insects. Insects certainly cannot play Go or drive cars, but they are incredibly good at doing what they have evolved to do. For instance, unlike any current AI system, ants learn how to forage effectively with limited computing power provided by their tiny brains and minimal exploration of their world. We argue this difference comes about because natural intelligence is a property of closed loop brain-body-environment interactions. Evolved innate behaviours in concert with specialised sensors and neural circuits extract and encode task-relevant information with maximal efficiency, aided by mechanisms of selective attention that focus learning on task-relevant features. This focus on behaving embodied agents is under-represented in present AI technology but offers solutions to the issues raised above, which can be realised by pursuing research in AI in its original definition: a description and emulation of biological learning and intelligence that both replicates animals' capabilities and sheds light on the biological basis of intelligence.

This endeavour entails studying the workings of the brain in behaving animals: it is crucial to know how neural activity interacts with, and is shaped by, environment, body and behaviour, and how this interplay involves selective attention. These experiments are now possible thanks to recent advances in neural recordings of flies and hoverflies, which can identify neural markers of selective attention, combined with virtual reality experiments for ants; techniques pioneered by the Australian team. Together with verification of emerging hypotheses in large-scale neural models running on board robotic platforms in the real world, an approach pioneered by the UK team, this project represents a unique and timely opportunity to transform our understanding of learning in animals and, through this, learning in robots and AI systems.

We will create an interdisciplinary collaborative research environment with a "virtuous cycle" of experiments, analysis, and computational and robotic modelling. New findings feed forward and back around this virtuous cycle, each discipline informing the others to yield a functional understanding of how active learning and selective attention enable small-brained insects to learn a complex world. Through this understanding, we will develop ActiveAI algorithms which learn rapidly, are efficient in both learning and final network configuration, and are robust to real-world conditions.

Planned Impact

We will combine expertise in insect neuroscience with biomimetic robotic control to gain a functional understanding of how active learning and selective attention underpin rapid and efficient visual learning. Through this, we will develop ActiveAI algorithms, reinforcement learning methods and artificial neural network (ANN) architectures for robotics and AI that learn rapidly, are computationally efficient, work with limited training data and are robust to novel and changing scenarios.

Industrial impact
Our novel sensing, learning and processing algorithms offer impact in robotics and autonomous systems (RAS) and in AI generally. AI problems are currently resolved by increasing computational and training resources; we take a fundamentally different approach, using insects as inspiration for efficient algorithms. Here we will develop smart movement patterns which combine with attentional mechanisms to aid the identification and extraction of information, reducing computational and training loads. We foresee two immediate problem domains in RAS: those where learning speed is highly constrained (e.g. disaster recovery robots, exploration, agri-robotics); and those where computational load and energy usage are limited (e.g. UAVs, agritech, space robotics). Longer term, we foresee applications in general AI, where a new class of highly efficient and thus scalable ANNs is required to realise grand challenges such as General Intelligence.

We will ensure tight coupling to industrial needs through the established industrial members of the Brains on Board advisory board (Dyson, Parrot, NVIDIA and Google DeepMind), collaborators for robotic applications (Harper Adams University, GMV and RAL Space) and Sheffield Robotics contacts (e.g. Amazon, iniVation, Machines With Vision), as well as by leveraging new opportunities through both UK and Australian universities' commercialisation operations (Macquarie University's Office of Commercialisation and Innovation Hub; Sussex's Research Quality and Impact team and the Sussex Innovation Centre; the Sheffield Engineering Hub and Sheffield's Partnerships and Knowledge Exchange team). Where possible, we will seek to commercialise knowledge through IP licensing and university-supported spin-outs. We already have experience of doing so, in particular optic flow commercialisation through ApisBrain (Marshall) and sensor commercialisation through Skyline Sensors (Mangan).

Impact on the team
PDRAs will receive cross-disciplinary training from the UK team in GPU computing, neural simulations, biorobotics and bio-inspired machine learning - very active and rapidly expanding areas and sought-after skills in AI and robotics - as well as training from the Australian team in cutting-edge neuroscientific methods (electrophysiology and pharmacology combined with virtual reality-enabled behavioural experiments). This will prepare them for careers across academia and industry. In addition, UK co-I Mangan, as an early career researcher, will benefit from support and advice from the senior investigators (UK and Australia), supporting his development as an independent researcher.

Advocacy + general public
We firmly believe in the benefit of ethically aware technology development through responsible innovation. We have already created an ethical code of conduct for the Brains on Board project and engaged with government consultations. We will extend this work and, by promoting and adhering to this philosophy, we will have impact on policy through advocacy and on the general public through continuation of our extensive public engagement activities, e.g. regular public lectures (Cafe Scientifique, Nerd Nite, U3A, etc.), media appearances (BBC, ABC radio, BBC South East) and large outreach events (e.g. Royal Society Science Exhibition 2010, British Science Festival 2017, Brighton Science Festivals 2010-2018).

Academic Impact
We will impact AI, robotics and neuroscience (see Academic Beneficiaries).

Publications


Turner J (2022) mlGeNN: accelerating SNN inference using GPU-enabled neural networks. Neuromorphic Computing and Engineering.
Ozdemir A (2022) EchoVPR: Echo State Networks for Visual Place Recognition. IEEE Robotics and Automation Letters.
Nascimento L (2022) The G20 emission projections to 2030 improved since the Paris Agreement, but only slightly. Mitigation and Adaptation Strategies for Global Change.
Manneschi L (2020) An alternative to backpropagation through time. Nature Machine Intelligence.
Manneschi L (2023) SpaRCe: Improved Learning of Reservoir Computing Systems Through Sparse Representations. IEEE Transactions on Neural Networks and Learning Systems.
Manneschi L (2021) Exploiting Multiple Timescales in Hierarchical Echo State Networks. Frontiers in Applied Mathematics and Statistics.
MaBouDi H (2020) Honeybees solve a multi-comparison ranking task by probability matching. Proceedings of the Royal Society B: Biological Sciences.
MaBouDi H (2021) Non-numerical strategies used by bees to solve numerical cognition tasks. Proceedings of the Royal Society B: Biological Sciences.
Günthermann L (2021) Activity and Behavior Computing.
Garvie M (2020) Evolved Transistor Array Robot Controllers. Evolutionary Computation.
Buehlmann C (2020) Multimodal interactions in insect navigation. Animal Cognition.
Balfour NJ (2022) DoPI: The Database of Pollinator Interactions. Ecology.

 
Description Emergent embodied cognition in shallow, biological and artificial, neural networks
Amount £200,036 (GBP)
Funding ID BB/X01343X/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 02/2023 
End 08/2024
 
Description Leverhulme Doctoral Scholarships in "be.AI - biomimetic embodied Artificial Intelligence"
Amount £1,350,000 (GBP)
Funding ID DS-2020-065 
Organisation The Leverhulme Trust 
Sector Charity/Non Profit
Country United Kingdom
Start 09/2021 
End 08/2027
 
Title New tool to rapidly and accurately reconstruct compound vision systems 
Description A new tool to rapidly and accurately reconstruct compound vision systems. The tool uses modern ray-tracing graphics technology to achieve entirely new levels of accuracy, and is open-sourced via GitHub.
Type Of Material Technology assay or reagent 
Year Produced 2022 
Provided To Others? Yes  
Impact tba 
URL https://github.com/BrainsOnBoard/compound-ray
 
Title Data for paper: Wood ants learn the magnetic direction of a route but express uncertainty because of competing directional cues 
Description Data for the paper published in the Journal of Experimental Biology, July 2022. The data for each ant in the experiments described in all but Figure 7 is held in MATLAB files named as follows: AntU_LN22WESTtest_1522_31072019_Published.mat. The data for the experiments with two triangles is held in the zip file TrianglesData.zip, which contains individual files in the same format as above. There is a detailed description of the variables in the file ants_magnets_philippides_dataset_description.pdf. Abstract: Wood ants were trained indoors to follow a magnetically specified route that went from the centre of an arena to a drop of sucrose at the edge. The arena, placed in a white cylinder, was in the centre of a 3D coil system generating an inclined Earth-strength magnetic field in any horizontal direction. The specified direction was rotated between each trial. The ants' knowledge of the route was tested in trials without food. Tests given early in the day, before any training, show that ants remember the magnetic route direction overnight. During the first 2 seconds of a test, ants mostly faced in the specified direction, but thereafter were often misdirected, with a tendency to face briefly in the opposite direction. Uncertainty about the correct path to take may stem in part from competing directional cues linked to the room. In addition to facing along the route, there is evidence that ants develop magnetically directed home and food vectors dependent upon path integration. A second experiment asked whether ants can use magnetic information contextually. In contrast to honeybees given a similar task, ants failed this test. Overall, we conclude that magnetic directional cues can be sufficient for route learning.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://sussex.figshare.com/articles/dataset/Data_for_paper_Wood_ants_learn_the_magnetic_direction_o...
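For reusers of this dataset, a minimal Python sketch (assuming SciPy is installed) for inspecting one of the per-ant MATLAB files; the file name is taken from the description above, and the variables it contains are documented in the accompanying PDF, so the snippet simply lists them:

    # Minimal sketch for inspecting one per-ant MATLAB file from this dataset.
    from scipy.io import loadmat

    data = loadmat("AntU_LN22WESTtest_1522_31072019_Published.mat")

    # Keys beginning with '__' are MATLAB file metadata, not experimental
    # variables; see ants_magnets_philippides_dataset_description.pdf.
    for name, value in data.items():
        if not name.startswith("__"):
            print(name, getattr(value, "shape", type(value)))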
 
Title Dataset for paper "mlGeNN: Accelerating SNN inference using GPU-Enabled Neural Networks" 
Description Dataset for the paper accepted in IOP Neuromorphic Computing and Engineering, March 2022. The dataset contains trained weights from TensorFlow 2.4.0 for the following models: vgg16_imagenet_tf_weights.h5 (VGG-16 trained on the ImageNet ILSVRC dataset); vgg16_tf_weights.h5 (VGG-16 trained on the CIFAR-10 dataset); resnet20_cifar10_tf_weights.h5 (ResNet-20 trained on the CIFAR-10 dataset); resnet34_imagenet_tf_weights.h5 (ResNet-34 trained on ImageNet ILSVRC). Abstract: "In this paper we present mlGeNN - a Python library for the conversion of artificial neural networks (ANNs) specified in Keras to spiking neural networks (SNNs). SNNs are simulated using GeNN with extensions to efficiently support convolutional connectivity and batching. We evaluate converted SNNs on CIFAR-10 and ImageNet classification tasks and compare the performance to both the original ANNs and other SNN simulators. We find that performing inference using a VGG-16 model, trained on the CIFAR-10 dataset, is 2.5x faster than BindsNet and, when using a ResNet-20 model trained on CIFAR-10 with FewSpike ANN to SNN conversion, mlGeNN is only a little over 2x slower than TensorFlow." Funding: Brains on Board grant number EP/P006094/1; ActiveAI grant number EP/S030964/1; Unlocking spiking neural networks for machine learning research grant number EP/V052241/1; European Union's Horizon 2020 research and innovation programme under Grant Agreement 945539.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://sussex.figshare.com/articles/dataset/Dataset_for_paper_mlGeNN_Accelerating_SNN_inference_usi...
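For reusers of the weight files: loading them with Keras's load_weights requires first rebuilding the matching model architecture, but the HDF5 files can also be browsed directly. A minimal sketch, assuming h5py is installed (file name from the list above):

    # Minimal sketch for browsing a released .h5 weight file without
    # rebuilding the matching Keras model.
    import h5py

    def show(name, obj):
        # Print the path and shape of every stored weight tensor.
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape)

    with h5py.File("vgg16_tf_weights.h5", "r") as f:
        f.visititems(show)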
 
Description Bruno van Swinderen 
Organisation University of Queensland
Country Australia 
Sector Academic/University 
PI Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and postdocs when travel is permitted. We are helping them with modelling and computational analysis of data.
Collaborator Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and postdocs when travel is permitted. They are providing us with results from insect experiments, which will inform our robotic models.
Impact The collaboration is multi-disciplinary: we do robotics and computational modelling; they do insect neuroscience.
Start Year 2019
 
Description Co-tutelle PhD studentship with Dr Ajay Narendra, Macquarie University, Australia 
Organisation Macquarie University
Department Department of Biological Sciences
Country Australia 
Sector Academic/University 
PI Contribution Linked to the ActiveAI programme, we combined funds from the University of Sheffield and Macquarie University to fund a co-tutelle PhD, set to start in autumn 2023.
Collaborator Contribution Co-funded PhD.
Impact None yet.
Start Year 2023
 
Description Collaboration project with Dr Cwyn Solvi (Macquarie University) to investigate the impact of brain size on learning speed in temporal learning tasks. 
Organisation Macquarie University
Country Australia 
Sector Academic/University 
PI Contribution AO, EV, MM and LM developed computational models that supported data captured by the collaborators from bumblebees and hummingbirds. All contributed to a paper submitted to Science in 2021.
Collaborator Contribution Collaborators provided behavioural data related to learning speeds in two bee species and in hummingbirds, and co-wrote the collaborative paper.
Impact Redrafting the paper following feedback from Science. Multidisciplinary research, with Sheffield providing the machine learning and computational neuroscience expertise and our Australian partners the behavioural and neuroscience expertise.
Start Year 2021
 
Description Collaboration with Macquarie University 
Organisation Macquarie University
Country Australia 
Sector Academic/University 
PI Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and postdocs when travel is permitted. We are helping them with modelling and computational analysis of data.
Collaborator Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and postdocs when travel is permitted. They are providing us with results from insect experiments, which will inform our robotic models.
Impact The collaboration is multi-disciplinary: we do robotics and computational modelling; they do insect neuroscience.
Start Year 2019
 
Description Karin Nordstrom 
Organisation Flinders University
Country Australia 
Sector Academic/University 
PI Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and postdocs when travel is permitted. We are helping them with modelling and computational analysis of data.
Collaborator Contribution They are project partners on the ActiveAI grant. We will have reciprocal visits of PIs and postdocs when travel is permitted. They are providing us with results from insect experiments, which will inform our robotic models.
Impact The collaboration is multi-disciplinary: we do robotics and computational modelling; they do insect neuroscience.
Start Year 2019
 
Title Code related to paper - EchoVPR: Echo State Networks for Visual Place Recognition 
Description Code to train and apply echo state networks (ESNs) to the problem of visual place recognition. 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2022 
Open Source License? Yes  
Impact Acceptance to leading robotics conference ICRA 2022, and publication in leading robotics journal Robotics and Automation Letters. 
URL https://anilozdemir.github.io/EchoVPR/
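To illustrate the technique the code applies (a sketch of a generic leaky echo state network in NumPy, not the EchoVPR implementation; the sizes and constants are placeholder assumptions):

    # Minimal leaky echo state network (ESN) sketch; n_in, n_res, leak and
    # rho are illustrative placeholder values, not EchoVPR's settings.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_res, leak, rho = 64, 500, 0.3, 0.9

    W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))      # fixed input weights
    W = rng.normal(0.0, 1.0, (n_res, n_res))          # fixed recurrent weights
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius

    def run_reservoir(inputs):
        # Drive the reservoir with a sequence of input vectors (e.g. image
        # features) and collect its states; only a linear readout trained
        # on these states (e.g. by ridge regression) is ever learned.
        x = np.zeros(n_res)
        states = []
        for u in inputs:
            x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
            states.append(x.copy())
        return np.array(states)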
 
Title CompoundRay: An open-source tool for high-speed and high-fidelity rendering of compound eyes 
Description CompoundRay is a new open-source renderer that accurately renders the visual perspective of insect eyes at over 5,000 frames per second in a 3D mapped natural environment. It supports ommatidial arrangements at arbitrary positions with per-ommatidial heterogeneity. 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2021 
Impact The basis for investigating the effect of insect eye shape on visual homing tasks (Blayze Millward); a new collaboration with researchers at Flinders University (Dr Karin Nordstrom); a new collaboration with researchers at DeepMind (Dr Chrisantha Fernando). 
URL https://www.biorxiv.org/content/10.1101/2021.09.20.461066v1
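As a conceptual sketch of what compound-eye rendering involves (illustrative only, not CompoundRay's API): rays are generated per ommatidium, each from its own position and viewing direction, rather than through a single pinhole camera:

    # Illustrative per-ommatidium ray generation; NOT CompoundRay's API.
    # Because every ommatidium carries its own position and direction,
    # heterogeneous eye layouts are supported naturally.
    import numpy as np

    def ommatidial_rays(positions, directions, spread, samples, rng):
        # Jitter `samples` ray directions within each ommatidium's
        # acceptance cone; `spread` is a rough angular std dev in radians.
        rays = []
        for p, d in zip(positions, directions):
            for _ in range(samples):
                v = d + rng.normal(scale=spread, size=3)
                rays.append((p, v / np.linalg.norm(v)))
        return rays

Each ray would then be traced into the 3D scene (CompoundRay does this on the GPU with hardware ray tracing) and the per-ommatidium samples averaged to give that ommatidium's response.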
 
Description Andy Philippides was interviewed by BBC South East about a 'robotic' grape harvester 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact TV interview with local media to discuss a 'robotic' grape harvester
Year(s) Of Engagement Activity 2020
 
Description Article in The Times following AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The Times published an article on the project, 'Bees help drones to find their bearings', on 17/02/2020, following on from us exhibiting with UKRI at AAAS 2020 on 15/02/2020.
Year(s) Of Engagement Activity 2020
URL https://www.thetimes.co.uk/article/bees-help-drones-to-find-their-bearings-jfnfgs8x2
 
Description Article published by The Telegraph following on from AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The Telegraph published an article on the project, 'Bees are being mapped to help develop driverless cars and drones by scientists glueing tiny antennas to their heads', on 17/02/2020, following on from us exhibiting with UKRI at AAAS 2020 on 15/02/2020.
Year(s) Of Engagement Activity 2020
URL https://www.telegraph.co.uk/news/2020/02/17/bees-mapped-help-develop-driverless-cars-drones-scientis...
 
Description Catalan government AI funding report 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Policymakers/politicians
Results and Impact I contributed to a report on AI funding for digital innovation hubs presented to the Barcelona Chamber of commerce and the Catalan Regional Government
Year(s) Of Engagement Activity 2021
 
Description Exhibit at AAAS 2020 International Reception hosted by UKRI 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact The project was invited by UKRI to showcase at their VIP International Reception, hosted as part of AAAS 2020 in Seattle, February 2020. James Marshall, Alex Cope, Jamie Knight and Joe Woodgate exhibited the project, providing an overview of the research and technology emerging from the project, and demonstrations of drone and ground-based robotics. This resulted in significant international media coverage of the project.
Year(s) Of Engagement Activity 2020
URL https://www.ukri.org/aaas/
 
Description Financial Times article following AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The Financial Times published an article on the project, 'Scientists look to bees to develop drone technology', on 17/02/2020, following on from us exhibiting with UKRI at AAAS 2020 on 15/02/2020.
Year(s) Of Engagement Activity 2020
URL https://www.ft.com/content/bf3c83fe-5081-11ea-8841-482eed0038b1
 
Description Interview with Sky News following AAAS 2020 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact James Marshall was interviewed about the project by Sky News on 18/02/2020, following the project exhibiting with UKRI at AAAS 2020 on 15/02/2020.
Year(s) Of Engagement Activity 2020
 
Description NATO Autonomy Workshop 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Policymakers/politicians
Results and Impact Four members of the project team (James Marshall, Alex Cope, Joe Woodgate and Jamie Knight) attended as invited speakers and panellists at a workshop organised by NATO.
Year(s) Of Engagement Activity 2020
 
Description New Scientist Comment by Prof. James Marshall 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact Comment piece in the 'Life' section of the publication by Prof. James Marshall.
Year(s) Of Engagement Activity 2021
URL https://www.newscientist.com/article/mg24933220-100-insect-brains-will-teach-us-how-to-make-truly-in...
 
Description PIP talk on AI 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Public/other audiences
Results and Impact I gave a public engagement talk on the impact of AI to a (mainly local) group.
Year(s) Of Engagement Activity 2021
 
Description Speaker at 60 years of Sussex Research Partnership Conference 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other audiences
Results and Impact I was selected to present the research of Brains on Board and ActiveAI as part of the 60 Years of Sussex Research Partnership Conference, highlighting academic partnerships.
Year(s) Of Engagement Activity 2022
URL https://www.sussex.ac.uk/about/60-years-of-sussex/news-and-events?id=57433
 
Description UKRI-BBSRC Expert Working Group on the Use of Models in research 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Policymakers/politicians
Results and Impact I was a member of the UKRI-BBSRC Expert Working Group on the Use of Models in Research. We debated the subject, which was developed into a report on which we provided feedback.
Year(s) Of Engagement Activity 2021
 
Description Virtual Insect Navigation Workshop Aug 4th-6th 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presentation at the Virtual Insect Navigation Workshop, 4th-6th August 2020.
Year(s) Of Engagement Activity 2020
 
Description Work featured on SciShow on YouTube 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact My bumblebee research was featured in an episode of SciShow on YouTube.
Year(s) Of Engagement Activity 2021
URL https://www.youtube.com/watch?v=qqIPe3Ya8y0