Fly-by-feel: the neural representation of aeroelasticity

Lead Research Organisation: Imperial College London
Department Name: Bioengineering

Abstract

Mechanosensation is the fastest sensory modality in animals, and insects exploit this fully in flight control. In particular, insect wings are adorned with hundreds of mechanosensors distributed in a well-organized manner. These sensors perform multiple tasks, but many are used to encode the unsteady aerodynamics of flight. Several lines of evidence suggest that wing sensors provide essential sensory feedback to modulate the motor pattern, gate other sensory signals, implement aerial reflexes and possibly contribute to motor learning. In contrast, human approaches to evaluating unsteady, or time-dependent, aerodynamic phenomena often require heavy instrumentation and expensive computational power to simulate or measure the fluid flows. How do insects, with acutely limited computational resources, extract relevant information using smart sensor distribution, sensor tuning and signal encoding? What is the neural representation of the sensation of fluid? These key questions have implications for our understanding of mechanosensation in general, and the results will be directly applicable to aviation and adaptive motion control for modern robotics.

The principal goal of this proposed research is to unite cutting-edge neurophysiology with state-of-the-art computational fluid and structural dynamics to reveal how fluid sensing is realized in insect wings. We will first catalog the distribution and types of wing sensors in selected species of dragonflies, hawkmoths, and locusts. Second, we will characterize the output signals of representative sensors, individually and as populations. With a recently developed ultra-light neural recording device, we will monitor selected wing sensors while the insects are in flight. Simultaneously, we will digitize the detailed wing deformation in order to reconstruct the mechanical stimuli that the wing sensors register and the aerodynamic features associated with that deformation. Finally, we will combine all of these data to link aerodynamics to wing deformation and, in turn, to mechanosensory signals.

Both dragonflies and hawkmoths are excellent fliers that rely on flight to intercept flying prey or to feed from flowers. They are ideal systems for studying the neural encoding of fluid sensing because they have exposed sensors, relatively simple nervous systems, and sufficient payload capacity to carry wireless neural devices while performing flight maneuvers. Our work will provide a fundamental understanding of how fluid sensing is achieved with only a few sensors, and insights into how complex mechanics can be represented in a nervous system. Sensor robustness, cost, and the associated computational power requirements are major concerns for modern aviation. The next generation of manned and unmanned air vehicles will require fundamentally redesigned control architectures based on biological designs. Our findings will illuminate the evolution of fluid sensing in animals and also inform the design of future aircraft.

The use of mechanosensors in locomotor control is fundamental to most animals at some stage in their lives. This work will therefore answer fundamental questions of locomotion relevant to all animal taxa. Ultimately, this project will change the way biologists understand how neural encoding links sensory cues from self-motion and the environment to behavioral commands.

Technical Summary

Mechanosensation is ubiquitous and central to adaptive motion control. However, the sensory input signals elicited by complex forces (e.g. frictional, acoustic, and fluid forces) remain poorly understood. During flight, insects are masters of monitoring the instantaneous loads on their wings, exemplifying the encoding of dynamic complexity in a tractable system. In this work, we will uncover how insects extract relevant aerodynamic features via an intricately organized suite of mechanosensors on the wings.

Our work is made possible by three recent technical breakthroughs. First, by miniaturizing neural amplifiers and developing neural implants for large insects, we are now able to record from sensory neurons in free-flying animals. This allows us to eavesdrop on the mechanosensory signals from the wings during flight. Second, we have developed a high-precision motion-capture protocol that digitizes flying insect kinematics in real time, allowing us to trigger high-speed cameras that capture wing motion and deformation. Third, coupled computational fluid dynamics and structural dynamics models enable us to reconstruct and predict the aerodynamic and mechanical stimuli that the wing sensors register (via fluid-solid coupling). Combining these three technologies, we can reveal the neural representation of aeroelasticity for the first time.
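As an illustration of the third breakthrough, the sketch below outlines a partitioned (loosely coupled) fluid-structure iteration of the kind used to estimate the aerodynamic load and the resulting wing deformation at a given instant. The solver functions, relaxation factor and tolerance are hypothetical placeholders, not the project's actual codes.

```python
# Illustrative sketch only: a partitioned fluid-structure coupling loop.
# Both "solvers" are toy placeholders standing in for full CFD and finite
# element codes; the structure of the iteration is the point.
import numpy as np

def fluid_solver(deformation):
    """Placeholder CFD step: wing shape -> aerodynamic load distribution."""
    return 1.0 - 0.5 * deformation

def structural_solver(load, stiffness=10.0):
    """Placeholder structural step: load -> elastic wing deformation."""
    return load / stiffness

deformation = np.zeros(100)            # initial wing shape (undeformed)
for iteration in range(100):           # fixed-point iteration within one time step
    load = fluid_solver(deformation)              # fluid solve on current shape
    new_deformation = structural_solver(load)     # structural solve under that load
    if np.max(np.abs(new_deformation - deformation)) < 1e-9:
        break
    # Under-relaxation stabilizes the two-way (fluid-solid) coupling.
    deformation = 0.7 * deformation + 0.3 * new_deformation

print(f"coupling converged after {iteration + 1} iterations")
# The converged load and deformation fields give the local strain and pressure
# that each wing mechanosensor would be estimated to experience.
```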

Recording neural activity in freely moving animals offers direct answers to how neural networks process sensory information to generate actions. Neural telemetry systems enable this approach, yet recording from a moving subject and reconstructing the sensory stimuli during free behavior remain formidable tasks. In this work, we will demonstrate how to record wirelessly from a fast-moving appendage on a moving subject and how to reconstruct complex sensory stimuli with computational models. This breaks new ground and will substantially advance the field of wireless neurophysiology.

Planned Impact

Fluid sensing is the least understood of all mechanosensory modalities, yet nature achieves it with robustness and finesse. Engineered systems from wind turbines to cardiovascular implants are designed around the pre-calculated (or pre-calibrated) steady-state conditions the system typically experiences, but there is no easy way to extrapolate to unsteady fluid states while maintaining effective, optimal function. Even with modern computational tools, simulating and understanding complex fluid dynamics takes considerable time and effort. This project aims to understand how highly dynamic flying insects sense and encode aeroelasticity in flight. The results will reveal how a relatively simple nervous system extracts relevant aerodynamic features through a small yet strategically patterned sensor array. Our findings will have a direct impact on engineering designs and control systems for any application involving fluid manipulation.

With the development of smart cameras from the late 1980s, vision research in neuroscience and computational biology flourished, revealing how animals perceive the world. Today, sophisticated machine vision systems are driving a revolution in artificial intelligence (AI) and automated industries (e.g. automotive, food processing, farming, pharmaceutical). Mechanosensation is poised to provide comparable insights for systems neuroscience and bio-inspired engineering over the next few decades. While AI robotic systems can provide timely and relevant information, they are limited in physical performance by the challenge of adaptive motion control; optical systems are relatively slow and computationally costly. To advance beyond this barrier, AI systems must be able not only to see, but also to feel. With the rapid emergence of civilian drones, self-driving cars and medical robots, a sense of touch may be what makes these systems safe and reliable. In 2016 the Ministry of Defence announced £800m for the Defence Innovation Initiative, including a focus on bio-inspired innovation to tackle current security challenges. The expected outputs from this research project are crucial to delivering impact in this sector.

Our outputs will have immediate impact in the field of aerial robotics. Our findings will promote the development of bio-inspired sensing mechanisms on the wings, body, or rotors of next-generation aerial systems, where aeroelastic information will become the principal input for flight controllers. Instantaneous loading information will help aerial systems achieve greater control, execute more agile flight maneuvers, and harvest energy from the atmosphere for improved economy. Similarly, incorporating aeroelastic sensory feedback will allow manned aircraft to respond more rapidly to turbulence or the onset of stall. Beyond aviation, effective fluid sensing can improve the design and control of wind turbines, hydroelectric turbines, and even combustion engines. In biomedical applications, cardiovascular implants and endoscopy would benefit from enhanced state information. Our work will show how unsteady aeroelastic information can be captured in real time, a new finding in biological mechanosensing, and will underpin a new domain for technological innovation. Large-scale stakeholder sectors include (but are not limited to) medical instrumentation, aerial robotics, soft robotics, and renewable energy (e.g. wind and tidal turbines), each of which contributes to human health and well-being. Success in these fields will have a positive economic impact and keep the UK at the forefront of technological innovation.

Publications

 
Description Stage 1 of this project was published in early 2022. We produced the first complete map of wing mechanosensors in any flying animal. We discovered over 700 sensory neurons in each dragonfly wing and over 300 in a damselfly. We identified eight classes of wing sensors and described their morphologies, distribution and wiring. We discussed their potential functions in relation to our high-fidelity finite element model of the dragonfly wing.

Stage 2 of the project focuses on identifying what the wing sensory system encodes. We excited dragonfly wings with various airflow stimuli and recorded the neuronal responses. Extensive analyses allowed us to identify the principal wing deformation modes that the neurons encode. We have mapped these deformation modes onto the free-flight data to identify their function in flight control. We are preparing a manuscript for submission this summer.
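For illustration, the analysis described above can be sketched as a principal component (modal) decomposition of digitized wing deformation, whose time courses are then compared against recorded spike rates. All data, array shapes and variable names below are hypothetical placeholders rather than the project's actual pipeline or results.

```python
# Illustrative sketch only: extracting principal wing deformation modes and
# checking which of them a neuron's firing rate tracks. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: per-frame out-of-plane displacements of wing markers
# (n_frames x n_markers) and simultaneously recorded spike counts per frame.
n_frames, n_markers = 2000, 60
displacements = rng.standard_normal((n_frames, n_markers))
spike_counts = rng.poisson(2.0, size=n_frames)

# Principal component analysis via SVD: each right-singular vector is a
# candidate deformation mode; projecting onto it gives the mode's time course.
centered = displacements - displacements.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
modes = Vt                      # (n_modes x n_markers) spatial mode shapes
scores = centered @ Vt.T        # (n_frames x n_modes) mode time courses
explained = S**2 / np.sum(S**2)

# Simple encoding check: correlation between each mode's time course and the
# spike rate flags which deformation modes the neurons may encode.
corr = [np.corrcoef(scores[:, k], spike_counts)[0, 1] for k in range(5)]
print("variance explained by first 5 modes:", np.round(explained[:5], 3))
print("spike-rate correlation with first 5 modes:", np.round(corr, 3))
```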

The synthesis part of the project involves incorporating wing mechanosensation into flight control. Several ideas have been developed on a few autonomous glider platforms. Tests are ongoing and the results will be published separately.
Exploitation Route The sensory maps, as well as some preliminary biomechanics models, will guide research efforts in fly-by-feel topics across animal models. Future applications include morphing-wing control and kite-power platform design.
Sectors Aerospace, Defence and Marine; Education; Energy; Environment; Transport

 
Title Systematic characterization of wing mechanosensors that monitor airflow and wing deformations 
Description Animal wings deform during flight in ways that can enhance lift, facilitate flight control, and mitigate damage. Monitoring the structural and aerodynamic state of the wing is challenging because deformations are passive, and the flow fields are unsteady; it requires distributed mechanosensors that respond to local airflow and strain on the wing. Without a complete map of the sensor arrays, it is impossible to model control strategies underpinned by them. Here, we present the first systematic characterization of mechanosensors on the dragonfly's wings: morphology, distribution, and wiring. By combining a cross-species survey of sensor distribution with quantitative neuroanatomy and a high-fidelity finite element analysis, we show that the mechanosensors are well-placed to perceive features of the wing dynamics relevant to flight. This work describes the wing sensory apparatus in its entirety and advances our understanding of the sensorimotor loop that facilitates exquisite flight control in animals with highly deformable wings. 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL http://datadryad.org/stash/dataset/doi:10.5061/dryad.h18931zns
 
Description Collaboration with HHMI Janelia Research Campus 
Organisation Howard Hughes Medical Institute
Department Janelia Research Campus
Country United States 
Sector Academic/University 
PI Contribution We have collaborated with Dr Igor Siwanowicz on wing neuroanatomy imaging. Based on our imaging data, we have directed the coordinated tracing of the wing neurons for the dragonfly.
Collaborator Contribution Dr Igor Siwanowicz has provided his expertise in sample preparation and imaging dragonfly wings.
Impact The collaboration has yielded the best neuroanatomy data possible for dragonfly wing mechanosensory neurons. We are continuing this collaboration to dissect the mechanotransduction mechanisms underlying some of these wing sensors.
Start Year 2018
 
Description Collaboration with The Natural History Museum 
Organisation Natural History Museum
Country United Kingdom 
Sector Public 
PI Contribution The NHM has kindly allowed us to image and characterise dragonfly wing specimens for comparative studies. We have also exchanged techniques in microscopy and micro-CT.
Collaborator Contribution Our partnership with the NHM has led to many fruitful discussions about the life history and morphological diversity of insect wings. We are currently planning further collaborations on projects in comparative anatomy.
Impact The collaboration is still in its infancy. We will report more substantial outputs in the next reporting round.
Start Year 2018
 
Description Exhibition Rd Festival 2019 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact We hosted a stand as part of the Exhibition Rd Festival and presented our research on Fly-by-Feel to the visitors. The exhibition included interactive displays, insect specimens, video presentations, as well as information handouts.
Year(s) Of Engagement Activity 2019
 
Description Meet a Scientist 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Schools
Results and Impact Presentation and Q&A session for kids from disadvantaged backgrounds, together with NOVA, a not-for-profit organization.
Year(s) Of Engagement Activity 2021
 
Description Natural History Museum European Research Night 2018 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact In collaboration with the NHM, we exhibited at the European Research Night 2018. In addition to a set of giant interactive dragonfly wings, we created a VR experience of dragonfly hunting. This was hugely popular among children and gave us an opportunity to explain the challenge of flying at high speed while tracking small targets.
Year(s) Of Engagement Activity 2018
URL http://www.nhm.ac.uk/discover/news/2018/june/dragonfly-wings-could-inspire-new-aeroplane-flight-cont...
 
Description The Royal Society Summer Science Exhibition: Nurturing Nature's Innovations 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact We successfully competed for a stand at The Royal Society Summer Science Exhibition 2018. This was a 7-day, full-time science outreach event in which we engaged with the exhibition's visitors (school groups, families, the general public, Fellows of the Royal Society, media, government representatives, and students). We designed and manufactured a set of 1.5 m-long interactive dragonfly wing models to showcase the anatomy and distribution of the mechanosensors on insect wings.
Year(s) Of Engagement Activity 2018
URL https://royalsociety.org/science-events-and-lectures/2018/summer-science-exhibition/exhibits/nurturi...