Fly-by-Feel: the neural representation of aeroelasticity.

Lead Research Organisation: Royal Veterinary College
Department Name: Comparative Biomedical Sciences (CBS)

Abstract

Abstracts are not currently available in GtR for all funded research. This is normally because the abstract was not required at the time of proposal submission, but may be because it included sensitive information such as personal details.

Technical Summary

Mechanosensation is ubiquitous and central to adaptive motion control. However, the sensory input signals elicited by complex forces (e.g. friction, acoustics, and fluid forces) remain poorly understood. During flight, insects are masters at monitoring the instantaneous loads on their wings, exemplifying the encoding of dynamic complexity in a tractable system. In this work, we will uncover how insects extract relevant aerodynamic features via an intricately organized suite of mechanosensors on the wings.

Our work is made possible by three recent technical breakthroughs. First, by miniaturizing neural amplifiers and developing neural implants for large insects, we are now able to record from sensory neurons in free-flying animals. This allows us to eavesdrop on the mechanosensory signals from the wings during flight. Second, we have developed a high-precision motion-capture protocol to digitize flying insect kinematics in real time, allowing us to trigger high-speed cameras that capture wing motion and deformation. Third, we have coupled computational fluid dynamics and structural dynamics models that enable us to reconstruct and predict the aerodynamic and mechanical stimuli the wing sensors measure (via fluid-solid coupling). Combining these three technologies, we can reveal the neural representation of aeroelasticity for the first time.
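The fluid-solid coupling described above typically follows a partitioned scheme, in which a fluid solver and a structural solver exchange interface loads and deflections until they agree. The sketch below is purely illustrative and not the project's actual solver: both "solvers" are hypothetical one-degree-of-freedom stand-ins (a linear aerodynamic load model and a linear spring), iterated with under-relaxation in the standard fixed-point fashion.

```python
# Minimal partitioned fluid-structure coupling sketch (illustrative only).
# A 1-DOF surrogate problem: the aerodynamic load on a spring-mounted plate
# depends on its deflection, and the structure deflects in response to that
# load. The two toy "solvers" exchange interface data iteratively, with
# under-relaxation, until the interface state converges.

def fluid_load(deflection, q=50.0, dcl_dx=0.8):
    """Toy aerodynamic solver (stand-in for CFD): load grows with deflection."""
    return q * dcl_dx * deflection + 10.0  # 10 N baseline load at zero deflection

def structural_deflection(load, stiffness=500.0):
    """Toy structural solver (stand-in for FEA): linear spring response."""
    return load / stiffness

def couple(relax=0.5, tol=1e-10, max_iter=200):
    """Fixed-point coupling iteration with under-relaxation."""
    x = 0.0  # initial guess for the interface deflection
    for _ in range(max_iter):
        f = fluid_load(x)                 # fluid step: load from current deflection
        x_new = structural_deflection(f)  # structural step: deflection from load
        if abs(x_new - x) < tol:
            return x_new                  # interface state has converged
        x = x + relax * (x_new - x)       # under-relaxed update for stability
    raise RuntimeError("coupling did not converge")

deflection = couple()
```

For these linear stand-ins the converged deflection can be checked analytically (x = 10 / (500 - 40) m); in a real aeroelastic pipeline each step would instead call full CFD and finite element solvers over a shared wing-surface mesh.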

Recording neural activity in freely moving animals offers direct answers to how neural networks compute sensory information to generate actions. Neural telemetry systems enable this approach, yet recording from a moving subject and reconstructing the sensory stimuli during free behaviour remain formidable tasks. In this work, we will demonstrate how to record wirelessly from a fast-moving appendage on a moving subject, and how to reconstruct complex sensory stimuli with computational models. This breaks new ground and will substantially advance the field of wireless neurophysiology.

Planned Impact

Fluid sensing is the least understood of all mechanosensations, yet nature achieves it with robustness and finesse. Engineered systems, from wind turbines to cardiovascular implants, are designed around pre-calculated (or pre-calibrated) steady-state conditions that the system typically experiences, but there is no easy way to extrapolate to unsteady fluid states while maintaining effective, optimal function. Even with modern computational tools, simulating and understanding complex fluid dynamics takes considerable time and effort. This project aims to understand how highly dynamic flying insects sense and encode aeroelasticity in flight. The results will reveal how a relatively simple nervous system extracts relevant aerodynamic features through a small yet strategically patterned sensor array. Our findings will have direct impact on engineering designs and control systems for any application involving fluid manipulation.

With the development of smart cameras from the late 1980s, vision research in neuroscience and computational biology flourished, revealing how animals perceive the world. Today, sophisticated machine vision systems are driving a revolution in artificial intelligence (AI) and automated industries (e.g. automotive, food processing, farming, pharmaceutical). Mechanosensation is poised to provide similar insights for systems neuroscience and bio-inspired engineering over the next few decades. While AI robotic systems can provide timely and relevant information, they are limited in physical performance by the challenge of adaptive motion control: optical systems are relatively slow and computationally costly. To advance beyond this barrier, AI systems must be able not only to see, but also to feel. With the rapid emergence of civilian drones, self-driving cars and medical robots, that sense of touch might be all we need to make such systems safe and reliable. In 2016 the Ministry of Defence announced £800m for the Defence Innovation Initiative, including a focus on bio-inspired innovation to tackle current security challenges. The expected outputs from this research project are crucial to delivering impact in this sector.

Our outputs will have immediate impact in the field of aerial robotics. Our findings will promote the development of bio-inspired sensing mechanisms on the wings, body, or rotors of next-generation aerial systems, where aeroelastic information will become the principal input for flight controllers. Instantaneous loading information will help aerial systems achieve greater control, perform more agile flight manoeuvres, and harvest energy from the atmosphere for improved economy. Similarly, incorporating aeroelastic sensory feedback will allow manned aircraft to respond more rapidly to turbulence or the onset of stall. Beyond aviation, effective fluid sensing can improve the design and control of wind turbines, hydroelectric turbines, and even combustion engines. In biomedical applications, cardiovascular implants and endoscopy would benefit from enhanced state information. Our work will show how to capture unsteady aeroelastic information in real time, which is a new discovery in biological mechanosensing and will underpin a new domain for technological innovation. Large-scale stakeholder sectors include (but are not limited to) medical instrumentation, aerial robotics, soft robotics, and renewable energy (e.g. wind and tidal turbines), each of which contributes to human health and well-being. Success in these fields will have a positive economic impact and maintain the UK at the forefront of technological innovation.

Publications

 
Title Immersive experience at AMNH 
Description An immersive experience has been created for visitors to the new Gilder Centre wing of the American Museum of Natural History in New York City. Experimental data from this project assisted with the artistic design of the new exhibit, scheduled to open in Spring 2023. 
Type Of Art Artistic/Creative Exhibition 
Year Produced 2023 
Impact The exhibit will be enjoyed by many visitors each year. Statistical data will follow when available. 
 
Description We have shown that dragonfly wings are innervated by an extensive collection of sensory neurons.
We have described how mechanosensors are spread across the whole span of the wing with consistent patterns between species.
We have measured axon diameters of wing sensory neurons and found that they are scaled to facilitate synchronous firing, which has important implications for the operation of the flight control system.
From detailed micro-CT data, we built anatomically accurate models of dragonfly wings for structural and fluid dynamics modelling.
Structural mechanical modelling reveals wing strain fields that can be matched to the sensor distribution.
Exploitation Route Our discoveries lay a foundation for novel bio-inspired flight controllers based on strain and flow sensing on wings.
Sectors Aerospace, Defence and Marine; Transport

 
Description Large scale public engagement via contribution to museum exhibit (AMNH)
First Year Of Impact 2023
Sector Creative Economy; Education; Leisure Activities, including Sports, Recreation and Tourism; Culture, Heritage, Museums and Collections
Impact Types Cultural, Societal, Economic

 
Description Bio-informed aerodynamic sensing for state estimation in agile UAS
Amount £133,263 (GBP)
Organisation Defence Science & Technology Laboratory (DSTL) 
Sector Public
Country United Kingdom
Start 11/2020 
End 10/2024
 
Title Systematic characterization of wing mechanosensors that monitor airflow and wing deformations 
Description Animal wings deform during flight in ways that can enhance lift, facilitate flight control, and mitigate damage. Monitoring the structural and aerodynamic state of the wing is challenging because deformations are passive, and the flow fields are unsteady; it requires distributed mechanosensors that respond to local airflow and strain on the wing. Without a complete map of the sensor arrays, it is impossible to model control strategies underpinned by them. Here, we present the first systematic characterization of mechanosensors on the dragonfly's wings: morphology, distribution, and wiring. By combining a cross-species survey of sensor distribution with quantitative neuroanatomy and a high-fidelity finite element analysis, we show that the mechanosensors are well-placed to perceive features of the wing dynamics relevant to flight. This work describes the wing sensory apparatus in its entirety and advances our understanding of the sensorimotor loop that facilitates exquisite flight control in animals with highly deformable wings. 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL http://datadryad.org/stash/dataset/doi:10.5061/dryad.h18931zns
 
Description The Royal Society Summer Science Exhibition: Nurturing Nature's Innovations 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact We competed for and were awarded a stand at The Royal Society Summer Science Exhibition 2018. This was a 7-day, full-time science outreach event in which we engaged with visitors to the exhibition (i.e. school groups, families, the general public, FRS, media, government representatives, and students). We designed and manufactured a set of 1.5 m-long interactive dragonfly wing models to showcase the anatomy and distribution of the mechanosensors on insect wings.
Year(s) Of Engagement Activity 2018
URL https://royalsociety.org/science-events-and-lectures/2018/summer-science-exhibition/exhibits/nurturi...