Computer vision applications to UAV/drone survey tasks.

Lead Research Organisation: Durham University
Department Name: Engineering and Computing Sciences

Abstract

In recent years, drone technology (also called Unmanned Aerial Vehicles, UAVs) has progressed to the point where airborne photography has become accessible to the wider public. In combination with computer vision and 3D scene mapping, UAVs now offer accessible survey tools to the wider potential end-user community. However, 3D mapping from drone imagery does not yet lend itself easily to widespread use, and some key technical limitations need to be overcome. Currently, 3D scene mapping is performed as an off-line process (post-survey) using UAV platforms that require several flights to cover any significant survey area (due to limited flight duration). Furthermore, the post-processing required to produce the 3D renditions is not transparent to end-users and adds a significant impediment to any attempt at rapid post-flight delivery. This PhD project aims to address these issues.
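To illustrate what the off-line 3D scene mapping referred to above involves, the following is a minimal sketch of the core two-view step that photogrammetry pipelines repeat across a whole flight: feature matching, relative pose estimation and sparse triangulation using OpenCV. The image file names and the camera matrix K are illustrative placeholders, not values from this project.

    # Sketch of one two-view reconstruction step in an off-line mapping pipeline.
    import cv2
    import numpy as np

    K = np.array([[1000.0, 0.0, 960.0],
                  [0.0, 1000.0, 540.0],
                  [0.0, 0.0, 1.0]])  # assumed intrinsics for a 1920x1080 camera

    img1 = cv2.imread("frame_0001.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder frames
    img2 = cv2.imread("frame_0002.jpg", cv2.IMREAD_GRAYSCALE)

    # Detect and match local features between two overlapping views
    orb = cv2.ORB_create(4000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Estimate the essential matrix and recover the relative camera pose (R, t)
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Triangulate inlier correspondences into a sparse 3D point cloud
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    inl = mask.ravel() > 0
    pts4d = cv2.triangulatePoints(P1, P2, pts1[inl].T, pts2[inl].T)
    points3d = (pts4d[:3] / pts4d[3]).T
    print(points3d.shape[0], "sparse points reconstructed")

A full survey pipeline chains many such pairwise reconstructions, followed by bundle adjustment and dense reconstruction, which is why the process is normally run off-line after the flight.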

Publications


Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
EP/N509462/1                                   01/10/2016  30/09/2021
1750094            Studentship   EP/N509462/1  01/10/2016  31/03/2020  Bruna Maciel Pearson
 
Description The aim of this research is to investigate new and efficient ways for intelligent UAVs to perform an end-to-end operation, achieving mapping and autonomous navigation in an unstructured environment in real time. To this end, we have investigated and developed an alternative method for data gathering and processing that allows real-time data labelling. The model produced using our proposed Deep Neural Network is also capable of generalisation across different unstructured domains and is suitable for use by UAVs with differing sensor payload capabilities. Our second contribution towards this goal eliminates the need to follow a trail by providing the UAV with the ability to identify the safest area to navigate, regardless of the current altitude and whether or not a trail is visible in the scene. This new approach increases the navigational control from three to six DoF (Degrees of Freedom) and can be generalised to different seasons or environments. The final contribution of this research is a hybrid network that combines Reinforcement Learning and Online Learning to improve navigation and coverage exploration while simultaneously mapping the environment.
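As a rough illustration of the perception-driven steering idea described above (not the thesis architecture itself), the sketch below shows a small PyTorch CNN that maps a forward-facing camera frame to a discrete steering decision, in the spirit of DNN trail-navigation approaches. The class layout, input size and number of output classes are assumptions for the example only.

    # Illustrative sketch: frame-in, steering-decision-out CNN (assumed sizes).
    import torch
    import torch.nn as nn

    class SteeringCNN(nn.Module):
        def __init__(self, num_classes: int = 3):  # e.g. turn-left / go-straight / turn-right
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(128, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.features(x)
            return self.classifier(torch.flatten(x, 1))

    # Usage: one RGB frame (assumed 101x101 pixels) -> scores over steering decisions
    model = SteeringCNN()
    frame = torch.rand(1, 3, 101, 101)
    logits = model(frame)
    direction = logits.argmax(dim=1)

Extending the control output from a few discrete heading classes to full six-DoF commands is one way such a network can be generalised beyond simple trail following.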
Exploitation Route Search and Rescue (SAR) missions, visual exploration of disaster areas, aerial reconnaissance and surveillance, and assessment of forest structures or riverscapes are challenging activities with common characteristics: an unstructured environment, varying illumination conditions and uneven terrain where the GPS signal may be unreliable. These challenges make navigation and exploration even harder for SAR teams to achieve. Although most field studies have primarily relied on manually-controlled UAVs, the results to date have demonstrated the importance of identifying approaches to automate the coverage search and path-finding operations of a UAV specifically for unstructured environments, which are the environments commonly found during SAR missions. Here the key advantage of a UAV is its ability to cover a larger area faster than any ground team of humans, reducing the vital search time. Through this research project we investigate new and efficient ways for intelligent UAVs to perform an end-to-end operation whereby the perception of unstructured environments, autonomous navigation and mapping are achieved in real time. The outcomes of this funding can be applied in SAR operations for the detection and assessment of victims in disaster areas, the mapping of routes for the transportation of supplies, and forest and crop monitoring.
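For readers unfamiliar with the coverage search operations mentioned above, the following is a deliberately simple sketch of automated coverage: generating boustrophedon ("lawnmower") waypoints over a rectangular search area. The extents and track spacing are placeholders; a real SAR mission plan would additionally account for terrain, no-fly zones and the sensor footprint.

    # Simple illustration of automated coverage search over a rectangle.
    def lawnmower_waypoints(width_m: float, height_m: float, spacing_m: float):
        waypoints = []
        y = 0.0
        left_to_right = True
        while y <= height_m:
            # Alternate sweep direction on each track to avoid dead transits
            row = [(0.0, y), (width_m, y)] if left_to_right else [(width_m, y), (0.0, y)]
            waypoints.extend(row)
            left_to_right = not left_to_right
            y += spacing_m
        return waypoints

    # Example: cover a 100 m x 60 m area with 15 m track spacing
    for x, y in lawnmower_waypoints(100.0, 60.0, 15.0):
        print(f"fly to ({x:5.1f}, {y:5.1f}) m")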
Sectors Aerospace, Defence and Marine; Agriculture, Food and Drink; Environment; Transport; Other

 
Description As a STEM ambassador I have been visiting schools and demonstrating my work to children from a variety of age groups. In addition, I have presented my work at the Durham University Junior seminar, Intel AI DevJam / ICML (Stockholm, SE - July 2018), Durham University Research Day** (Durham, UK - May 2018), Intel Student Ambassador Summit / AI DevCon (California, US - May 2018), The UK-RAS Network Conference on Robotics and Autonomous Systems: robots working for and among us** (Bristol, UK - December 2017) and The British Machine Vision Association (BMVA) Summer School** - University of Lincoln (Lincoln, UK - July 2017). ** best poster prize winner
First Year Of Impact 2017
Sector Digital/Communication/Information Technologies (including Software); Education
Impact Types Cultural; Policy & public services

 
Title Extending Deep Neural Network Trail Navigation for Unmanned Aerial Vehicle Operation within the Forest Canopy - Evaluation [dataset] 
Description Dataset to support and allow reproduction of the findings used in the paper: Extending Deep Neural Network Trail Navigation for Unmanned Aerial Vehicle Operation within the Forest Canopy (B.G. Maciel-Pearson, P. Carbonneau, T.P. Breckon), In Proc. Towards Autonomous Robotic Systems Conference, Springer, 2018. 
Type Of Material Database/Collection of data 
Year Produced 2018 
Provided To Others? Yes  
Impact Not applicable at the moment; it is still too early to identify notable impacts. 
URL https://collections.durham.ac.uk/files/r1st74cq45z#.XGwVl-j7Q2w