Long-Term Affordable Navigation in Unseen Agricultural Environments
Lead Research Organisation:
University of Lincoln
Department Name: School of Computer Science
Abstract
Navigation in agricultural environments is an essential capability for a robot performing tasks such as crop monitoring, spraying, fertilising and seed sowing, where repeatability and farmer safety must be ensured. The agricultural domain imposes extra challenges: it is dynamic (crop growth), lacks texture (causing perceptual aliasing), and the scene is subject to varying illumination. Additionally, to achieve accurate localisation, it is common for the robot to depend on constant communication with GNSS and RTK receivers.
Currently, most agricultural navigation systems rely on satellite localisation using GNSS or GNSS-RTK systems. However, reliable high-end GNSS sensors are expensive, and the satellite signal may not always be available (e.g., when the robot moves under tree canopies or suffers interference from other RF sources). Visual-based navigation aims to provide affordable and accurate navigation in GNSS-RTK-denied environments.
Additionally, recent advancements in deep learning have enabled affordable learning of traversable paths in a contrastive manner, which can provide generalisation in unseen environments. The generalisation of these networks is two-fold. Firstly, farms with repetitive rows/trellises of crops can facilitate successful traversal of multiple rows without mapping the whole field. Secondly, the model can assimilate seasonal and vegetation changes, providing robust life-long navigation.
There are many successful implementations of deep learning-based systems, for both outdoor and indoor environments, that are robust to seasonal and small gradual changes in the environment. However, none has yet been implemented in an agricultural environment, and none has been demonstrated on an unseen path. Additionally, much work has been done on long-term autonomy to address the appearance problem caused by seasonal and illumination variation, but the constant plant growth and the perceptual aliasing present in agricultural scenes have not yet been explored.
The proposed project aims to investigate the conditions under which an affordable visual-based navigation system can perform autonomous long-term navigation in GNSS-RTK-denied agricultural fields, and how well it can generalise to unseen environments. Specifically, the proposed research aims to examine the challenges introduced by the agricultural domain. The main objectives are:
- Implement a self-supervised image registration pipeline based on affordable sensors, robust to illumination and seasonal changes, to enable long-term navigation.
- Examine how well it generalises to unseen rows/trellises, controlling for parameters such as the field, the crop, and the length of the path.
Learning general visual representations of crop rows/trellises, robust to seasonal and vegetation changes, enables farmers to utilise cost-effective robots able to navigate unmapped terrain without relying on expensive sensors. Additionally, these representations can ensure life-long navigation, eliminating the need to update navigation actions based on the current state of the farm.
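The contrastive learning the abstract refers to can be illustrated with an InfoNCE-style loss: embeddings of the same place under different appearances (e.g. seasons or illumination) are pulled together, while other places in the batch act as negatives. The sketch below is illustrative only, with NumPy standing in for a deep-learning framework; the function name and temperature value are assumptions, not part of the proposed pipeline.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Contrastive (InfoNCE) loss over a batch of paired embeddings.

    anchors[i] and positives[i] embed the same place under different
    appearance conditions; positives[j] (j != i) serve as negatives.
    """
    # L2-normalise so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature           # (N, N) similarity matrix
    # Cross-entropy where the correct class is the diagonal entry
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(a))
    return -log_prob[idx, idx].mean()
```

Minimising this loss yields embeddings in which the same crop row matches across seasons, which is what would allow place recognition without a dense metric map of the whole field.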
Planned Impact
The proposed CDT provides a unique vision of advanced RAS technologies embedded throughout the food supply chain, training the next generation of specialists and leaders in agri-food robotics and providing the underpinning research for the next generation of food production systems. These systems in turn will support the sustainable intensification of food production, the national agri-food industry, the environment, food quality and health.
RAS technologies are transforming global industries, creating new business opportunities and driving productivity across multiple sectors. The Agri-Food sector is the largest manufacturing sector of the UK and global economy. The UK food chain has a GVA of £108bn and employs 3.6m people. It is fundamentally challenged by global population growth, demographic changes, political pressures affecting migration and environmental impacts. In addition, agriculture has the lowest productivity of all industrial sectors (ONS, 2017). However, many RAS technologies are in their infancy - developing them within the agri-food sector will deliver impact but also provide a challenging environment that will significantly push the state of the art in the underpinning RAS science. Although the opportunity for RAS is widely acknowledged, a shortage of trained engineers and specialists has limited the delivery of impact. This CDT directly addresses that need and will produce the largest global cohort of RAS specialists in Agri-Food.
The impacts are multiple and include:
1) Impact on RAS technology. The Agri-Food sector provides an ideal test bed to develop multiple technologies that will have application in many industrial sectors and research domains. These include new approaches to autonomy and navigation in field environments; complex picking, grasping and manipulation; and novel applications of machine learning and AI in critical and essential sectors of the world economy.
2) Economic Impact. In the UK alone, the Made Smarter Review (2017) estimates that automation and RAS will create £183bn of GVA over the next decade, £58bn of which comes from increased technology exports and reshoring of manufacturing. Expected impacts within Agri-Food are demonstrated by the £3.0M of industry support, including the world's largest agricultural engineering company (John Deere), the multinational Syngenta, one of the world's largest robotics manufacturers (ABB), the UK's largest farming company owned by James Dyson (one of the largest private investors in robotics), the UK's largest salads and fruit producer, plus multiple SME RAS companies. These partners recognise the potential and need for RAS (see NFU and IAgrE Letters of Support).
3) Societal impact. Following the EU referendum, there is significant uncertainty over whether seasonal labour employed in the sector will remain available, while the demographics of an ageing population further limit the supply of manual labour. We see robotic automation as a means of performing onerous and difficult jobs in adverse environments, while advancing the UK skills base, enabling human jobs to move up the value chain and attracting skilled workers and graduates to Agri-Food.
4) Diversity impact. Gender under-representation is also a concern across the computer science, engineering and technology sectors, with only 15% of undergraduates being female. Through engagement with the EPSRC ASPIRE (Advanced Strategic Platform for Inclusive Research Environments) programme, AgriFoRwArdS will become an exemplar CDT with an EDI impact framework that is transferable to other CDTs.
5) Environmental Impact. The Agri-Food sector accounts for 13% of UK carbon emissions and 70% of fresh water use, while diffuse pollution from fertilisers and pesticides creates environmental damage. RAS technology, such as robotic weeders and field robots with advanced sensors, will enable a paradigm shift in precision agriculture that will sustainably intensify production while minimising environmental impacts.
Organisations
People
| Name | ORCID iD |
|---|---|
| Nikolaos Tsagkopoulos (Student) | |
Studentship Projects
| Project Reference | Relationship | Related To | Start | End | Student Name |
|---|---|---|---|---|---|
| EP/S023917/1 | | | 31/03/2019 | 13/10/2031 | |
| 2601720 | Studentship | EP/S023917/1 | 30/09/2021 | 21/03/2023 | Nikolaos Tsagkopoulos |