Learning robot navigation and manipulation from demonstrations

Lead Research Organisation: University of Lincoln
Department Name: School of Computer Science

Abstract

Objectives:
- To research and implement a Learning from demonstrations method of navigation for mobile robots to perform complex navigation tasks independently of their domain.
- To research and implement a Learning from demonstrations method for mobile manipulator robots to perform complex manipulation tasks as well as complex navigation tasks, independently of the robot's domain.
- To implement autonomy on a mobile manipulator machine, which is usually teleoperated by an operator, using the researched Learning from demonstrations methods.

Humans teleoperate machines to perform mobile navigation and manipulation tasks. Current autonomous system approaches are domain-specific, so human operators remain in charge of the movement of their robots.

This research studies Learning from demonstrations (LfD) so that the same teleoperated machines can be transformed to perform autonomously.

The proposed research involves taking quantitative control data from human demonstrations. While a robot learns a task from a demonstration, it must separate useful task information from the noise in the control data. The research extends beyond merely replaying a demonstration to adapting the execution of the task according to variations in the robot's environment.
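The pipeline described above can be illustrated with a minimal behavioural-cloning sketch. Everything here is hypothetical and for illustration only (the proportional "expert", the linear policy, and the kinematic update are assumptions, not the project's actual method): noisy state-action pairs are recorded from demonstrations, and regression suppresses the control noise to recover an executable policy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical teleoperation "expert": steer proportionally towards a goal.
goal = np.array([2.0, 1.0])

def expert_action(state):
    return 0.5 * (goal - state)

# Record several demonstrations as (state, action) pairs with control noise.
states, actions = [], []
for _ in range(20):                       # 20 demonstrated trajectories
    s = rng.uniform(-1, 1, size=2)        # random start pose
    for _ in range(30):                   # 30 timesteps each
        a = expert_action(s) + rng.normal(0, 0.05, size=2)  # noisy command
        states.append(s.copy())
        actions.append(a)
        s = s + a                         # simple kinematic update

X = np.hstack([np.array(states), np.ones((len(states), 1))])  # bias column
Y = np.array(actions)

# Least-squares regression averages out the noise across demonstrations,
# recovering the underlying policy rather than replaying one noisy trace.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The learned policy can then drive the robot from an unseen start pose.
s = np.array([-0.8, 0.9])
for _ in range(50):
    s = s + np.hstack([s, 1.0]) @ W
print(np.round(s, 2))  # ends near the demonstrated goal
```

The regression step is what "deciphers useful task information from the noise": with many noisy samples, the fitted parameters converge to the demonstrator's intent even though no single recorded action is clean.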

LfD methods for mobile robots and mobile manipulators already exist; however, these methods do not generalise the task and depend on the robot's system dynamics being known. They also rely on expensive sensors such as LIDAR rather than cameras. The LfD methods I will research, and implement on mobile robots and mobile manipulators, are inspired by the work on manipulator robots conducted by Dr. Amir Ghalamzan. However, remapping the existing models for manipulator robots onto mobile robots and mobile manipulators will not be enough to make these robots fully autonomous.

I will look further into state-of-the-art deep learning methods so that the robots do not merely mimic or imitate the demonstrated task, but can generate ways of emulating demonstrations and include those emulated demonstrations when learning the task. The aim is to improve the execution of the task and to generalise so that learning is independent of the robot's domain.
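One simple way to "generate ways of emulating demonstrations" is to synthesise extra training trajectories from a single recorded one. The sketch below is an illustrative assumption, not the project's deep-learning method: it affinely retargets a demonstrated trajectory onto new start/goal pairs, preserving its shape, so the learner sees many emulated demonstrations beyond the one actually given.

```python
import numpy as np

# One recorded demonstration: a curved trajectory from start to goal.
base = np.linspace([0.0, 0.0], [1.0, 1.0], num=50)
bump = 0.1 * np.sin(np.linspace(0.0, np.pi, 50))[:, None] * np.array([1.0, -1.0])
demo = base + bump

def retarget(traj, new_start, new_goal):
    """Affinely warp a demonstrated trajectory onto a new start/goal pair,
    preserving its shape (a common trick for augmenting LfD datasets)."""
    old_start, old_goal = traj[0], traj[-1]
    # Per-axis scale mapping the old endpoints onto the new ones.
    scale = (new_goal - new_start) / (old_goal - old_start)
    return new_start + (traj - old_start) * scale

# Emulate demonstrations the robot never actually saw, for use as
# additional training data when learning the task.
augmented = [retarget(demo,
                      np.random.uniform(-1, 0, 2),
                      np.random.uniform(1, 2, 2))
             for _ in range(100)]
```

Richer generative models (for example, learned trajectory distributions) would replace this affine warp in practice; the point of the sketch is only that emulated demonstrations let the learned task generalise beyond the exact conditions of the original demonstration.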

The outcome of the research is a computationally efficient and effective method of implementing autonomy on mobile machines. The human-reprogrammable nature of LfD will increase the level of robot adaptation, as robot experts will not be required to continually reprogram the robots.

Planned Impact

The proposed CDT provides a unique vision of advanced RAS technologies embedded throughout the food supply chain, training the next generation of specialists and leaders in agri-food robotics and providing the underpinning research for the next generation of food production systems. These systems in turn will support the sustainable intensification of food production, the national agri-food industry, the environment, food quality and health.

RAS technologies are transforming global industries, creating new business opportunities and driving productivity across multiple sectors. The Agri-Food sector is the largest manufacturing sector of the UK and global economy. The UK food chain has a GVA of £108bn and employs 3.6m people. It is fundamentally challenged by global population growth, demographic changes, political pressures affecting migration and environmental impacts. In addition, agriculture has the lowest productivity of all industrial sectors (ONS, 2017). However, many RAS technologies are in their infancy - developing them within the agri-food sector will deliver impact but also provide a challenging environment that will significantly push the state of the art in the underpinning RAS science. Although the opportunity for RAS is widely acknowledged, a shortage of trained engineers and specialists has limited the delivery of impact. This CDT directly addresses that need and will produce the largest global cohort of RAS specialists in Agri-Food.

The impacts are multiple and include:

1) Impact on RAS technology. The Agri-Food sector provides an ideal test bed to develop multiple technologies that will have application in many industrial sectors and research domains. These include new approaches to autonomy and navigation in field environments; complex picking, grasping and manipulation; and novel applications of machine learning and AI in critical and essential sectors of the world economy.

2) Economic Impact. In the UK alone the Made Smarter Review (2017) estimates that automation and RAS will create £183bn of GVA over the next decade, £58bn of which comes from increased technology exports and reshoring of manufacturing. Expected impacts within Agri-Food are demonstrated by the £3.0M of industry support, including the world's largest agricultural engineering company (John Deere), the multinational Syngenta, one of the world's largest robotics manufacturers (ABB), the UK's largest farming company owned by James Dyson (one of the largest private investors in robotics), the UK's largest salads and fruit producer, plus multiple SME RAS companies. These partners recognise the potential and need for RAS (see NFU and IAgrE Letters of Support).

3) Societal impact. Following the EU referendum, there is significant uncertainty that seasonal labour employed in the sector will be available going forwards, while the demographics of an aging population further limit the supply of manual labour. We see robotic automation as a means of performing onerous and difficult jobs in adverse environments, while advancing the UK skills base, enabling human jobs to move up the value chain and attracting skilled workers and graduates to Agri-Food.

4) Diversity impact. Gender under-representation is also a concern across the computer science, engineering and technology sectors, with only 15% of undergraduates being female. Through engagement with the EPSRC ASPIRE (Advanced Strategic Platform for Inclusive Research Environments) programme, AgriFoRwArdS will become an exemplar CDT with an EDI impact framework that is transferable to other CDTs.

5) Environmental Impact. The Agri-food sector accounts for 13% of UK carbon emissions and 70% of fresh water use, while diffuse pollution from fertilisers and pesticides creates environmental damage. RAS technology, such as robotic weeders and field robots with advanced sensors, will enable a paradigm shift in precision agriculture that will sustainably intensify production while minimising environmental impacts.

Publications


Studentship Projects

Project Reference: EP/S023917/1 (01/04/2019 to 30/09/2031)
Studentship: 2601734, related to EP/S023917/1, 01/10/2021 to 30/09/2025, Samuel Carter