Towards perceptive and self-aware robots

Lead Research Organisation: University of Oxford
Department Name: Engineering Science

Abstract

Robots are typically used in manufacturing and assembly processes to perform tasks such as welding, painting, assembly, pick-and-place and packing, to name a few. These tasks are usually executed in a confined space separated from the human workspace. However, researchers are pushing to make robots more collaborative and to bring them closer to humans, supporting them in different activities. This transition from industry to the human workspace is not easy: the human environment can be unstructured and cluttered, and robots are expected to perform several tasks without harming humans or the environment itself.
To safely execute operations in such conditions, robots must be equipped with proper sensing capabilities that allow them to detect the presence of objects or humans and track their motion to anticipate possible impacts. Furthermore, the robot should be able to explore the surrounding space by interacting with it, and to manipulate and recognise objects. In this case, it becomes paramount to properly control the forces arising during contact.
These two aspects are usually treated separately in the state of the art. Indeed, most existing work proposes to control robots to avoid collisions using cameras or LiDAR sensors [1-3]. However, impacts cannot always be avoided, and robots may be required to handle them properly to interact safely with the environment and with humans. Recent work has shown that contact with the environment can be measured with tactile sensors and exploited to perform tasks in unstructured spaces [4-6] or to identify or handle objects [7-11]. Although these studies propose methods to apply controlled interactions, they do not consider the problem of approaching objects in a controlled manner to avoid high impact forces.
In this respect, it would be of interest to fuse information provided by diverse sensing modalities, compensating for the limitations of one form of feedback with another. As an example, proximity and tactile feedback could be exploited together to predict a possible collision in advance and to control the robot so as to measure and minimise the resulting impact force. Similarly, proximity information can be used while exploring an object with the sense of touch, allowing the robot to approach the surface of interest appropriately.
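As an illustration of how proximity and tactile feedback might be combined, the following sketch slows the approach as the measured distance shrinks and switches to simple force regulation once contact is sensed. All function names, thresholds and gains here are illustrative assumptions, not part of the project described above.

```python
# Hypothetical approach controller fusing ToF proximity and tactile feedback.
# Before contact: velocity is scaled down linearly inside a slow-down radius.
# After contact: a proportional law regulates force around a target value.
# All parameter values are illustrative assumptions.

def approach_command(tof_distance_m, tactile_force_n,
                     v_max=0.10, slow_radius_m=0.05,
                     f_target_n=1.0, kf=0.02):
    """Return a velocity command (m/s) toward the surface."""
    if tactile_force_n > 0.0:
        # In contact: drive the measured force toward f_target_n.
        return kf * (f_target_n - tactile_force_n)
    # Not yet in contact: full speed far away, proportionally slower nearby.
    scale = min(1.0, tof_distance_m / slow_radius_m)
    return v_max * scale
```

Far from the surface the command saturates at `v_max`; inside the slow-down radius it decays with distance, so the kinetic energy at first contact, and hence the impact force, is reduced.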
This research project will focus on the development of methods that take advantage of multimodal feedback to properly control the robot when executing tasks in unstructured environments. Multimodal sensing will be exploited to perceive the surroundings and to detect contacts occurring on the robot body. In particular, the use of Time of Flight (ToF) and tactile sensors will be considered. The two sensing modalities will be combined to provide the robot with awareness of its surrounding area. The idea is to build an augmented model of the environment in which ToF and tactile sensors are used to retrieve the shape of objects and their mechanical properties. This model can be refined and updated over time by controlling the robot to perform movements aimed at exploring new areas, distinguishing between fixed and movable objects, and measuring their relative speed.
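One minimal way such an augmented model could be represented is sketched below: a voxel map whose cells fuse geometric occupancy from ToF readings with a stiffness estimate refined by tactile probing. The data structure and update rules are assumptions for illustration, not the project's actual model.

```python
# Illustrative augmented environment model (assumed representation):
# each voxel stores occupancy (from ToF) plus a stiffness estimate
# (from tactile probing) that is refined over repeated observations.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Cell:
    occupied: bool = False
    stiffness_n_per_m: Optional[float] = None  # unknown until touched
    observations: int = 0

@dataclass
class AugmentedMap:
    resolution_m: float = 0.01
    cells: dict = field(default_factory=dict)

    def _key(self, xyz):
        # Quantise a 3D point to voxel indices.
        return tuple(int(c / self.resolution_m) for c in xyz)

    def update_tof(self, xyz):
        # A ToF return marks the voxel as occupied (geometry only).
        cell = self.cells.setdefault(self._key(xyz), Cell())
        cell.occupied = True
        cell.observations += 1

    def update_tactile(self, xyz, stiffness_n_per_m):
        # A tactile probe confirms occupancy and refines stiffness
        # by averaging with the previous estimate.
        cell = self.cells.setdefault(self._key(xyz), Cell())
        cell.occupied = True
        if cell.stiffness_n_per_m is None:
            cell.stiffness_n_per_m = stiffness_n_per_m
        else:
            cell.stiffness_n_per_m = 0.5 * (cell.stiffness_n_per_m
                                            + stiffness_n_per_m)
        cell.observations += 1
```

Re-visiting a voxel with either modality increments its observation count, which is one simple way to track how well-explored each region is and to direct exploratory movements toward sparsely observed areas.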
Machine learning techniques can then be leveraged to process and interpret the events that change the state of the model. For example, the robot can discriminate between a potentially dangerous collision and an intended interaction with a human, and trigger an appropriate reaction in turn. More generally, this would allow for methods in which multimodal feedback is interpreted and the robot is controlled to perform different actions accordingly.
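A toy version of such event interpretation is sketched below: a nearest-centroid classifier over two hand-picked contact features (peak force and contact duration) separating hard, brief collisions from gentle, sustained human touches. The features, labels and training values are illustrative assumptions only.

```python
# Toy contact-event classifier (assumed features and data): label an event
# "collision" or "interaction" by distance to per-class feature centroids.

import math

# Illustrative labelled examples: (peak_force_N, duration_s) -> label.
TRAINING = [
    ((25.0, 0.05), "collision"),    # hard, brief impact
    ((30.0, 0.08), "collision"),
    ((3.0, 1.2), "interaction"),    # gentle, sustained touch
    ((2.0, 0.9), "interaction"),
]

def centroids(examples):
    """Compute the mean feature vector of each class."""
    sums = {}
    for (force, duration), label in examples:
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += force
        s[1] += duration
        s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def classify(peak_force, duration, cents):
    """Assign the label of the nearest class centroid."""
    return min(cents,
               key=lambda lab: math.dist((peak_force, duration), cents[lab]))
```

In practice richer features (contact location on the robot body, proximity history before impact) and a learned model would replace the hand-built centroids, but the structure is the same: features extracted from the multimodal event, then a classification that triggers the corresponding reaction.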
This project is well aligned with the "EPSRC artificial intelligence and robotics" theme, as it aims to develop new methodologies that will enable robots to perceive the environment and the way it evolves, and to behave accordingly.

Publications


Studentship Projects

Project Reference   Relationship   Related To      Start        End          Student Name
EP/W524311/1                                       30/09/2022   29/09/2028
2742381             Studentship    EP/W524311/1    30/09/2022   30/03/2026   Giammarco Caroleo