Object Comprehension via Robotic In-Hand Manipulation & Tactile Sensing

Lead Research Organisation: Imperial College London
Department Name: Electrical and Electronic Engineering

Abstract

Whilst humans largely rely on vision to recognise objects, certain object properties, such as roughness, elasticity, and weight distribution, can only be reliably determined through touch. When manipulating objects, we often rely on a combination of vision and touch: imagine trying to throw away a banana found in your cupboard, only to realise upon touching it that its insides have long since turned to liquid; you would then perhaps carefully pick it up by the stem instead. Sometimes the shape of an object is determined solely through touch: imagine searching for a key in a trouser pocket full of other random objects. Not only can we quickly locate and grab the key, we can also swiftly re-align it within our hand before pulling it out of the pocket. Haptic perception helps us comprehend the mechanical properties of objects, which we test against the assumptions made through vision and memory, and it ultimately plays an essential role in how we decide to manipulate objects.

This PhD project focuses on object comprehension through robotic in-hand manipulation (IHM) combined with tactile sensing. By 'object comprehension', we mean a process similar to 3D object reconstruction (reconstructing the shape and appearance of objects from visual images), but with haptic data as additional (or sole) inputs, and a spatial map of mechanical properties as an additional output.

An initial goal is to develop a custom two-finger, variable-width-palm robotic manipulation platform that can perform IHM while recording tactile data from sensors mounted along the finger lengths. The tactile data and the internal states of the robotic hand will be used to construct a model of the object's shape and mechanical properties. For this first part of the PhD, research efforts will focus on developing appropriate control algorithms to enable efficient data collection and reconstruction with this custom gripper.
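One way such an object model could be organised is as a growing collection of contact samples, each pairing a contact location with locally estimated mechanical properties. The following is a minimal illustrative sketch, not the project's actual implementation; all class and field names are assumptions introduced here for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ContactSample:
    """One tactile measurement (names and units are illustrative assumptions)."""
    position: tuple      # (x, y, z) contact point in the hand frame, metres
    normal_force: float  # contact force from the tactile array, newtons
    stiffness: float     # local stiffness estimate from force vs. indentation, N/m

@dataclass
class ObjectModel:
    """Accumulates contact samples into a sparse shape-and-property map."""
    samples: list = field(default_factory=list)

    def add(self, sample: ContactSample) -> None:
        self.samples.append(sample)

    def mean_stiffness(self) -> float:
        # A crude global summary; a real system would map stiffness spatially.
        return sum(s.stiffness for s in self.samples) / len(self.samples)
```

In this sketch the shape is represented implicitly by the cloud of contact positions, while per-sample stiffness gives the spatial property map the abstract describes.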

In the second part of the PhD, a learning-based approach to data acquisition will be pursued, with haptic 'exploratory procedures' chosen by analogy with those recognised in human haptic perception. A robust and adaptive control system will be implemented and trained through reinforcement learning, enabling the hand/gripper to adaptively select the optimal exploratory action while manipulating objects through IHM, with the goals of minimising exploration time and maximising exploration accuracy. Facilitating more complex exploratory procedures will likely require developing a more complex gripper with additional fingers and/or articulation. This second part will explore the feasibility and potential of combining IHM and haptic exploration for object comprehension and recognition, and will set a benchmark for future studies in this area.
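The exploratory-action selection described above could, for instance, be framed as a reinforcement learning problem in which each exploratory procedure is an action and the reward trades off information gained against time spent. The sketch below shows a generic tabular epsilon-greedy scheme under that framing; the action names and reward definition are purely hypothetical and do not come from the project.

```python
import random

# Hypothetical discrete exploratory actions (illustrative names only)
ACTIONS = ["squeeze", "slide", "roll", "regrasp"]

def epsilon_greedy(q_values: dict, epsilon: float = 0.1) -> str:
    """With probability epsilon try a random action, otherwise the best-valued one."""
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_values.get(a, 0.0))

def q_update(q_values: dict, action: str, reward: float, alpha: float = 0.5) -> None:
    """Tabular update: move the action's value toward the observed reward."""
    old = q_values.get(action, 0.0)
    q_values[action] = old + alpha * (reward - old)
```

A real controller would condition the policy on tactile observations and hand state rather than keeping a single action-value table, but the exploit-versus-explore trade-off it must manage is the same.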

The results of this project will include novel automated IHM object exploration and handling techniques that utilise tactile data to perform advanced manipulation tasks, such as the finding-and-aligning-a-key-in-a-pocket example above, and will ultimately lead to safer, more compliant, more efficient, and more versatile robotic IHM.

People

Xin Zhou (Student)

Publications


Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
EP/T51780X/1                                   01/10/2020  30/09/2025
2620864            Studentship   EP/T51780X/1  01/10/2021  31/03/2025  Xin Zhou