Learning to manipulate cloth-like objects

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Informatics

Abstract

Manipulating objects such as cloth and rope is challenging for robots because of the complexity of these objects' dynamics. For instance, most such objects exhibit phenomena including wrinkling, folding, crumpling and buckling, all of which are hard to model exactly and difficult to predict in real time. This also causes perceptual problems: parts of the cloth are often occluded by the cloth itself, making it difficult to observe and represent the current state of the object accurately.

Working in this domain, the core concerns of this research will be the following questions: how should we model the dynamics of such objects, and which representations are tractable for learning and planning? And how should these representations be integrated with planning and control, with a view to the practical implementation of manipulation tasks on robotic systems?

The first question requires us to start by defining appropriate sensing modalities and object representations, including consideration of analytical or numerical simulation-based models and data-driven machine learning models. In this work, we expect to use monocular, stereo and depth cameras as the perceptual modalities, and various structured neural network architectures for interpreting this data.
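As a concrete illustration of the depth-camera modality mentioned above, the sketch below back-projects a depth image into a 3-D point cloud using the standard pinhole camera model; such point clouds are a common input representation for structured neural networks. The intrinsics (fx, fy, cx, cy) are illustrative placeholders for a calibrated camera, not values from the project.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in metres) into a 3-D point cloud
    using the pinhole camera model. fx, fy, cx, cy are the camera
    intrinsics (placeholder values; obtain them from calibration)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Example: a synthetic 4x4 depth image with every pixel at 1 m
cloud = depth_to_point_cloud(np.ones((4, 4)), fx=525.0, fy=525.0, cx=2.0, cy=2.0)
```

In practice the resulting point cloud would be downsampled or voxelised before being fed to a learned state-estimation model.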

The graphics community has extensively studied the simulation and visualisation of cloth-like objects, although many of these implementations are not suited to real-time computation. Furthermore, many such models are driven more by the objective of efficient visualisation than by realistic physical behaviour, in part as a means of coping with computational limitations. We will investigate variations of such models, alongside machine learning methods, both for efficient real-time emulation of the object dynamics and for better calibration of models to observational data, with a view to subsequent real-time control.
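A minimal example of the kind of graphics-style model referred to above is a mass-spring cloth: a grid of particles connected by springs, integrated with semi-implicit Euler. The sketch below is illustrative only; the grid size, stiffness, damping and time step are assumed values, not parameters from the project.

```python
import numpy as np

def simulate_cloth(n=8, steps=200, dt=0.01, k=500.0, damping=0.98):
    """Minimal mass-spring cloth model: an n x n grid of unit-mass
    particles joined by structural springs, integrated with damped
    semi-implicit Euler. The top row is pinned; gravity pulls the
    rest downward. Parameter values are illustrative, not tuned."""
    rest = 1.0 / (n - 1)  # rest length of each structural spring
    xs, ys = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    pos = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)  # flat sheet
    vel = np.zeros_like(pos)
    g = np.array([0.0, 0.0, -9.81])
    pinned = np.zeros((n, n), dtype=bool)
    pinned[0, :] = True  # fix one edge of the cloth
    for _ in range(steps):
        force = np.tile(g, (n, n, 1)).astype(float)
        # structural springs along both grid directions
        for axis in (0, 1):
            d = np.diff(pos, axis=axis)  # vectors between neighbours
            length = np.linalg.norm(d, axis=-1, keepdims=True)
            f = k * (length - rest) * d / np.maximum(length, 1e-9)
            # each spring applies equal and opposite forces to its ends
            if axis == 0:
                force[:-1, :] += f
                force[1:, :] -= f
            else:
                force[:, :-1] += f
                force[:, 1:] -= f
        vel = damping * (vel + dt * force)
        vel[pinned] = 0.0
        pos = pos + dt * vel
    return pos

positions = simulate_cloth()
```

Even this simplest model exhibits the sagging and stretching behaviour that makes cloth dynamics hard to control; realistic simulators add shear and bending springs, collision handling and implicit integration, which is where the real-time cost arises.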

Using such models, this research will also investigate cloth manipulation strategies for bi-manual robots. This will include investigation of model-based reinforcement learning and model-predictive control architectures, as well as new integrated perception, planning and control architectures that exploit structural properties of the models to achieve computational efficiency and robustness of behaviour.
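To make the model-predictive control idea concrete, the sketch below implements random-shooting MPC: sample candidate action sequences, roll each through a dynamics model (learned or analytical), and execute the first action of the cheapest sequence. The `dynamics` and `cost` functions and the toy integrator task are assumptions for illustration, not components of the proposed system.

```python
import numpy as np

def mpc_random_shooting(dynamics, cost, state, horizon=10, n_samples=64,
                        action_dim=2, rng=None):
    """Model-predictive control by random shooting: sample action
    sequences uniformly in [-1, 1], evaluate each by rolling it
    through the dynamics model and summing the cost, and return the
    first action of the best sequence (to be re-planned each step)."""
    rng = np.random.default_rng(rng)
    candidates = rng.uniform(-1, 1, size=(n_samples, horizon, action_dim))
    best_cost, best_action = np.inf, None
    for seq in candidates:
        s, total = state, 0.0
        for a in seq:
            s = dynamics(s, a)      # one-step prediction
            total += cost(s, a)     # accumulate trajectory cost
        if total < best_cost:
            best_cost, best_action = total, seq[0]
    return best_action

# Toy example: drive a 2-D point towards the origin under
# simple integrator dynamics (a stand-in for a cloth-state model)
dynamics = lambda s, a: s + 0.1 * a
cost = lambda s, a: float(np.dot(s, s))
action = mpc_random_shooting(dynamics, cost, np.array([1.0, -1.0]), rng=0)
```

In the cloth setting, `dynamics` would be replaced by the learned or calibrated cloth model and `cost` by a task objective such as distance to a folded configuration; more sample-efficient optimisers (e.g. the cross-entropy method) are a common refinement.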

Experiments in this project will be based on domestic and assistive use cases. Example robot behaviours include folding clothes, making a bed, tucking someone into bed, and dressing. One of the challenges in this domain will be to ensure that the robot performs its task safely while compensating for the actions that a human agent may perform in turn. The overall system could therefore be evaluated on metrics including task achievement and the efficiency of human-robot interaction. We envision initial experiments on the Baxter robot and on a custom bi-manual platform using UR10 arms and multi-fingered hands, which facilitates multi-point grasping and dynamic manipulations such as flinging and tugging.


Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
EP/T517884/1                                   01/10/2020  30/09/2025
2670230            Studentship   EP/T517884/1  01/01/2022  30/06/2025  Jack Rome