Robotic picking and packing with physical reasoning

Lead Research Organisation: University of Leeds
Department Name: Sch of Computing


This Fellowship focuses on robotic object manipulation. Object manipulation refers to all the different ways robots can interact with objects in their environments. Consider the example of packing different items into a box in a warehouse to ship the box to a customer. To perform this, a robot would need to pick and insert the items into the box one by one, while nudging, pushing and squeezing objects, to achieve a tight packing. The robot would need to plan and control its actions, as well as use sensors to estimate the positions and deformations of the objects in the box.

The dominant approach to object manipulation in the literature is geometry-based, where the world is represented using shapes and configurations only. While this simplifies planning and control, it also results in robots that are extremely limited in their skills. The central vision of this Fellowship is to go beyond that, by enabling robots to reason about, plan in, and control the full physics of the world. This has the potential to transform robots' object manipulation skills and our lives, because robots will be able to perform a much more diverse variety of object manipulation skills, applicable to manufacturing, assembly, and services.

This Fellowship will create fundamental algorithms --- algorithms that can be applied to different object manipulation problems by other researchers and engineers. However, this Fellowship will also target a particular application area: picking and packing of objects for warehouse automation. With the rapid advance of e-commerce over the past decade, there is a pressing need for efficient warehouse automation systems, in the UK and around the world. However, existing robotic systems do not have physics-based reasoning, which limits their applications drastically. The physics-based picking and packing approach that I propose will enable robots to reach into cluttered bins/shelves/bags, pushing, nudging, and squeezing arbitrary objects to search for and retrieve a particular object or to pack multiple objects tightly for shipping --- skills that do not exist in any current system.

There are significant challenges to using physics-based models during robotic manipulation. An important one is computational expense. We have low-level physics models and physics engines, similar to the ones used in computer games, which can be used by robots. However, computing such models is expensive (i.e. takes too much computer time), and robotic algorithms need to query such models thousands, and sometimes millions, of times before choosing an action, making such a straightforward approach infeasible. Instead, I propose to develop and use hierarchical models of physics. At higher levels in this hierarchy are coarse, approximate physics models, i.e. models that are computationally cheap (i.e. fast to compute) but may be inaccurate. At lower levels in this hierarchy are fine models, i.e. models that are computationally expensive (i.e. slow to compute) but are accurate. I will investigate a variety of methods (including data-driven methods as well as parallel computing methods) to learn and compute such a hierarchy of physics models. I will also develop new planning, control, and state estimation algorithms that can use these new hierarchical physics models.
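To make the hierarchical idea concrete, the following is a minimal, purely illustrative sketch (not the Fellowship's actual algorithms): a coarse, cheap model screens many candidate pushing actions, and only the top-ranked few are re-evaluated with a fine, expensive model. All function names, dynamics, and parameters here are hypothetical placeholders; a real system would use a physics engine at the fine level.

```python
# Illustrative sketch of a two-level physics hierarchy for action selection.
# All dynamics below are toy stand-ins, not real physics models.
import math

def coarse_model(state, action):
    """Cheap, approximate prediction: treat the pushed object as if it
    moves exactly with the push, ignoring friction and rotation."""
    x, y = state
    dx, dy = action
    return (x + dx, y + dy)

def fine_model(state, action, steps=100):
    """Expensive, more accurate prediction: integrate damped motion in
    many small steps (a stand-in for a full physics engine rollout)."""
    x, y = state
    dx, dy = action
    for i in range(steps):
        damping = math.exp(-3.0 * i / steps)  # motion decays with friction
        x += damping * dx / steps
        y += damping * dy / steps
    return (x, y)

def cost(predicted, goal):
    """Distance between the predicted object position and the goal."""
    return math.dist(predicted, goal)

def plan_push(state, goal, candidates, top_k=3):
    """Screen all candidate actions with the coarse model, then re-rank
    only the top_k most promising ones with the fine model."""
    screened = sorted(candidates,
                      key=lambda a: cost(coarse_model(state, a), goal))
    return min(screened[:top_k],
               key=lambda a: cost(fine_model(state, a), goal))

# Usage: choose among candidate push directions to move an object to a goal.
state, goal = (0.0, 0.0), (1.0, 0.5)
candidates = [(1.0, 0.0), (1.0, 0.5), (0.0, 1.0), (2.0, 1.0), (1.5, 0.8)]
action = plan_push(state, goal, candidates)
```

The design choice this sketch illustrates is the trade-off in the paragraph above: the fine model is queried only `top_k` times instead of once per candidate, so the expensive computation is spent only where the cheap model suggests it matters, at the cost of occasionally screening out an action the fine model would have preferred.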

I will also use these new algorithms and systems to accelerate the adoption of this technology in the UK. I will work with the EPSRC UK Robotics & Autonomous Systems Network and industrial stakeholders to develop a roadmap for the integration of autonomous picking and packing robots into the existing industrial workflows. I will also aim to create a new national organisation to focus on this important technology.

To achieve these aims, I will work with many academic and industrial partners, including the Advanced Supply Chain Group, a leading UK-based supply chain and warehouse management company, as well as a key international company in this area, Amazon.