An insect-inspired approach to robotic grasping

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Informatics

Abstract

To be really useful, robots need to interact with objects in the world. The current inability of robots to grasp diverse objects with efficiency and reliability severely limits their range of application. Agriculture, mining and environmental clean-up are just three examples where - unlike a factory - the items to be handled could have a huge variety of shapes and appearances, need to be identified amongst clutter, and need to be grasped firmly for transport while avoiding damage. Secure grasp of unknown objects amongst clutter remains an unsolved problem for robotics, despite improvements in 3D sensing and reconstruction, in manipulator sophistication and the recent use of large-scale machine learning.

This project proposes a new approach inspired by the high competence exhibited by ants when performing the closely equivalent task of collecting and manipulating diverse food items. Ants have relatively simple, robot-like 'grippers' (their mouth-parts, called 'mandibles'), limited sensing (mostly tactile, using their antennae) and tiny brains. Yet they are able to pick up and carry a wide diversity of food items, from seeds to other insect prey, which can vary enormously in shape, size, rigidity and manoeuvrability. They can quickly choose between multiple items and find an effective position to make their grasp, readjusting if necessary. Replicating even part of this competence on robots would be a significant advance. Grasping thus makes an ideal target for applying biorobotic methods that my group has previously used with substantial success to understand and mimic insect navigation behaviours on robots.

How does an ant pick up an object? The first part of this project will be to set up the methods required to observe and analyse in detail the behaviour of ants interacting with objects. At the same time we will start to build both simulated and real robot systems that allow us to imitate the actions of an ant as it positions its body, head and mouth to make a grasp, using an omnidirectional robot base with an arm and gripper. We will also examine and imitate the sensory systems used by the ant to determine the position, shape and size of the object before making a grasp.

What happens in the ant's brain when it picks up an object? The second part will explore what algorithms insect brains need to compute to be able to make efficient and effective grasping decisions. Grasping is a task that contains in miniature many key issues in robot intelligence. It involves tight coupling of physical, perceptual and control systems. It involves a hierarchy of control decisions (whether to grasp, how to position the body and actuators, precise contact, dealing with uncertainty, detecting failure). It requires fusion of sensory information and transformation into the action state space, and involves prediction, planning and adaptation. We aim to understand how insects solve these problems as a route to efficient and effective solutions for robotics.

Can a robot perform as well as an ant? The final part will test the systems we have developed in real world tasks. The first task will be to perform an object clearing task, which will also allow benchmarking of the developed system against existing research. The second task will be based on a pressing problem in environmental clean-up: detection and removal of small plastic items from amongst shoreline rocks and gravel. This novel area of research promises significant pay-off from translating biological understanding into technical advance because it addresses an important unsolved challenge for which the ant is an ideal animal model.

Publications

 
Description Insect-inspired depth perception
Amount £548,115 (GBP)
Funding ID EP/X019705/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 01/2023 
End 01/2027
 
Title Multicamera system to film ant grasping 
Description A five-camera array using macro lenses has been designed and constructed to enable close-up, high-speed filming of ants grasping food items, from which 3D tracking of antennal, head and mandible motion can be extracted. It incorporates a small chamber that can be accessed by ants, and constant lighting. 
Type Of Material Improvements to research infrastructure 
Year Produced 2022 
Provided To Others? No  
Impact None as yet. 
 
Title Model of ant antennal sensing and grasp synthesis 
Description A model has been implemented in Matlab that provides a simplified kinematic representation of the ant head and antennae. This is used to simulate alternative antennal contact patterns and synthesise grasp attempts. 
Type Of Material Computer model/algorithm 
Year Produced 2023 
Provided To Others? No  
Impact None as yet 
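To illustrate the kind of simplified antennal kinematics described in this entry, the following is a minimal sketch (in Python rather than the Matlab used in the project, and with all geometry, segment names and parameter values invented for illustration): a planar two-segment antenna is swept through a grid of joint angles, and configurations whose tip lands on the surface of a circular object are recorded as candidate contacts.

```python
import math

def antenna_tip(base, scape_len, funiculus_len, theta1, theta2):
    """Forward kinematics of a planar two-segment antenna.

    base: (x, y) attachment point on the head
    theta1: angle of the first segment from the body axis (radians)
    theta2: angle of the second segment relative to the first (radians)
    Returns the (x, y) position of the antennal tip.
    """
    jx = base[0] + scape_len * math.cos(theta1)
    jy = base[1] + scape_len * math.sin(theta1)
    tx = jx + funiculus_len * math.cos(theta1 + theta2)
    ty = jy + funiculus_len * math.sin(theta1 + theta2)
    return (tx, ty)

def sweep_contacts(base, scape_len, funiculus_len,
                   obj_centre, obj_radius, n=50, tol=0.02):
    """Sweep both joints over a grid of angles and collect configurations
    whose tip lies within `tol` of the surface of a circular object."""
    contacts = []
    for i in range(n):
        t1 = -math.pi / 3 + (2 * math.pi / 3) * i / (n - 1)
        for j in range(n):
            t2 = -math.pi / 4 + (math.pi / 2) * j / (n - 1)
            tip = antenna_tip(base, scape_len, funiculus_len, t1, t2)
            if abs(math.dist(tip, obj_centre) - obj_radius) < tol:
                contacts.append((t1, t2, tip))
    return contacts

# Example: antenna rooted at the origin, object of radius 0.5 ahead of it.
contacts = sweep_contacts((0.0, 0.0), 1.0, 1.0, (1.5, 0.0), 0.5)
print(f"{len(contacts)} candidate contact configurations found")
```

In the actual model, the set of contact configurations would feed a grasp-synthesis step; here the sketch stops at enumerating where the antennal tip can touch the object.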
 
Description Ocado 
Organisation Ocado Technology
Country United Kingdom 
Sector Private 
PI Contribution Research on the grasping behaviour of ants which could be applicable to robotic tasks.
Collaborator Contribution Consultancy on the problems for automating grocery delivery, specifically picking tasks with high variation. Going forward, placements and internships for staff on the project at Ocado will be supported.
Impact No outcomes as yet.
Start Year 2021
 
Description Sheffield 
Organisation University of Sheffield
Country United Kingdom 
Sector Academic/University 
PI Contribution We have established a joint project and will be modelling the retinal and receptor movements of insect eyes in software and hardware to determine the potential for extracting depth information.
Collaborator Contribution My partner, Mikko Juusola in the School of Biosciences will be carrying out behavioural and neurophysiological investigations of the fly visual system, and complementary approaches to modelling.
Impact We have submitted an EPSRC research proposal, which has been funded (commenced February 2023). This includes two industrial project partners, Opteran and Festo. The collaboration is multidisciplinary, combining biology and computer vision.
Start Year 2021