An insect-inspired approach to robotic grasping
Lead Research Organisation:
University of Edinburgh
Department Name: Sch of Informatics
Abstract
To be really useful, robots need to interact with objects in the world. The current inability of robots to grasp diverse objects efficiently and reliably severely limits their range of application. Agriculture, mining and environmental clean-up are just three examples where - unlike a factory - the items to be handled could have a huge variety of shapes and appearances, need to be identified amongst clutter, and need to be grasped firmly for transport while avoiding damage. Secure grasp of unknown objects amongst clutter remains an unsolved problem for robotics, despite improvements in 3D sensing and reconstruction, in manipulator sophistication, and the recent use of large-scale machine learning.
This project proposes a new approach inspired by the high competence exhibited by ants when performing the closely equivalent task of collecting and manipulating diverse food items. Ants have relatively simple, robot-like 'grippers' (their mouth-parts, called 'mandibles'), limited sensing (mostly tactile, using their antennae) and tiny brains. Yet they are able to pick up and carry a wide diversity of food items, from seeds to other insect prey, which can vary enormously in shape, size, rigidity and manoeuvrability. They can quickly choose between multiple items and find an effective position to make their grasp, readjusting if necessary. Replicating even part of this competence on robots would be a significant advance. Grasping is thus an ideal target for the biorobotic methods that my group has previously used with substantial success to understand and mimic insect navigation behaviours on robots.
How does an ant pick up an object? The first part of this project will set up the methods required to observe and analyse in detail the behaviour of ants interacting with objects. At the same time we will start to build both simulated and real robot systems that allow us to imitate the actions of an ant as it positions its body, head and mouth to make a grasp, using an omnidirectional robot base with an arm and gripper. We will also examine and imitate the sensory systems used by the ant to determine the position, shape and size of the object before making a grasp.
What happens in the ant's brain when it picks up an object? The second part will explore what algorithms insect brains need to compute to make efficient and effective grasping decisions. Grasping is a task that contains in miniature many key issues in robot intelligence. It involves tight coupling of physical, perceptual and control systems. It involves a hierarchy of control decisions (whether to grasp, how to position the body and actuators, precise contact, dealing with uncertainty, detecting failure). It requires fusion of sensory information and transformation into the action state space, and involves prediction, planning and adaptation. We aim to understand how insects solve these problems as a route to efficient and effective solutions for robotics.
Can a robot perform as well as an ant? The final part will test the systems we have developed in real-world tasks. The first task will be an object-clearing task, which will also allow benchmarking of the developed system against existing research. The second task will be based on a pressing problem in environmental clean-up: detection and removal of small plastic items from amongst shoreline rocks and gravel. This novel area of research promises significant pay-off from translating biological understanding into technical advance, because it addresses an important unsolved challenge for which the ant is an ideal animal model.
People
| Barbara Webb (Principal Investigator / Fellow) |
| Description | We have established that a simple parallel plate gripper can have greatly enhanced stability with the addition of hairs that mimic those observed on the ant's mandibles. We have observed that ants make significant use of their antennae and front legs to position an object before grasping with their mandibles, and plan to build similar augmentation for a robot gripper. We have also developed new software interfaces that greatly ease the problem of merging information obtained when filming insects simultaneously from multiple points of view, ultimately allowing 3D reconstruction of the motion of their body, limbs and antennae. |
| Exploitation Route | The outcomes should improve robot grasping of unknown objects, which has multiple applications in the agriculture, commerce, construction and energy sectors. |
| Sectors | Agriculture, Food and Drink; Construction; Energy; Environment; Manufacturing, including Industrial Biotechnology; Retail |
| Description | Our ant-inspired gripper design with hairs has been considered for use by Ocado. |
| First Year Of Impact | 2024 |
| Sector | Retail |
| Impact Types | Economic |
| Description | Insect-inspired depth perception |
| Amount | £548,115 (GBP) |
| Funding ID | EP/X019705/1 |
| Organisation | Engineering and Physical Sciences Research Council (EPSRC) |
| Sector | Public |
| Country | United Kingdom |
| Start | 02/2023 |
| End | 01/2027 |
| Title | Multicamera system to film ant grasping |
| Description | A five-camera array using macro lenses has been designed and constructed to enable close-up, high-speed filming of ants grasping food items, from which 3D tracking of antennal, head and mandible motion can be extracted. It incorporates a small chamber that ants can access, and constant lighting. |
| Type Of Material | Improvements to research infrastructure |
| Year Produced | 2022 |
| Provided To Others? | No |
| Impact | None as yet. |
| Title | Ant grasp video dataset |
| Description | This database is a set of videos of untethered ants grasping objects, filmed at 100 fps from 5 views, with cameras synchronised and calibrated (intrinsics and extrinsics). The database includes written descriptions of the behaviour captured in each video and is labelled according to object interaction category. Additional data will include 2D pose tracking of the videos using SLEAP, and 3D pose reconstruction using a method developed in the GRASP project. |
| Type Of Material | Database/Collection of data |
| Year Produced | 2025 |
| Provided To Others? | No |
| Impact | We plan to make this video collection available to the public before the end of the project, and it should be of interest both to behavioural biologists and to robotics researchers interested in bio-inspired object identification and grasping. |
| URL | https://insectrobotics.github.io/ant-grasp-dataset/ |
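The 3D pose reconstruction mentioned above depends on triangulating each tracked keypoint across the calibrated views. The project's own method is not detailed in this record; the sketch below shows the standard direct linear transform (DLT) approach under the assumption that each camera's published intrinsics and extrinsics are combined into a 3x4 projection matrix:

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Triangulate one 3D point from N calibrated views via the
    direct linear transform (DLT).
    proj_mats: list of 3x4 projection matrices (intrinsics @ extrinsics).
    points_2d: corresponding (u, v) detections, e.g. SLEAP keypoints."""
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on X = (x, y, z, 1)
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # Least-squares solution: right singular vector of smallest singular value
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Applied per keypoint per frame, and simply dropping views in which a keypoint is occluded, a routine of this kind turns 2D detections into the 3D antennal, head and mandible trajectories the dataset describes.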
| Title | MiniMarket77 |
| Description | The MiniMarket77 dataset comprises 1200 partial-view samples of 77 widely available market objects. Each view sample contains 2048 coloured points and is saved in HDF5 format. Where a partial-view RGB-D camera output has fewer than 2048 points, the sample is padded with zeros. The samples were obtained using 8 different RGB-D cameras, namely the popular RealSense D435 and D415 models. |
| Type Of Material | Database/Collection of data |
| Year Produced | 2025 |
| Provided To Others? | Yes |
| Impact | There is a lack of real-world object point cloud datasets, and most of the few available were captured using very expensive hardware. Here we use the widely popular RealSense cameras to capture the point clouds, so that models trained on this dataset should be usable in relatively cheap set-ups, improving usability in real object segmentation/grasping applications. |
| URL | https://www.kaggle.com/datasets/b50381fad51ff7691c5cc3fd7a510f6a7a1bb74156bc1a3267d674f5e8cb03ea |
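As a minimal sketch of consuming such a sample, the snippet below loads one point cloud and masks out the zero padding. The dataset key names ("points", "colors") and array layout are hypothetical assumptions and should be checked against the actual HDF5 files on Kaggle:

```python
import h5py
import numpy as np

def load_sample(path, index):
    """Load one MiniMarket77-style partial view and strip zero padding.
    Field names 'points' and 'colors' are illustrative, not confirmed."""
    with h5py.File(path, "r") as f:
        pts = f["points"][index]   # assumed (2048, 3) xyz, zero-padded
        rgb = f["colors"][index]   # assumed (2048, 3) per-point colour
    # Padding rows are all-zero points; mask them out before use
    valid = np.any(pts != 0.0, axis=1)
    return pts[valid], rgb[valid]
```

Masking the padding before training or evaluation matters, since the all-zero rows are an artefact of the fixed 2048-point sample size rather than real surface points.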
| Title | Model of ant antennal sensing and grasp synthesis |
| Description | A model has been implemented in Matlab that provides a simplified kinematic representation of the ant head and antennae. This is used to simulate alternative antennal contact patterns and synthesise grasp attempts. |
| Type Of Material | Computer model/algorithm |
| Year Produced | 2023 |
| Provided To Others? | No |
| Impact | None as yet |
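The Matlab implementation is not reproduced in this record. The Python sketch below only illustrates the kind of simplified kinematics described: one antenna treated as a planar two-link chain anchored to the head, swept across base angles to find configurations that contact an object. The segment lengths and the circular test object are illustrative assumptions, not the project's actual parameters:

```python
import numpy as np

def antenna_tip(head_pose, base_angle, joint_angle,
                scape=1.0, funiculus=2.0):
    """Forward kinematics for one antenna as a planar two-link chain.
    head_pose: (x, y, heading) of the head; angles in radians relative
    to the heading. Segment lengths are arbitrary illustrative units."""
    x, y, heading = head_pose
    a1 = heading + base_angle      # scape orientation
    a2 = a1 + joint_angle          # funiculus orientation
    jx = x + scape * np.cos(a1)    # scape/funiculus joint position
    jy = y + scape * np.sin(a1)
    return jx + funiculus * np.cos(a2), jy + funiculus * np.sin(a2)

# Sweep the antenna to find base angles at which the tip contacts a
# circular object -- a toy stand-in for simulating contact patterns
# from which grasp attempts could then be synthesised.
obj_c, obj_r = np.array([2.5, 0.5]), 0.6
for base in np.linspace(-np.pi / 3, np.pi / 3, 61):
    tip = np.array(antenna_tip((0.0, 0.0, 0.0), base, 0.2))
    if np.linalg.norm(tip - obj_c) <= obj_r:
        print(f"contact at base angle {base:.2f} rad, tip {tip.round(2)}")
```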
| Description | Amazon |
| Organisation | Amazon.com |
| Department | Amazon Science |
| Country | Germany |
| Sector | Private |
| PI Contribution | We are investigating the grasping behaviour of ants and believe this could provide novel methods for improving robot grasping, relevant to Amazon's operations. |
| Collaborator Contribution | Amazon have presented to us in detail one of their core tasks and the current capabilities and problems of the existing solutions. This is very helpful in directing our attention towards aspects of ant grasping that might be most useful or straightforward to deploy to robotics. |
| Impact | No outputs as yet. |
| Start Year | 2021 |
| Description | Ocado |
| Organisation | Ocado Technology |
| Country | United Kingdom |
| Sector | Private |
| PI Contribution | Research on the grasping behaviour of ants which could be applicable to robotic tasks. |
| Collaborator Contribution | Consultancy on the problems of automating grocery delivery, specifically picking tasks with high variation. Going forward, Ocado will support placements and internships for staff on the project. |
| Impact | No outcomes as yet. |
| Start Year | 2021 |
| Description | Sheffield |
| Organisation | University of Sheffield |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | We have established a joint project and will be modelling the retinal and receptor movements of insect eyes in software and hardware to determine the potential for extracting depth information. |
| Collaborator Contribution | Our partner, Mikko Juusola in the School of Biosciences, will be carrying out behavioural and neurophysiological investigations of the fly visual system, and complementary approaches to modelling. |
| Impact | We have submitted an EPSRC research proposal, which has been funded (commenced February 2023). This includes two industrial project partners, Opteran and Festo. The collaboration is multidisciplinary, combining biology and computer vision. |
| Start Year | 2021 |
| Title | Antgrip |
| Description | We mimic the internal hairs found on ant mandibles to improve the performance of a two-finger parallel plate robot gripper. With bin-picking applications in mind, the gripper fingers are long and slim, with interchangeable soft gripping pads augmented by 20 mm hairs, made of thermoplastic polyurethane (TPU), which can be arranged in various patterns and angles. In our experiments, adding hairs increased the grasp success rate by at least 29%, and more than doubled the number of objects that remained securely gripped during manipulation. |
| Type Of Technology | New/Improved Technique/Technology |
| Year Produced | 2024 |
| Impact | The gripper has drawn interest from our project partners, Ocado and Amazon, and further potential for its application in industry is being explored through the National Robotarium. |
| URL | https://arxiv.org/abs/2312.05364 |
| Title | Mokap: An easy to use but powerful multi-camera acquisition software |
| Description | Mokap is an easy-to-use multi-camera acquisition software package developed for animal behaviour recording using hardware-triggered (synchronised) machine vision cameras. It features: cross-platform compatibility (Linux, Windows, macOS); support for synchronised cameras (currently triggered via a Raspberry Pi, with other modes to come); encoding to individual frames or straight to video (with or without GPU encoding); live camera calibration for 3D triangulation; and agnosticism to the choice of feature-tracking method. |
| Type Of Technology | New/Improved Technique/Technology |
| Year Produced | 2025 |
| Open Source License? | Yes |
| Impact | Many research groups are interested in 3D tracking of small animals to analyse behaviour, but despite significant advances in tracking software, the practical requirements for recording from multiple synchronised cameras with appropriate calibration are usually dealt with ad hoc and in-house. We have already had multiple enquiries from other research groups interested in using the solution we have developed in their systems. |
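Mokap's own calibration code is not shown in this record. The "live camera calibration for 3D triangulation" feature corresponds to a standard step that can be sketched with OpenCV: estimating each camera's intrinsics from checkerboard detections. The board geometry and file names below are illustrative assumptions:

```python
import cv2
import numpy as np

# Per-camera intrinsic calibration from checkerboard images.
# Board geometry (7x10 inner corners, 5 mm squares) is illustrative.
corners_wh, square = (7, 10), 5.0
objp = np.zeros((corners_wh[0] * corners_wh[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:corners_wh[0],
                       0:corners_wh[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, img_size = [], [], None
for fname in ["cam0_000.png", "cam0_001.png"]:  # hypothetical frames
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, corners_wh)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsic matrix K and distortion coefficients for this camera;
# extrinsics between cameras would then be estimated before triangulation.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, img_size, None, None)
print("intrinsic matrix:\n", K)
```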
| Description | Embodied AI podcast |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | Barbara Webb was interviewed for The Embodied AI Podcast, promoted by the British Neuroscience Association. |
| Year(s) Of Engagement Activity | 2022 |
| URL | https://anchor.fm/the-embodied-ai-podcast |
| Description | Insect Spatial Awareness and Environmental Manipulation Workshop |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Postgraduate students |
| Results and Impact | This was a two-day workshop (2-3 May 2024) in Edinburgh, organised through the GRASP project, gathering together researchers who study how insects perceive, encode and interact with their immediate surroundings; it was attended by around 30 researchers from the UK and Europe. Research topics included active sensing, tool use, nest building, object transport and prey catching. The format strongly encouraged postgraduate (rather than PI) presentation of results and fostered discussion on how this under-explored aspect of insect behaviour could be advanced in future. |
| Year(s) Of Engagement Activity | 2024 |