ViTac: Visual-Tactile Synergy for Handling Flexible Materials

Lead Research Organisation: University of Liverpool
Department Name: Computer Science

Abstract

Handling flexible materials is common in industrial, domestic and retail applications, e.g., evaluating new fabric products in the fashion industry, sorting clothes at home and replenishing shelves in a clothing store. Such tasks have been highly dependent on human labour and remain challenging for autonomous systems due to the complex dynamics of flexible materials. This proposal aims to develop a new visuo-tactile integration mechanism for estimating the dynamic states and properties of flexible materials while they are being manipulated by robot hands. This technique has the potential to revolutionise autonomous systems for handling flexible materials, allowing their automated handling to be included in larger automated production processes and process management systems. While the initial system developed in this work targets textiles, the same technology could be applied to handling other flexible materials, including fragile products in the food industry, flexible objects in manufacturing and hazardous materials in healthcare.

Planned Impact

"AI and the data-driven economy" has been set out as the first of the four Grand Challenges in the UK government's Industrial Strategy. Robotics and Autonomous Systems have been identified as key priority areas for tackling this Grand Challenge. Considerable resources are spent in industry on manipulating and evaluating flexible products, as emphasised by Unilever, and handling such objects is becoming a bottleneck for including their automated handling in larger automated processes. The goal of this project is to provide transformative solutions that help robots perform automated handling of flexible objects in the context of the manufacturing industry, which aligns well with the EPSRC "Productive nation" priority and the "Manufacturing the future" theme.

The project contributes to the EPSRC research areas of Robotics and AI (RAI) technologies and follows its strategic focus on the accelerated deployment of RAI technologies in next-generation manufacturing: the developed handling system will be implemented in autonomous material assessment and handling, which will be used by Unilever and potentially other companies. It will also address the "Challenge: Deliver safer, smarter engineering" of the Alan Turing Institute.

This project complements the existing EPSRC projects "MAN^3", "RAIN", "NCNR", "FAIR-SPACE" and "ORCA" on robotic handling of materials, by proposing visuo-tactile coordination for handling flexible materials; the EPSRC projects "Tactile superresolution sensing" and "NeuPRINTSKIN", by extending tactile sensing technologies to robotic handling applications; and the EU projects "CloPeMa", "I-DRESS" and "CLOTHILDE", by combining vision and touch for clothes perception and manipulation.
 
Description In this award, we have developed a set of sensor prototypes as well as algorithms to enable robots to handle flexible materials, a common task in industrial, domestic and retail applications.

First of all, we have developed several vision-based tactile sensor prototypes named GelTip sensors. The GelTip uses a camera to capture the deformation of the soft membrane covering it. Through this design, the GelTip can capture detailed information about the flexible materials in contact with the sensor, such as texture, force distribution and geometry. We have investigated different form factors so that the sensor is suitable for handling flexible materials. The sensor has also been integrated with the robot's vision system, giving the robot a visuo-tactile sensing system to better perceive the state of the flexible materials.
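At its core, a vision-based tactile sensor of this kind infers contact from how the camera image of the membrane changes. The following is a purely illustrative sketch, not the GelTip implementation: it compares a no-contact reference image against the current frame to estimate where, and how strongly, the membrane is deformed. The function name and threshold value are our own assumptions.

```python
import numpy as np

def contact_map(reference: np.ndarray, frame: np.ndarray, threshold: float = 0.1):
    """Estimate membrane deformation by differencing the current frame
    against a no-contact reference (both HxW grayscale arrays in [0, 1])."""
    diff = np.abs(frame.astype(float) - reference.astype(float))
    mask = diff > threshold  # pixels where the gel has visibly deformed
    return diff, mask

# Toy example: flat reference vs. a frame with a small indentation patch.
ref = np.zeros((8, 8))
frm = ref.copy()
frm[2:5, 2:5] = 0.5  # simulated 3x3 contact region
diff, mask = contact_map(ref, frm)
print(int(mask.sum()))  # prints 9: the size of the contact region
```

Real sensors typically track printed markers or reconstruct depth from shading rather than using raw image differencing, but the principle of comparing against an undeformed reference is the same.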

Furthermore, we have developed a simulation model of the GelTip sensors so that robot agents can be trained in simulation before being deployed in real experiments. This accelerates the training of robot agents and reduces the risk of damaging the tactile sensors through excessive use in real experiments.

In addition, we have developed a set of algorithms for robot perception of flexible materials and for better planning of manipulation actions when handling them. We have created a benchmark to compare the roles of vision, proprioceptive sensing and tactile sensing in these tasks, and designed Reinforcement Learning agents that learn from these data to manipulate flexible objects. Interestingly, we found that vision plays a significant role in perceiving the state of a flexible object, while tactile sensing at the robot gripper provides better cues for fine manipulation of these objects.
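The idea of comparing sensing modalities can be illustrated with a deliberately simplified toy, which is our own sketch and not the project's benchmark: each simulated modality observes a hidden one-dimensional object state with a different noise level (the noise magnitudes are assumptions), and we compare the resulting estimation errors.

```python
import numpy as np

rng = np.random.default_rng(0)

def observation_error(modalities, n_trials=2000):
    """Mean absolute error in estimating a hidden 1-D object state,
    given noisy observations from the chosen sensing modalities."""
    errors = []
    for _ in range(n_trials):
        state = rng.uniform(-1.0, 1.0)
        estimates = []
        if "vision" in modalities:
            estimates.append(state + rng.normal(0, 0.05))  # coarse, global view
        if "tactile" in modalities:
            estimates.append(state + rng.normal(0, 0.01))  # fine, local contact
        errors.append(abs(np.mean(estimates) - state))
    return float(np.mean(errors))

vision_err = observation_error(("vision",))
tactile_err = observation_error(("tactile",))
fused_err = observation_error(("vision", "tactile"))
print(vision_err, tactile_err, fused_err)
```

In this toy, the low-noise "tactile" channel yields smaller errors than "vision" alone, and even naive averaging of the two beats vision alone; a real system would weight each modality by its reliability for the task at hand, which is what the benchmark above examines empirically.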
Exploitation Route The outcomes of this funding will be of interest to researchers and practitioners who work with flexible materials, sensor development, autonomous systems and artificial intelligence algorithms. The sensor prototypes and algorithms developed in this award can be used in various applications, e.g., using robots to evaluate new fabric products in the fashion industry, sort clothes at home and replenish shelves in a clothing store.
Sectors Agriculture, Food and Drink; Construction; Digital/Communication/Information Technologies (including Software); Electronics; Healthcare; Manufacturing, including Industrial Biotechnology; Pharmaceuticals and Medical Biotechnology; Retail

 
Description Sensory evaluation involves a group of human testers evaluating new products via senses such as vision and touch. It is a necessary part of both product development and quality control for many companies. However, sensory evaluation may suffer from biases due to factors such as tester fatigue, and intensive training is required for testers to develop the requisite expertise for the job. Through this award, we have developed an autonomous handling system that estimates the properties of flexible materials in order to evaluate new products. This project has helped companies mitigate biases in the sensory evaluation of products, hold their positions in intensified market competition and contribute to their growth.
First Year Of Impact 2022
Sector Manufacturing, including Industrial Biotechnology
Impact Types Economic

 
Description Collaboration with Southeast University, China 
Organisation Southeast University China
Country China 
Sector Academic/University 
PI Contribution Tubular objects such as test tubes are common in chemistry and life-sciences research laboratories, and robots that can handle them have the potential to accelerate experiments. Ideally, a robot would be trained to manipulate tubular objects in a simulator and then deployed in a real-world environment. However, it is still challenging for a robot to learn to handle tubular objects from a single sensing modality and to bridge the gap between simulation and reality. In this collaboration, together with our partners, we proposed a novel tactile-motor policy learning method to generalise tubular object manipulation skills from simulation to reality. In particular, we proposed a Sim-to-Real transferable in-hand pose estimation network that generalises to unseen tubular objects. The network uses a novel adversarial domain adaptation network to narrow the pixel-level domain gap for tactile tasks by introducing an attention mechanism and a task-related constraint.
Collaborator Contribution The in-hand pose estimation network was further integrated by our partners into a Reinforcement Learning-based policy learning framework for robotic insert-and-pull-out manipulation tasks. The proposed method was applied to a human-robot collaborative tube-placing scenario and a robotic pipetting scenario. The experimental results demonstrate the generalisation capability of the learned tactile-motor policy for tubular object manipulation in research laboratories.
Impact Yongqiang Zhao, Xingshuo Jing, Kun Qian, Daniel Fernandes Gomes, Shan Luo, Skill generalization of tubular object manipulation with tactile sensing and Sim2Real learning, Robotics and Autonomous Systems, Volume 160, 2023, 104321, ISSN 0921-8890, https://doi.org/10.1016/j.robot.2022.104321.
Start Year 2021
 
Description Collaboration with Unilever R&D Port Sunlight 
Organisation Unilever
Department Unilever UK R&D Centre Port Sunlight
Country United Kingdom 
Sector Private 
PI Contribution Our team has worked with researchers at Unilever Port Sunlight to apply our expertise in robot perception to assessing fabric and hair care products with an artificial sense of touch. In this collaboration, we developed a bespoke vision-based tactile sensor that can slide over hair and fabric samples, much as a human finger does when assessing how these objects feel, and collect data to analyse their mechanical properties and align them with human touch-feel attributes.
Collaborator Contribution In this collaboration, our partners at Unilever provided their expertise and experience in assessing fabric and hair samples. When assessing new formulas of their fabric and hair care products, Unilever recruits a Human Sensory Panel to evaluate the properties of fabric or hair samples after use of the new formulas and score their touch feel. Our partners provided their data and experimental protocols for assessing the fabric and hair samples, as well as their expertise in analysing the data for assessing these samples.
Impact
- Sensor prototypes used for the assessment of fabric and hair samples at Unilever Port Sunlight
- Further funding from Unilever
- Datasets collected from the experiments in assessing fabric and hair samples
Start Year 2021
 
Description 2021 UK-China Symposium on Advanced Manufacturing 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Around 100 researchers attended the UK-China Symposium on Advanced Manufacturing online. The symposium featured talks from different fields of robotics and manufacturing, and I gave the keynote talk at the event.
Year(s) Of Engagement Activity 2021
 
Description Invited keynote talk at the French research group in Robotics (GdR) open days 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact I was invited to give a keynote talk at the French research group in Robotics (GdR) open days. Over 100 researchers attended the open days; the talk sparked questions and discussions afterwards, and attendees reported gaining fresh perspectives on the research topics covered in the event.
Year(s) Of Engagement Activity 2022
URL https://www.gdr-robotique.org/
 
Description Invited talk at the Centre for Vision, Speech and Signal Processing (CVSSP), Surrey University 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Professional Practitioners
Results and Impact I gave an invited talk at the Centre for Vision, Speech and Signal Processing (CVSSP), Surrey University.
Year(s) Of Engagement Activity 2021
 
Description Invited talk at the Sino-EU Conference on Intelligent Robots and Automation, 2021 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact I gave an invited talk at the Sino-EU Conference on Intelligent Robots and Automation, 2021.
Year(s) Of Engagement Activity 2021
 
Description Invited talk at the University of Birmingham 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Professional Practitioners
Results and Impact I gave an invited talk at the Intelligent Robotics lab at the University of Birmingham.
Year(s) Of Engagement Activity 2021
 
Description Participation in an activity, workshop or similar - EPSRC Circular economy and ICT engagement and networking workshop 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact This event brought together the information and communication technologies (ICT) and circular economy research communities. It fostered relationships to encourage greater consideration of circularity and resource efficiency in ICT research and research outcomes; increased awareness and developed understanding of the role of ICT research in achieving a circular economy; identified community highlights and future priorities to feed into related EPSRC and UK Research and Innovation (UKRI) strategy development; and explored the barriers to delivering high-impact, interdisciplinary research in this area.
Year(s) Of Engagement Activity 2023
URL https://www.ukri.org/events/circular-economy-and-ict-engagement-and-networking-workshop/
 
Description Participation in an activity, workshop or similar - Invited talk at the 4th UK Manipulation Workshop, 2023 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact I gave a talk on our research in visuo-tactile perception under this ViTac award at the 4th UK Manipulation Workshop.
Year(s) Of Engagement Activity 2023
URL https://www.robot-manipulation.uk/
 
Description Participation in an activity, workshop or similar - UK-RAS meeting 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact I participated in the UK-RAS meeting in Leeds in 2022.
Year(s) Of Engagement Activity 2022
 
Description Participation in an activity, workshop or similar - ViTac 2023: Blending Virtual and Real Visuo-Tactile Perception 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I organised the "ViTac 2023: Blending Virtual and Real Visuo-Tactile Perception" workshop at the International Conference on Robotics and Automation, London, 2023. I also gave a talk at the workshop on our research in the ViTac award.
Year(s) Of Engagement Activity 2023
URL https://shanluo.github.io/ViTacWorkshops/
 
Description ViTac 2021: Trends and Challenges in Visuo-Tactile Perception 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact In this one-day workshop, we brought together experts from a diverse range of disciplines, encompassing engineers, computer scientists, cognitive scientists and sensor developers, to discuss topics relating to the fusion of vision and touch sensing, which is highly relevant to the ViTac project. I was the lead organiser of the workshop.
Year(s) Of Engagement Activity 2021
URL http://wordpress.csc.liv.ac.uk/smartlab/icra-2021-vitac-workshop/