MAN^3: huMAN-inspired robotic MANipulation for advanced MANufacturing

Lead Research Organisation: Queen Mary University of London
Department Name: Sch of Electronic Eng & Computer Science

Abstract

Over the past 50 years, the use of robots in industry has increased steadily, and it has boomed in the last 10 years. In 2016, the average robot density (i.e. number of robot units per 10,000 employees) in the manufacturing industries worldwide was 74; by region, this was 99 units in Europe, 84 in the Americas and 63 in Asia, with an average annual growth rate (between 2010 and 2016) of 9% in Asia, 7% in the Americas and 5% in Europe. From 2018 to 2020, global robot installations are estimated to increase by at least 15% on average per year.
The main market so far has been the automotive industry (an example of heavy manufacturing), where simple and repetitive robotic manipulation tasks are performed in very controlled settings by large and expensive robots, in dedicated areas of the factories that human workers are not allowed to enter for safety reasons. New growing markets for robots are consumer electronics and food/beverages (examples of light manufacturing), as well as other small and medium sized enterprises (SMEs): in particular, the food and beverage industry increased robot orders by 12% each year between 2011 and 2015, and by 20% in 2016. However, in many cases the production processes of these industries require delicate handling and fine manipulation of several different items, posing serious challenges to the current capabilities of commercial robotic systems.
With 71 robot units per 10,000 employees (in 2016), the UK is the only G7 country with a robot density below the world average of 74, ranking 22nd. The industrial and SME sector is in urgent need of modernisation that would increase productivity and improve the working conditions (e.g. safety, engagement) of human workers: this requires the development and deployment of novel robotic technologies that can meet the needs of those businesses in which current robots are not yet effective.

One of the main reasons why robots are not effective in those applications is the lack of robot intelligence: the ability to learn and adapt that is typical of humans. Indeed, robotic manipulation can be enhanced by relying on humans, both through interaction (i.e. humans as direct teachers) and through inspiration (i.e. humans as models).

Therefore, the aim of this project is to develop a system for natural human demonstration of robotic manipulation tasks, combining immersive Virtual Reality technologies and smart wearable devices (to interface the human with the robot) with robot sensorimotor learning techniques and multimodal artificial perception (inspired by the human sensorimotor system). The robotic system will include a set of sensors that allow it to reconstruct the real world, in particular by integrating 3D vision with tactile information about contacts; the human user will access this artificial reconstruction through an immersive Virtual Reality that combines both visual and haptic feedback. In other words, the user will see through the eyes of the robot and feel through the hands of the robot. Also, users will be able to move the robot simply by moving their own limbs. This will allow human users to easily teach complex manipulation tasks to robots, and robots to learn efficient control strategies from the human demonstrations, so that they can then repeat the tasks autonomously in the future.

Human demonstration of simple robotic tasks has already found its way into industry (e.g. robotic painting, simple pick and place of rigid objects), but it cannot yet be applied to the dexterous handling of generic objects (e.g. soft and delicate objects), which would result in much wider applicability (e.g. food handling). Therefore, the expected results of this project will boost productivity in a large number of industrial processes (economic impact) and improve the working conditions and quality of life of human workers in terms of safety and engagement (social impact).

Planned Impact

The project will impact different parts of our society: 1) academia and research; 2) economic sectors; 3) general public.

1) The research outcomes of the project will target different communities, namely: robotics, computer vision, tactile/force sensing, machine learning, automatic control, Virtual/Augmented Reality, and Human-Computer Interaction. Results will go beyond the state of the art in those subjects and will be disseminated through: publications in top robotics (e.g. IJRR, T-RO, RAS, AURO), sensors (e.g. IEEE Sens. J., Sens. Actuators, Sensors), and machine learning and computer vision (e.g. JMLR, Neural Networks, PAMI) journals; presentations at both national (e.g. TAROS, BMVC, IEEE UKRI RAS Chapter Conference) and international (e.g. ICRA, IROS, RSS, NIPS, ICML, IEEE Sensors, HRI) conferences; organisation of workshops at international robotics conferences (e.g. ICRA, IROS); and organisation of national workshops (e.g. BMVA, the UK Manipulation Workshop, EPSRC UK-RAS network events).

2) The project will impact at least two categories of stakeholders in the secondary and tertiary economic sectors, i.e.: advanced manufacturers (industry and SMEs) and providers of robotics and AI solutions. These are the "two sides" looking into intelligent robots as an integral part of their future businesses.
Many SMEs and industries in the UK are willing to introduce robots on their production lines to improve their production (e.g. to increase throughput and flexibility and to reduce costs): they target collaborative robots, which can share the workspace with human workers and do not require the whole production line to be modified. While low-cost collaborative robot arms are now available on the market, what is still missing are (i) intelligent manipulation capabilities in the end-effectors and (ii) easy and intuitive ways to program the robots. These aspects still account for as much as 80 percent or more of the total deployment costs, hampering the successful introduction of collaborative robots in industry and SMEs. Our research aims precisely to fill this gap by advancing the state of the art of human-demonstrated intelligent robotic manipulation.
Moreover, although we target manufacturing, the solutions that we will investigate in the project are beneficial for a wide range of additional applications, including manipulation in hazardous environments (e.g. chemical, radioactive), space and deep sea exploration, agriculture (e.g. precision weeding and picking), and service and assistive robots at home and in public spaces.

3) Through public engagement activities, the project will raise awareness of how the proposed technologies will change the landscape of industrial production towards: a safer and more rewarding working experience for human workers (released from the most dangerous, tiresome and repetitive aspects of their jobs) and an increased throughput and flexibility of production (which generates higher income for the companies and therefore the possibility to employ more human workers in engaging and well-paid jobs, focused on creativity and decision-making). Indeed, through public engagement activities we aim to mitigate the growing fear and discomfort about the mass introduction of robots in industry (i.e. one of the key pillars of Industry 4.0) in a part of the UK population, who are afraid of losing their jobs and economic power, with a negative impact on their quality of life; this fear is caused by a misguided conception of the real objective of robotics research in this area, which is not to replace humans with robots, but to improve the quality of human work by using robots, both in terms of the personal satisfaction of the worker and of the overall quality of the production. In addition, by generating further understanding of and interest in STEM subjects, we also aim to encourage young people into STEM studies and careers, regardless of their gender, nationality or social/ethnic background.

Publications


Denoun B (2021) Grasping Robot Integration and Prototyping: The GRIP Software Framework in IEEE Robotics & Automation Magazine

Frazier PA (2020) Plant Bioinspired Ecological Robotics. in Frontiers in robotics and AI

Helfenstein J (2022) An approach for comparing agricultural development to societal visions. in Agronomy for sustainable development

 
Description A) We have developed a novel software framework that facilitates the programming and deployment of robotic manipulation tasks [1].

B) We have used different kinds of tactile and force sensors to improve the quality of autonomous robotic manipulation and to collect information about the manipulated objects: recognizing grasped objects from tactile information alone [2]; detecting whether a grasped object is slipping from the gripper [3]; predicting, before lifting, whether an object is likely to slip, by haptic exploration (i.e. touching the object multiple times), and optimising the position of the robot fingertips on the object so as to minimise the probability of the object falling after it has been grasped and lifted [4]; estimating the position and size of small cracks/apertures in a metallic sheet [6] and classifying apples and strawberries as either ripe or rotten [5] by tactile scanning, i.e. swiping a soft force sensor over the surface and analysing the contact data; and reconstructing the 3D shape of an unknown object or surface [13,14].
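
To make the slip-detection idea concrete, the sketch below (in Python) illustrates the classic friction-cone heuristic that this line of work builds on; it is not the learned detector of [3], and the taxel layout, friction coefficient and function name are illustrative assumptions only.

import numpy as np

# Illustrative sketch only: flag likely slip when any taxel's shear-to-normal
# force ratio leaves an assumed friction cone. Not the learned detector of [3].
def slip_risk(normal_forces, shear_x, shear_y, mu=0.5):
    # mu is an assumed static friction coefficient for the fingertip/object pair
    normal = np.clip(np.asarray(normal_forces, dtype=float), 1e-6, None)  # avoid division by zero
    shear = np.hypot(np.asarray(shear_x, dtype=float), np.asarray(shear_y, dtype=float))
    return bool(np.any(shear / normal > mu))

# Example with three taxels: the third contact is on the verge of slipping
print(slip_risk([1.0, 0.8, 0.2], [0.1, 0.2, 0.15], [0.0, 0.1, 0.05]))  # True

In practice the decision would be made over a time window of readings rather than a single frame, but the core signal (the ratio of shear to normal force at each contact) is the same.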

C) We proposed a novel approach to teach in-hand manipulation actions to a robot by physically moving the robot fingers while the robot is holding an object (without the object falling); the robot can then reproduce the same action, or variations of it, also with different objects [7]. For example, we can teach the robot how to rotate a cylinder by 90 degrees (e.g. unscrewing the cap of a plastic bottle), and then the robot can use this knowledge to rotate a different object (e.g. a bolt) by 45 degrees.
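
As a minimal illustration of how a demonstrated action can be reproduced as a variation (e.g. obtaining a 45-degree rotation from a 90-degree demonstration), the Python sketch below simply re-scales the recorded rotation trajectory; the actual learning and control pipeline of [7] is more sophisticated, and the function and variable names here are hypothetical.

import numpy as np

# Minimal sketch (not the controller of [7]): linearly re-scale a demonstrated
# in-hand rotation trajectory so that the same skill reaches a different final angle.
def scale_rotation_demo(demo_angles, target_rotation):
    # demo_angles: object rotation angles (rad) recorded over time, starting at 0
    demo_angles = np.asarray(demo_angles, dtype=float)
    demonstrated_rotation = demo_angles[-1]  # total rotation shown by the teacher
    return demo_angles * (target_rotation / demonstrated_rotation)

# A 90-degree demonstration replayed as a 45-degree rotation
demo = np.linspace(0.0, np.pi / 2, 50)
scaled = scale_rotation_demo(demo, np.pi / 4)
print(np.degrees(scaled[-1]))  # 45.0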

D) We demonstrated how a set of inexpensive devices (i.e. Leap Motion, LilyPad board and Vibe motors) can be used for tracking the movements of the human fingers while giving haptic feedback in the form of vibrations [11]. This can be used to teleoperate a robotic hand or to interact with a virtual environment. In one experiment, participants were asked to virtually "squeeze" different objects visualised on a computer screen, by simply moving their fingers, and they could successfully tell which of two objects was softer (or harder) basing their judgement only on the haptic feedback they received through the vibrations [8]. In a different experiment, participants were asked to virtually "rub" different surfaces (i.e. surfaces made of different materials, which therefore had different textures) by simply moving one finger back and forth, without being able to see the specific surface, and they could successfully tell (i.e. better than chance) what material the "real" surface was made of [9]. In addition, we integrated other devices (Oculus Rift VR headset, Kinect depth cameras) to realise a more immersive teleoperation of a robot manipulator [10]. Initial experiments show that these techniques can also be used to teach a robot how to manipulate objects, and could therefore extend the results we obtained in [7]. One key step towards this is the ability to automatically segment the different components of a human demonstration [12]: for example, a demonstration in which the human user teleoperates the robot to pick a banana from one bin and place it in another can be segmented into approaching, picking, transporting and placing the banana; the picking part might be further segmented into a non-prehensile manipulation stage (in which one banana is singled out from a clutter of bananas) and the actual picking/lifting stage. This segmentation is fundamental because the collected data can then be used to inform (e.g. train) different robot controllers that deal with different actions (e.g. either picking or transporting).
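
To illustrate the kind of segmentation described above, the Python sketch below labels the timesteps of a teleoperated pick-and-place demonstration using two simple signals, end-effector speed and a binary contact flag; the system in [12] is multimodal and learned, so this rule-based version, its thresholds and its names are purely illustrative.

# Illustrative rule-based segmentation of a teleoperated demonstration.
# Not the multimodal method of [12]; thresholds and labels are assumptions.
def segment_demo(speeds, contacts, moving_thresh=0.02):
    # speeds: end-effector speed (m/s) per timestep
    # contacts: True while the gripper is in contact with the object
    labels = []
    for speed, in_contact in zip(speeds, contacts):
        if not in_contact and speed > moving_thresh:
            labels.append("approach")
        elif in_contact and speed > moving_thresh:
            labels.append("transport")
        elif in_contact:
            labels.append("pick/place")  # in contact but (nearly) stationary
        else:
            labels.append("idle")
    return labels

print(segment_demo([0.10, 0.00, 0.10, 0.00], [False, True, True, True]))
# ['approach', 'pick/place', 'transport', 'pick/place']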

[1] B. Denoun, B. Leon, M. Hansard, L. Jamone. Grasping Robot Integration and Prototyping: The GRIP Software Framework. IEEE Robotics and Automation Magazine, 28(2), 101-111, 2021.
[2] E. Kirby, R. Zenha, L. Jamone. Comparing Single Touch to Dynamic Exploratory Procedures for Robotic Tactile Object Recognition. IEEE Robotics and Automation Letters, 2022. DOI: 10.1109/LRA.2022.3151261.
[3] R. Zenha, B. Denoun, C. Coppola, L. Jamone. Tactile Slip Detection in the Wild Leveraging Distributed Sensing of both Normal and Shear Forces. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021.
[4] M. S. Siddiqui, C. Coppola, G. Solak, L. Jamone. Grasp Stability Prediction for a Dexterous Robotic Hand combining Depth Vision and Haptic Bayesian Exploration. Frontiers in Robotics and AI, 8, 237, 2021.
[5] P. Ribeiro, S. Cardoso, A. Bernardino, L. Jamone. Fruit quality control by surface analysis using a bio-inspired soft tactile sensor. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, USA, 2020. Best Paper Finalist.
[6] P. Ribeiro, S. Cardoso, A. Bernardino, L. Jamone. Highly sensitive bio-inspired sensor for fine surface exploration and characterization. IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020.
[7] G. Solak, L. Jamone. Learning by Demonstration and Robust Control of Dexterous In-Hand Robotic Manipulation Skills. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 2019.
[8] B. Junput, X. Wei, L. Jamone. Feel it on your Fingers: Dataglove with Vibrotactile Feedback for Virtual Reality and Telerobotics. Towards Autonomous Robotic Systems Conference (TAROS), London, UK, 2019.
[9] B. Junput, I. Farkhatdinov, L. Jamone. Touch it, rub it, feel it! Haptic rendering of physical textures with a low cost wearable system. Towards Autonomous Robotic Systems Conference (TAROS), Nottingham, UK, 2020.
[10] B. Omarali, B. Denoun, K. Althoefer, L. Jamone, M. Valle, I. Farkhatdinov. Virtual Reality based Teleoperation Framework for Remote Environment Exploration with Depth Cameras. IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 2020.
[11] C. Coppola, G. Solak, L. Jamone. An affordable system for the teleoperation of dexterous robotic hands using Leap Motion hand tracking and vibrotactile feedback. IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 2022.
[12] C. Coppola, L. Jamone. Master of Puppets: Multi-modal Robot Activity Segmentation from Teleoperated Demonstrations. IEEE International Conference on Development and Learning (ICDL), London, UK, 2022.
[13] A. Bonzini, L. Seminara, L. Jamone. Improving Haptic Exploration of Object Shape by Discovering Symmetries. IEEE International Conference on Robotics and Automation (ICRA), Philadelphia, USA, 2022.
[14] A. Bonzini, L. Seminara, S. Macciò, A. Carfì, L. Jamone. Leveraging symmetry detection to speed up haptic object exploration in robots. IEEE International Conference on Development and Learning (ICDL), London, UK, 2022.
Exploitation Route 1) Other researchers in the field can now use the commercial devices that we have tested to conduct additional experiments; industrial stakeholders can use them for telerobotic and virtual reality applications.
2) Other researchers in the field or industrial stakeholders can use our open source (ROS-compatible) software to teach in-hand manipulation actions to robots and reproduce them.
3) Other researchers in the field or industrial stakeholders can reproduce our sensors (their design is open source) and apply them to similar tasks.
Sectors Aerospace, Defence and Marine; Agriculture, Food and Drink; Construction; Digital/Communication/Information Technologies (including Software); Leisure Activities, including Sports, Recreation and Tourism; Manufacturing, including Industrial Biotechnology; Retail

URL https://man3project.blogspot.com/
 
Description One objective of the project was to further develop technologies for tactile sensing in robotics, or in other words, "electronic skin" technology. Leveraging additional support from UKRI Impact Acceleration funds available at QMUL, we developed a prototype of a novel electronic skin that was integrated on robotic hands and grippers and tested in our lab for the task of grasping and manipulating delicate food items; a patent application has been filed and is currently pending. In addition, the prototype was integrated in a robotic gripper developed by the UK company Wootzano and tested within their pipeline for the automatic robotic packaging of grapes. The preliminary results show that this innovative solution may improve the performance of the robotic packaging, resulting in economic impact.
First Year Of Impact 2022
Sector Agriculture, Food and Drink
Impact Types Economic