Multimodal active sensing and perception for human-robot collaboration.

Lead Research Organisation: University of Bath
Department Name: Electronic and Electrical Engineering

Abstract

Significant work has been done on robotic perception from individual sensors, along with studies of fusing multiple sensor modalities for more reliable event classification. Active sensing, where a robot adjusts its physical interaction with the world to enhance the reliability of sensory data, has shown promising results. The combination of multimodal sensor fusion and active sensing has received limited investigation, but is key to robust robotic perception. Biology offers the inspiration: humans do not use a single mode of sensing the environment but all those available, and we do not statically observe the world but actively interact with it to understand what is not immediately apparent, as succinctly stated by Gibson: "we do not just see, we look" (Gibson 1973).
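
The paragraph above frames active sensing in terms of information gain: the robot chooses its next physical interaction to make its sensory evidence more reliable. As a minimal sketch of one common formulation (not the project's own method), the loop below maintains a belief over object classes and greedily picks the sensing action with the lowest expected posterior entropy; the sensor model and the class and action counts are illustrative assumptions.

    # Minimal sketch of uncertainty-driven active sensing; every name,
    # count and distribution here is an illustrative assumption.
    import numpy as np

    rng = np.random.default_rng(0)

    N_CLASSES, N_ACTIONS = 4, 3
    # Hypothetical sensor model p(observation | class, action) with a
    # binary observation per action; rows sum to 1 by construction.
    sensor_model = rng.dirichlet(np.ones(2), size=(N_ACTIONS, N_CLASSES))

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def expected_posterior_entropy(belief, action):
        """Expected entropy of the belief after observing `action`'s outcome."""
        h = 0.0
        for obs in range(2):
            likelihood = sensor_model[action, :, obs]
            p_obs = likelihood @ belief              # marginal p(obs)
            if p_obs > 0:
                posterior = likelihood * belief / p_obs
                h += p_obs * entropy(posterior)
        return h

    belief = np.full(N_CLASSES, 1.0 / N_CLASSES)     # uniform prior
    true_class = 2
    for step in range(5):
        # Active step: pick the action expected to reduce entropy the most.
        action = min(range(N_ACTIONS),
                     key=lambda a: expected_posterior_entropy(belief, a))
        # Simulate the observation, then apply the Bayes update.
        obs = rng.choice(2, p=sensor_model[action, true_class])
        belief = sensor_model[action, :, obs] * belief
        belief /= belief.sum()
        print(f"step {step}: action={action}, entropy={entropy(belief):.3f}")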

The use of current industrial and other robotic platforms is limited by safety concerns. To make the most of the technology, robotic platforms must be fully integrated into our working and living environments with seamless interaction, not caged off behind safety barriers or reduced in capability when in proximity to humans. This project aims to enhance collaboration between humans and robots by enabling safer interaction in a more natural way. This will be achieved through improved sensing of the environment, fused using machine learning techniques to give more robust perception and more reliable decision making.

This project aims to deliver a system for safe human-robot collaboration. The main objectives are as follows:
1. Develop a wearable device with inertial measurement unit (IMU) and electromyography (EMG) sensor fusion for movement recognition and classification (a minimal fusion sketch follows this list).
2. Fuse wearable device data with further sensor types, primarily vision, for natural human-robot interaction in an unstructured environment.
3. Develop computational intelligence methods, primarily focussed on artificial neural networks, for which robust perception and safe behaviour can be guaranteed, allowing close human-robot collaboration.
4. Integrate the system into a KUKA robot arm platform and other applications, demonstrating the cross-platform validity of the core computational methods.
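
As a rough illustration of objective 1, the sketch below fuses IMU and EMG windows at the feature level by concatenating simple per-channel statistics and training a small neural network classifier. The window length, channel counts, feature choices and synthetic data are all illustrative assumptions, not details taken from the project.

    # Minimal sketch of feature-level IMU/EMG fusion for movement
    # classification on synthetic stand-in data; all shapes, features
    # and classes are illustrative assumptions.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)

    def window_features(imu, emg):
        """Concatenate simple per-channel statistics from both modalities."""
        feats = []
        for sig in (imu, emg):                       # fusion by concatenation
            feats += [sig.mean(axis=0), sig.std(axis=0),
                      np.abs(np.diff(sig, axis=0)).mean(axis=0)]
        return np.concatenate(feats)

    # Synthetic data: 200 windows of 6-axis IMU and 8-channel EMG samples,
    # with three hypothetical movement classes baked into the statistics.
    n_windows, win_len = 200, 128
    labels = rng.integers(0, 3, size=n_windows)
    X = np.stack([
        window_features(rng.normal(labels[i], 1.0, (win_len, 6)),
                        rng.normal(0.0, 1.0 + labels[i], (win_len, 8)))
        for i in range(n_windows)
    ])

    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")

Feature-level concatenation is only the simplest fusion strategy; decision-level fusion, combining per-modality classifiers, is a common alternative when the modalities can fail independently.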

Reliable robotic sensing and perception methods for use in human interaction scenarios are key to allowing robots to become common in our everyday lives. One of the large economic drivers for such development is the push to achieve 'Industry 4.0' (Othman 2016). Current robotic systems focus on 'automatic' operation, with predefined motions and stringent safety regulation for human interaction. To truly make the most of such systems they must become 'autonomous', allowing safe human interaction in unstructured environments. The development of such technology has many further use cases, such as elderly or disabled care, driverless vehicles and household assistants, to name a few. The need for development in the areas associated with this project is reflected in the research council's desire to grow or maintain levels of research in 'Robotics', 'Manufacturing Technologies', 'Assistive Technologies' and 'Artificial Intelligence Technologies'.

Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
EP/R513155/1                                   01/10/2018  30/09/2023
2281852            Studentship   EP/R513155/1  01/10/2019  31/03/2023  James MALE