Service Robot Adaptation to Users with Different Abilities

Lead Research Organisation: King's College London
Department Name: Informatics


As modern robots become increasingly present in our workspaces, public spaces, and homes, we want to facilitate easier and more natural human-robot interaction and collaboration. One of the key challenges is recognising that users may differ significantly from one another. We therefore wish to explore how to recognise this diversity and enable robots to infer the capabilities and plans of their collaborator. By doing so, the robots know what to expect from their collaborator and can plan to maximise their assistance, while also adapting online to benefit the collaborator and the interaction.

The key research objective is to determine how implicit physiological and social signals produced by humans during an interaction can be leveraged and interpreted as feedback for the robot to act upon. We want to incorporate this feedback into an adaptive framework by which the robot learns to adapt and tailor its policies to the user. Furthermore, we want to infer the person's capabilities from observations, and reason about them when computing plans for shared tasks. We want to implement these frameworks in collaborative tasks such as block stacking and a gamified cooking task.
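One way such an adaptive framework could be sketched is as a simple bandit learner that treats an implicit signal as scalar feedback for choosing between assistance styles. This is a purely illustrative toy: the style names, the gaze-engagement signal, and the signal-to-reward mapping are assumptions for the sketch, not the project's actual design.

```python
import random

def signal_to_reward(gaze_engagement):
    """Hypothetical mapping from an implicit signal (gaze engagement
    scaled to [0, 1]) to a scalar reward centred around zero."""
    return 2.0 * gaze_engagement - 1.0

class AdaptiveRobot:
    """Epsilon-greedy bandit over a small set of assistance styles,
    updating each style's estimated value from implicit feedback."""

    def __init__(self, styles, epsilon=0.1):
        self.styles = styles
        self.epsilon = epsilon
        self.values = {s: 0.0 for s in styles}
        self.counts = {s: 0 for s in styles}

    def choose(self):
        # Explore occasionally, otherwise pick the best-valued style.
        if random.random() < self.epsilon:
            return random.choice(self.styles)
        return max(self.styles, key=lambda s: self.values[s])

    def update(self, style, gaze_engagement):
        # Incremental mean update of the chosen style's value.
        r = signal_to_reward(gaze_engagement)
        self.counts[style] += 1
        self.values[style] += (r - self.values[style]) / self.counts[style]
```

Run against a simulated user who engages more with one style, the learner's value estimates come to favour that style, which is the adaptation loop in miniature.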

To achieve these objectives, I will first conduct a study linking human signals to mental states in a human-robot interaction scenario, investigating whether signals such as eye gaze provide information about the human's mental state. I will then incorporate the signals identified in the study as feedback for a learning agent that tries to optimise its behaviour and tailor it to its collaborator. Lastly, I will develop a learning algorithm that observes a person's trajectories, constructs an underlying symbolic model explaining those trajectories, and uses this model to plan the robot's actions in an interaction, complementing those of the human.
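The final step, inferring capabilities from observed trajectories and planning complementary actions, can be illustrated with a deliberately minimal sketch. Here a "symbolic model" is reduced to the set of actions the person has been observed performing; the action names and task are hypothetical examples in the spirit of the block-stacking task, not the project's actual algorithm.

```python
def infer_capabilities(trajectories):
    """Infer which actions a person appears able to perform from
    observed action traces (a toy stand-in for learning a symbolic
    model that explains the observed trajectories)."""
    capable = set()
    for trajectory in trajectories:
        capable.update(trajectory)
    return capable

def plan_complementary(task_actions, human_capabilities):
    """Split a shared task: the human keeps the actions they were
    observed performing; the robot takes on the remainder."""
    human_actions = [a for a in task_actions if a in human_capabilities]
    robot_actions = [a for a in task_actions if a not in human_capabilities]
    return human_actions, robot_actions
```

For example, if a person is only ever observed picking and placing blocks, the robot would assign itself any remaining task action (such as stacking a tall tower) when computing the shared plan.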

The expected novel contributions are three frameworks: a framework that uses signals from the human to adapt a robot's behaviour in a human-robot interaction; a framework that builds human models explaining observed behaviours and can then reason about the human's capabilities; and lastly, a framework unifying these two elements in a single reasoning system that can adapt online while also reasoning about the collaborator when planning the robot's actions.

This PhD is relevant to the following EPSRC research areas: Artificial Intelligence and Robotics.



Studentship Projects

Project Reference: EP/S023356/1, 31/03/2019 to 29/09/2027
Studentship: 2605778 (related to EP/S023356/1), 30/09/2021 to 29/09/2025, Student: Peter Tisnikar