Implicit human-robot interfacing in robotic surgery
Lead Research Organisation: Imperial College London
Department Name: Computing
Abstract
A key motivation for this work is to develop methods for robotic control and navigation that feel transparent to the user. These interfaces should act implicitly, based on the context of the surgical scene, rather than requiring explicit user input. First, an implicit approach to controlling the surgical camera and a focussed energy delivery system is explored, using gaze information without active gestures. This approach is then extended to intention recognition, dynamically adapting the motion scaling of master-slave systems. Bayesian approaches to tuning these interfaces are also studied, demonstrating the importance of not relying on manual tuning. The intention recognition methods are then extended and applied to augmented reality, to provide optimal display behaviour in robotic applications. Finally, an approach combining augmented reality and force feedback is developed to provide implicit assistance for hand-held robotic instruments. Throughout this work, results from studies conducted for each of the proposed methods demonstrate the capabilities and potential clinical value of context-aware interfaces for improving performance.
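To illustrate the kind of context-aware behaviour the abstract describes, the sketch below shows one plausible form of gaze-driven motion scaling: the master-slave scaling factor relaxes when the operator's gaze is far from the instrument tip (suggesting a large intended travel) and tightens when gaze fixates near the tip (suggesting fine manipulation). This is a minimal illustration only; the function names, thresholds, and exponential falloff are assumptions for exposition, not the implementation developed in the project.

```python
import numpy as np

def gaze_based_motion_scale(
    gaze_point: np.ndarray,    # 2D gaze fixation on the endoscopic view (pixels)
    tool_tip: np.ndarray,      # 2D projection of the instrument tip (pixels)
    fine_scale: float = 0.2,   # slave moves 0.2x the master motion near the fixation
    coarse_scale: float = 1.0, # unity scaling for large transfers
    radius_px: float = 150.0,  # hypothetical falloff radius around the gaze point
) -> float:
    """Map gaze-to-tool distance to a master-slave motion scaling factor.

    Illustrative sketch: a smooth exponential falloff so the scaling
    changes continuously instead of switching abruptly between modes.
    """
    distance = float(np.linalg.norm(gaze_point - tool_tip))
    # Weight -> 1 when the gaze fixates on the tip, -> 0 when far away.
    weight = np.exp(-(distance / radius_px) ** 2)
    return coarse_scale + (fine_scale - coarse_scale) * weight

# Example: gaze fixated 40 px from the tip yields near-fine scaling (~0.26).
scale = gaze_based_motion_scale(np.array([640.0, 360.0]), np.array([600.0, 360.0]))
slave_delta = scale * np.array([1.0, 0.5, 0.0])  # scaled master increment (mm)
```

The continuous falloff is one way such an interface could stay transparent to the user: scaling adapts with the surgical context rather than requiring an explicit mode switch.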
Medical robotics
People | ORCID iD |
---|---|
Guang-Zhong Yang (Primary Supervisor) | |
Gauthier Gras (Student) | |
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name |
---|---|---|---|---|---|
EP/N509486/1 | | | 30/09/2016 | 30/03/2022 | |
2291711 | Studentship | EP/N509486/1 | 30/09/2016 | 29/09/2017 | Gauthier Gras |