Perceptual Docking for Robotic Control (Equipment Rich Proposal)

Lead Research Organisation: Imperial College London
Department Name: Institute of Biomedical Engineering

Abstract

The quest to protect humans from direct exposure to hazardous or safety-critical environments has been a driving force behind technological developments in robotics. As robotics and related technologies continue to mature, researchers across the globe are turning their attention to what is perhaps the most challenging safety-critical environment of all: the human body. Deploying robots around and within the human body, particularly for robotic surgery, presents a number of unique and challenging problems that arise from the complex and often unpredictable environments that characterise the human anatomy. Master-slave robots such as the da Vinci system, which embodies the movements of trained minimal access surgeons through motion scaling and compensation, are gaining clinical significance. Micro-machines with sensors and actuators based on MEMS technology are also rapidly emerging. Under the dichotomy of autonomous and manipulator technologies in robotics, the intelligence of the robot is typically pre-acquired through high-level abstraction and environment modelling. For procedures that involve complex anatomy and large tissue deformation, however, this is known to create major difficulties. The regulatory, ethical and legal barriers imposed on interventional surgical robots also give rise to the need for tightly integrated control between the operator and the robot when autonomy is being pursued. The aim of this project is to research a new concept of perceptual docking for robotic control. The term docking differs in meaning from its conventional use in mobile robotics. It represents a fundamental paradigm shift in perceptual learning and knowledge acquisition for robotic systems, in that operator-specific motor and perceptual/cognitive behaviour is assimilated in situ through a gaze-contingent framework. We hypothesise that saccadic eye movements and ocular vergence can be used for attention selection and for recovering the 3D motion and deformation of soft tissue during minimally invasive surgery (MIS) procedures. It is expected that the method will also open up a range of completely new opportunities for effective human-machine interaction. This proposal seeks equipment and research funding to establish an integrated core experimental facility at Imperial College for investigating key challenges related to perceptual docking in robotics and allied research issues related to human-machine interaction, visual perception and machine learning, ergonomics, kinematics and actuation design, intra-operative image guidance, motion and biomechanical modelling, tissue-instrument interaction, and robotic control. Matching funding from the Institute of Biomedical Engineering at Imperial has already been secured, and the facility is expected to greatly enhance multidisciplinary research capacity and facilitate the interaction and initiation of new research programmes across a number of different disciplines.
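The vergence-based depth recovery hypothesised above can be illustrated with a minimal geometric sketch: given calibrated gaze rays for each eye expressed in a common coordinate frame, the fixation point can be estimated as the midpoint of the shortest segment connecting the two rays. The function below is a hypothetical illustration of that principle only, not the project's implementation; the function name, inputs and threshold are assumptions.

    import numpy as np

    def fixation_point(origin_l, dir_l, origin_r, dir_r):
        # Illustrative only: estimate the 3D fixation point as the midpoint of
        # the closest approach between the left- and right-eye gaze rays.
        # origin_* are eye positions, dir_* are gaze directions, in one frame.
        d_l = dir_l / np.linalg.norm(dir_l)
        d_r = dir_r / np.linalg.norm(dir_r)
        w0 = origin_l - origin_r
        b = float(d_l @ d_r)                # cosine of the angle between the rays
        d = float(d_l @ w0)
        e = float(d_r @ w0)
        denom = 1.0 - b * b                 # ~0 when the rays are nearly parallel
        if denom < 1e-9:
            raise ValueError("Gaze rays nearly parallel; depth is unobservable.")
        s = (b * e - d) / denom             # parameter along the left-eye ray
        t = (e - b * d) / denom             # parameter along the right-eye ray
        p_l = origin_l + s * d_l
        p_r = origin_r + t * d_r
        return 0.5 * (p_l + p_r)            # midpoint of the shortest connecting segment

Because the vergence angle shrinks as fixation depth increases, depth recovery of this kind becomes ill-conditioned at larger working distances, which is why the sketch guards against near-parallel rays.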

Publications

Lerotic M (2007) Super resolution in robotic-assisted minimally invasive surgery in Computer Aided Surgery

Merrifield R (2015) Surgical Robot Challenge 2015 [Competitions] in IEEE Robotics & Automation Magazine

Merrifield R (2017) U.K. Robotics Week [Competitions] in IEEE Robotics & Automation Magazine

Noonan DP (2009) Laser-induced fluorescence and reflected white light imaging for robot-assisted MIS in IEEE Transactions on Biomedical Engineering

Stoyanov D (2008) Intra-Operative Visualizations: Perceptual Fidelity and Human Factors in Journal of Display Technology

 
Description The aim of this project is to research a new concept of perceptual docking for robotic control. It represents a fundamental paradigm shift in perceptual learning and knowledge acquisition for robotic systems, in that operator-specific motor and perceptual/cognitive behaviour is assimilated in situ through a gaze-contingent framework. We hypothesise that saccadic eye movements and ocular vergence can be used for attention selection and for recovering the 3D motion and deformation of soft tissue during robotically assisted surgical procedures. The project has resulted in a range of gaze-contingent (eye-controlled) robotic control platforms based on binocular eye tracking for anatomical registration, motion stabilisation, active constraints and dynamic motor channelling. The project has opened up many new opportunities for effective human-machine interaction in robotically assisted surgery.
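As a rough illustration of how a gaze-derived fixation point could drive an active constraint of the kind listed above, the sketch below attenuates commanded instrument-tip velocity components that move the tip away from a spherical region centred on the current fixation point. This is a hypothetical example only, not the controller developed in the project; the function name, attenuation law and parameter values are assumptions.

    import numpy as np

    def gaze_constrained_velocity(tip_pos, cmd_vel, fixation, safe_radius, stiffness=50.0):
        # Hypothetical gaze-contingent active constraint (illustrative only).
        # Velocity components that would drive the instrument tip further away
        # from a sphere of radius `safe_radius` around the gaze fixation point
        # are progressively attenuated; all other motion passes through unchanged.
        offset = tip_pos - fixation
        dist = float(np.linalg.norm(offset))
        if dist <= safe_radius:
            return cmd_vel                          # inside the gaze-defined region: unconstrained
        outward = offset / dist                     # unit vector pointing away from the fixation point
        v_out = max(float(cmd_vel @ outward), 0.0)  # outward component of the commanded velocity
        gain = np.exp(-stiffness * (dist - safe_radius))  # attenuation grows with penetration
        return cmd_vel - (1.0 - gain) * v_out * outward

With the fixation point updated at the eye-tracker rate, the same attenuation could equally be inverted to channel instrument motion towards the fixated anatomy rather than away from a boundary, which is closer in spirit to dynamic motor channelling.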
Exploitation Route The use of eye control for human-computer and human-machine interaction has a range of applications, from entertainment, general computer interfacing and the control of mobile devices to assistive technologies. This is an equipment-rich grant for establishing an integrated core experimental facility at Imperial College for investigating key challenges related to perceptual docking in robotics and allied research issues related to human-robot interaction.
Sectors Digital/Communication/Information Technologies (including Software), Healthcare

URL http://www.imperial.ac.uk/hamlyn
 
Description The project has allowed the establishment of a research facility for perceptual docking at Imperial College and strengthened collaboration with industry, particularly Intuitive Surgical Inc. This has led to joint patent development and the initiation of new UK/EU projects to develop a range of new surgical robot platforms.
First Year Of Impact 2008
Sector Digital/Communication/Information Technologies (including Software), Healthcare
Impact Types Societal

 
Description ARAKNES
Amount € 1,381,868 (EUR)
Funding ID FP7 224565 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 04/2008 
End 04/2012
 
Description Robotic Assisted Microsurgery (Wolfson Foundation)
Amount £3,000,000 (GBP)
Organisation The Wolfson Foundation 
Sector Charity/Non Profit
Country United Kingdom
Start 02/2010 
End 04/2012
 
Description Robotic Assisted Surgical Guidance and Visualisation
Amount £879,943 (GBP)
Funding ID DT/E011101/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 05/2007 
End 11/2010
 
Description Royal Academy of Engineering Fellowship
Amount £485,228 (GBP)
Funding ID Danail Stoyanov 
Organisation Royal Academy of Engineering 
Sector Charity/Non Profit
Country United Kingdom
Start 03/2009 
End 03/2014
 
Description INTERACT workshop on HCI and HRI 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact Workshop hosted by Sheffield Robotics to explore potential collaborations between the Human-Robot Interaction and Human-Computer Interaction research communities. Attendees: 43 delegates, representing 23 UK universities. Outcome: the universities of Sheffield, Hertfordshire and Nottingham aim to apply for an EPSRC Network grant to maintain momentum, and network members were identified.
Year(s) Of Engagement Activity 2016