Research on robot-assisted minimally invasive surgical device with 3D image navigation

Lead Research Organisation: University College London
Department Name: Medical Physics and Biomedical Eng

Abstract

1) Brief description of the context of the research including potential impact
Cancer is one of the leading causes of death worldwide and a significant barrier to increasing life expectancy: an estimated 19.3 million new cancer cases and almost 10.0 million cancer deaths occurred in 2020. Robot-assisted minimally invasive surgery is a surgical technique in which modern medical instruments are passed through small incisions in the body surface to perform operations such as identifying early signs of cancer and removing tumour tissue. It has become mainstream because of its clinical advantages: a low risk of post-operative complications, short post-operative recovery times and small wounds.
2) Aims and Objectives
However, locating the target and performing complex surgery through a two-dimensional (2D) display of an endoscopic video stream can occasionally lead to disorientation. To this end, this study will address two challenges in order to advance robot-assisted minimally invasive surgery.
Challenge 1: Low-quality endoscopic images
Poor illumination often results in low-quality endoscopic images, which degrades the computer's ability to interpret the surgical environment.
Challenge 2: Surgical instrument localization and segmentation
Robot-assisted minimally invasive surgical instruments include endoscopic cameras and surgical tools. Intraoperative segmentation and localization of these instruments are essential components of endoscopic navigation. However, owing to the poor imaging environment inside the human body, and in particular the deformation of soft tissue, image-guided endoscopic navigation still has considerable room for improvement in accuracy and applicability.
3) Novelty of Research Methodology
This study will design a robot-assisted minimally invasive surgical device consisting of a monocular camera surrounded by an array of micro-LEDs, which can be mounted on a surgical tool to provide 3D scene navigation for robot-assisted minimally invasive surgery. A swallowable capsule endoscope capable of 3D scene reconstruction has been developed; swallowed by the patient prior to surgery, it reconstructs a three-dimensional view of the patient's body so that the surgeon can locate the lesion. In combination with simultaneous localization and mapping (SLAM), the surgical instruments are then localized inside the patient in real time. Throughout the procedure, the illumination is adjusted through a matrix of brightness values for the micro-LEDs, driven by real-time image data, to achieve uniform lighting.
(1) Combination of SLAM and instrument segmentation
The system is split into two parallel threads. One thread estimates the camera pose and the deformation of the scene, while the other segments the surgical tools and estimates their pose, so that navigation better fits the deforming internal cavity (see the sketch below).
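As an illustration only, the following Python sketch shows one possible way to arrange the two threads around a shared frame source. The functions track_frame and segment_tools are hypothetical stand-ins for a deformable SLAM front end and a tool-segmentation model, not components of the actual system.

import queue
import threading

import numpy as np

slam_frames = queue.Queue(maxsize=2)   # frames for the pose/deformation thread
seg_frames = queue.Queue(maxsize=2)    # frames for the tool-segmentation thread


def track_frame(frame):
    # Stand-in for a deformable SLAM front end: returns a camera pose and a
    # per-pixel deformation field (identity pose, zero deformation here).
    return np.eye(4), np.zeros(frame.shape[:2] + (3,))


def segment_tools(frame):
    # Stand-in for a tool-segmentation network: returns a binary mask.
    return np.zeros(frame.shape[:2], dtype=bool)


def slam_worker():
    while True:
        frame = slam_frames.get()
        if frame is None:                 # sentinel: shut down
            break
        pose, deformation = track_frame(frame)
        # ...fuse pose and deformation into the navigation map here...


def segmentation_worker():
    while True:
        frame = seg_frames.get()
        if frame is None:
            break
        mask = segment_tools(frame)
        # ...estimate tool pose from the mask and overlay it on the 3D scene...


if __name__ == "__main__":
    workers = [threading.Thread(target=slam_worker),
               threading.Thread(target=segmentation_worker)]
    for w in workers:
        w.start()
    frame = np.zeros((480, 640, 3), dtype=np.uint8)   # dummy endoscopic frame
    slam_frames.put(frame)
    seg_frames.put(frame)
    slam_frames.put(None)                             # stop both workers
    seg_frames.put(None)
    for w in workers:
        w.join()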
(2) Environment-adaptive illumination LED array
Adjusting the brightness of the LED array makes it possible to achieve uniform illumination (a sketch of one possible update rule follows). Implantable, wirelessly controlled micro-LED devices offer a potential route to solving the problem of poor illumination.
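The sketch below illustrates, under assumed hardware and image geometry, how a brightness matrix for the micro-LED array could be updated from each frame: the image is split into tiles (one per LED), and each LED's drive level is nudged so that its tile's mean luminance approaches a common target. The function update_led_matrix and its parameters are hypothetical, not the project's actual control scheme.

import numpy as np

def update_led_matrix(gray_frame, led_levels, target=0.5, gain=0.3):
    # gray_frame: 2D luminance image in [0, 1]; led_levels: (rows, cols) drive levels in [0, 1].
    rows, cols = led_levels.shape
    h, w = gray_frame.shape
    tile_h, tile_w = h // rows, w // cols
    new_levels = led_levels.copy()
    for r in range(rows):
        for c in range(cols):
            tile = gray_frame[r * tile_h:(r + 1) * tile_h,
                              c * tile_w:(c + 1) * tile_w]
            error = target - tile.mean()          # too dark -> positive error
            new_levels[r, c] = np.clip(led_levels[r, c] + gain * error, 0.0, 1.0)
    return new_levels

# Example: a 4x4 LED array starting at half power, updated from one dummy frame.
frame = np.random.default_rng(0).uniform(0.2, 0.8, size=(480, 640))
levels = np.full((4, 4), 0.5)
levels = update_led_matrix(frame, levels)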
4) Alignment to EPSRC's strategies and research areas
This work aims to use computer vision, artificial intelligence and embedded development techniques to address engineering challenges in robotic surgery. It aligns with the EPSRC Health Technologies Strategy, in particular its focus on innovative technologies for physical intervention, such as assistive technologies and surgical robotics, including minimally invasive autonomous technologies for robotic surgery.
5) Any companies or collaborators
No companies or collaborators.

Publications


Studentship Projects

Project Reference | Relationship | Related To | Start | End | Student Name
EP/S021930/1 | | | 01/10/2019 | 31/03/2028 |
2885460 | Studentship | EP/S021930/1 | 01/10/2023 | 30/09/2027 | Xu Wang