Machine Learning for the Planning and Evaluation of Image-guided Ablation of Solid-Organ Tumours

Lead Research Organisation: University College London
Department Name: Medical Physics and Biomedical Engineering


Brief description of the context of the research including potential impact
Tumours of solid organs (for example liver, kidney or prostate) make up a significant proportion of cancers affecting humans. One potential treatment option for these tumours is the use of minimally invasive techniques that allow the destruction of abnormal tissue using needles inserted through the skin, guided by medical imaging modalities such as computed tomography (CT).

However, medical imaging techniques are susceptible to artefacts (errors in the represented image) and changes in image quality and resolution. In the case of CT, this is often dependent on X-ray dose and the contrast agents injected intravenously to highlight blood vessels. Exposure to these needs to be kept to a minimum, but this can adversely affect image quality and resolution and potentially make the procedure more difficult. In addition, the artefacts from the needles used can severely degrade the CT image.

A rapidly expanding field in computational medical imaging is deep learning: the use of algorithms based on neural networks that give computers the ability to learn and improve their performance with experience. Some of these algorithms are particularly well suited to images, and it is thought that, using the high-resolution CT images acquired at the start of the procedure, such algorithms can learn to enhance the low-resolution images taken during it, a process known as super-resolution. By improving the quality of the images used to guide the needles (while avoiding increased exposure to X-rays and contrast agents), it should be possible to achieve more accurate treatment, potentially allowing more effective ablation of cancerous tissue while limiting unwanted side effects.
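As a toy illustration of the supervised principle behind super-resolution (not the project's actual method: the synthetic data, patch sizes and least-squares model here are all assumptions standing in for real CT volumes and a neural network), one can learn a mapping from low-resolution to high-resolution image patches from paired examples:

```python
import numpy as np

rng = np.random.default_rng(0)

def downsample(img, factor=2):
    """Average-pool by `factor`, simulating a lower-resolution acquisition."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Synthetic "high-resolution" training patches (stand-ins for pre-procedure CT).
high_res = [rng.random((8, 8)) for _ in range(500)]
low_res = [downsample(img) for img in high_res]

# Flatten each patch into a vector: X holds low-res inputs, Y high-res targets.
X = np.stack([lr.ravel() for lr in low_res])   # shape (500, 16)
Y = np.stack([hr.ravel() for hr in high_res])  # shape (500, 64)

# "Training": fit a linear map W minimising ||XW - Y||^2. A deep network
# replaces this linear map with a learned non-linear one.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Apply the learned mapping to an unseen low-resolution patch.
test_lr = downsample(rng.random((8, 8)))
pred = (test_lr.ravel() @ W).reshape(8, 8)
print(pred.shape)  # an 8x8 "super-resolved" patch from a 4x4 input
```

The key point carried over to the deep learning setting is that the high-resolution scans acquired at the start of a procedure provide the paired training targets; only the model class changes.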

Aims and objectives
To determine whether deep learning can be used to improve image quality in image-guided minimally invasive treatments.
Objectives (MRes year):
1. Literature review and acquisition of anonymised imaging data
2. Research into and planning of appropriate algorithms
3. Writing of prototype code
4. Testing of prototype code on the imaging data
5. Write-up of the MRes project report

Novelty of the research methodology
Super-resolution has previously been attempted using image registration algorithms, which map a high-resolution image onto a low-resolution one to achieve a similar result. This project, however, will be the first intra-operative application of super-resolution. In addition, the use of deep learning to tackle this problem provides a framework for further work using the same data, such as image artefact removal and prediction of outcome. Deep learning is well suited to both tasks, and the above methodology can be used as a basis for tackling related problems such as these.
In addition, applying deep learning to a large interventional radiology dataset will be a significant contribution, as this is normally a difficult application owing to the scarcity of data and the substantial intra-dataset variability arising from the operators, approaches and instruments used.

Alignment to EPSRC's strategies and research areas
Therapeutic imaging during treatment to increase effectiveness and improve patient outcomes
Automated extraction and/or integration of existing and additional information from clinical data/images (e.g. via machine learning and/or mathematical science techniques)
Accelerating research impact in this area through strong engagement with relevant stakeholders throughout research and training programmes (working with real patient data from clinical procedures, along with close involvement of a multidisciplinary team with translational experience)

Any companies or collaborators involved



Studentship Projects

Project Reference | Relationship | Related To | Start | End | Student Name
EP/S515255/1 | | | 30/09/2018 | 29/03/2023 |
2138766 | Studentship | EP/S515255/1 | 23/09/2018 | 22/09/2022 | Mark Pinnock