Virtual reality aid for ultrasound-guided needling

Lead Participant: MEDAPHOR LIMITED

Abstract

"We aim to revolutionise interventional ultrasound needling using augmented-reality and an artificial intelligence technique known as deep-learning.

During this project, we will develop and test an ultrasound needling assistant that:

1. Automatically locates the needle-tip using deep learning (i.e. by using computer vision to find the needle-tip in the ultrasound image; a minimal sketch of this idea follows the list)
2. Uses an augmented reality headset worn by the clinician to project a holographic ultrasound view over the patient's anatomy, highlighting the needle track and important anatomical structures in the correct position "within" the patient.
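As a rough illustration of point 1 only (this is not the project's actual model; the architecture, input size and names below are assumptions made for the example), a deep-learning needle-tip detector can be framed as a small convolutional network that regresses the tip's (x, y) position from a single greyscale ultrasound frame:

```python
# Minimal sketch, assuming a direct coordinate-regression formulation.
# All layer sizes and names are illustrative, not the project's model.
import torch
import torch.nn as nn

class NeedleTipRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 256 -> 128
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 128 -> 64
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pool
        )
        self.head = nn.Linear(64, 2)              # (x, y) in image coordinates

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

model = NeedleTipRegressor()
frame = torch.rand(1, 1, 256, 256)                # one dummy 256x256 ultrasound frame
tip_xy = model(frame)                             # predicted needle-tip position
print(tip_xy.shape)                               # torch.Size([1, 2])
```

In practice such a network would be trained on many annotated ultrasound frames, and heatmap or segmentation formulations are common alternatives to direct coordinate regression.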

Doctors use interventional needling in a variety of medical procedures: for example, to biopsy tissues, to drain fluid, to insert cannulas, and to administer regional anaesthesia in a procedure known as a peripheral nerve block. Particularly when guiding the needle to deep structures, it is important that they do not damage other tissue. Clinicians therefore need to be able to see the needle-tip. They often use ultrasound to do this since it is a safe imaging technique and the equipment can be brought to the bedside.

For many needling procedures, NICE recommends that ultrasound guidance always be used.

Ultrasound uses sound to visualise tissues. The ultrasound transducer emits a narrow beam of sound (rather like sonar on a submarine). Reflected sound is received by the machine and used to image the tissue it is reflected from.
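For intuition, the machine turns each echo's round-trip time into a depth using an assumed average speed of sound in soft tissue (roughly 1,540 m/s). The sketch below is purely illustrative and not project code:

```python
# Illustrative pulse-echo arithmetic: depth = speed_of_sound * time / 2,
# since the pulse travels to the reflector and back again.
SPEED_OF_SOUND_M_PER_S = 1540.0   # typical assumed value for soft tissue

def echo_depth_cm(round_trip_time_s: float) -> float:
    """Depth of the reflecting tissue, in centimetres."""
    return 100.0 * SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

# An echo returning after 65 microseconds comes from roughly 5 cm deep.
print(f"{echo_depth_cm(65e-6):.1f} cm")   # -> 5.0 cm
```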

Ultrasound needling is a difficult technique to master: the clinician must manipulate the needle and ultrasound transducer whilst looking at the ultrasound machine's screen (away from the patient) rather than at the patient themselves. They must manually keep the needle-tip within the ultrasound beam whilst advancing it towards its target. If the tip moves out of plane, it can become confused with the needle-shaft on the ultrasound image with potentially serious consequences.

Augmented reality allows the ultrasound view to be placed over the patient, so the physical needle and the ultrasound image can be seen at the same time, with the needle and important anatomical structures highlighted in that view.
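The abstract does not describe how the overlay is registered to the patient, but a system like this typically has to map a needle-tip detection from ultrasound image pixels into the headset's world coordinates using the tracked pose of the transducer. The sketch below is a hedged illustration of that coordinate chain; the axis convention, transform and numbers are assumptions:

```python
# Hedged sketch of mapping an image-plane detection into world coordinates
# so a hologram can be drawn "within" the patient. Illustrative only.
import numpy as np

def tip_pixel_to_world(u, v, mm_per_pixel, T_world_from_probe):
    """Map a needle-tip pixel (u, v) to 3D world coordinates in millimetres."""
    # Assumed convention: the image lies in the probe frame's x-z plane,
    # so the probe-frame point is (u*s, 0, v*s) with homogeneous coordinate 1.
    p_probe = np.array([u * mm_per_pixel, 0.0, v * mm_per_pixel, 1.0])
    return (T_world_from_probe @ p_probe)[:3]

# Example: probe pose from the headset's tracking, 0.2 mm pixels,
# needle-tip detected at pixel (120, 300).
T_world_from_probe = np.array([[1, 0, 0,  50.0],
                               [0, 1, 0, 100.0],
                               [0, 0, 1, -30.0],
                               [0, 0, 0,   1.0]])
print(tip_pixel_to_world(120, 300, 0.2, T_world_from_probe))  # [ 74. 100.  30.]
```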

This will help the patient (by helping the clinician hit the biopsy target and avoid damage to adjacent structures), the health service (by reducing procedure time and cost) and the clinician (by making the procedure more ergonomic and reducing repetitive strain injury).

Together, these improvements will give significant benefits to the NHS (both for patients and in improving the healthcare economics of ultrasound-guided needling), as well as significant export potential for a world-leading new digital health technology.

Lead Participant             Project Cost    Grant Offer
MEDAPHOR LIMITED             £634,876        £444,413

Participants                 Project Cost    Grant Offer
INNOVATE UK
INTELLIGENT ULTRASOUND LTD
CARDIFF UNIVERSITY           £21,530         £21,530

Publications
