Integrating Haptic and Visual Data for Low-Latency Transmission
Lead Research Organisation:
King's College London
Department Name: Imaging & Biomedical Engineering
Abstract
Precisely determining the contact force during safe interaction in Minimally Invasive Surgery is still an open research challenge. Inspired by post-operative qualitative analysis of surgical videos, cross-modality, data-driven deep neural network models have become one of the newest approaches to predicting sensorless force trends. The limited availability of vision-state-force datasets motivated the construction of a teleoperated laparoscopic forceps driven by a robotic arm, in order to reproduce the forces such a tool experiences when interacting with soft environments.
To address this task, we use a laparoscopic forceps mounted on a robotic arm, which we teleoperate using a haptic device. We collect a dataset containing visual, robot-state and force readings for different types of soft silicone environments. We analyse in depth how different artificial intelligence models predict forces, first under a randomised training scheme. Because the dataset is highly varied, we then test how well these models adapt to different geometries, colours and stiffnesses, and how much the visual and state inputs contribute to this adaptability; a sketch of the kind of cross-modal model evaluated is given below. Additionally, we assess how effectively such models adapt to and learn from multiple datasets.
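A minimal sketch of such a vision-plus-robot-state force predictor, assuming a PyTorch implementation; the input shapes, layer sizes, state dimension and 3-axis force output are illustrative assumptions, not the architectures actually used in the project.

```python
# Hypothetical cross-modal force predictor (PyTorch assumed).
# Shapes and layer sizes are illustrative only.
import torch
import torch.nn as nn

class VisionStateForceNet(nn.Module):
    def __init__(self, state_dim=7, force_dim=3):
        super().__init__()
        # Small CNN encoder for a laparoscopic camera frame.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP encoder for the robot state (e.g. joint positions / tool pose).
        self.state = nn.Sequential(nn.Linear(state_dim, 32), nn.ReLU())
        # Fused head regressing the interaction force.
        self.head = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(),
            nn.Linear(64, force_dim),
        )

    def forward(self, image, state):
        z = torch.cat([self.vision(image), self.state(state)], dim=1)
        return self.head(z)

# Example forward pass on dummy data.
model = VisionStateForceNet()
force = model(torch.randn(1, 3, 128, 128), torch.randn(1, 7))  # -> shape (1, 3)
```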
In the next part of the project, we use generative deep learning techniques to model soft-tissue dynamics from purely visual information. The motion is approximated using modal coordinate analysis, which is common in Finite Element Method simulations. We work with the mathematical representation of the sinusoidal solutions in the frequency domain, obtained via the Fast Fourier Transform, so that the components can be discretised and only a reasonable set of solutions retained.
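As an illustration of this frequency-domain step, the following sketch (NumPy assumed; the displacement signal, frame rate and number of retained modes are made-up values) keeps only the dominant sinusoidal components of a tracked surface-point trajectory.

```python
# Illustrative sketch: retain only the K dominant frequency components of a
# tracked tissue displacement signal via the FFT. Signal, sampling rate and
# K are hypothetical; the project's actual modal analysis may differ.
import numpy as np

fs = 30.0                                     # assumed camera frame rate (Hz)
t = np.arange(0, 4.0, 1.0 / fs)
x = 0.8 * np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.sin(2 * np.pi * 4.0 * t)
x += 0.05 * np.random.randn(t.size)           # synthetic displacement + noise

X = np.fft.rfft(x)                            # frequency-domain representation
freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)

K = 3                                         # keep only K dominant modes
keep = np.argsort(np.abs(X))[-K:]
X_reduced = np.zeros_like(X)
X_reduced[keep] = X[keep]

x_approx = np.fft.irfft(X_reduced, n=x.size)  # low-order modal approximation
print(freqs[keep], np.max(np.abs(x - x_approx)))
```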
Organisations
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name |
---|---|---|---|---|---|
EP/V519546/1 | | | 30/09/2020 | 13/03/2026 | |
2446549 | Studentship | EP/V519546/1 | 30/09/2020 | 29/09/2024 | Mikel De Iturrate Reyzabal |
Description | HK MedTech Hub launch |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Supporters |
Results and Impact | A group of 100 people, formed of relevant industry and business figures, attended the presentation of the collaboration between King's and CAIR Hong Kong. During this visit, we presented the work carried out over recent years at both research institutes, followed by a discussion about the future of healthcare technologies. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.kcl.ac.uk/news/kings-new-partnership-seeks-to-accelerate-medical-device-research |
Description | New Scientist Live |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | During this activity the university showcased the research carried out in our group. More than 1000 people passed through our exhibition over the course of a weekend. |
Year(s) Of Engagement Activity | 2022 |
URL | https://live.newscientist.com/exhibitors/kings-college-london |
Description | Surgical & Interventional Engineering lab visit |
Form Of Engagement Activity | Participation in an open day or visit at my research institution |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Policymakers/politicians |
Results and Impact | 60 people, including politicians, practitioners and school board members, attended the opening of the new lab at St Thomas' Hospital. |
Year(s) Of Engagement Activity | 2021 |