Gentler Imaging
Lead Research Organisation: University of Sheffield
Department Name: Physics and Astronomy
Abstract
Imaging living cells using optical microscopy is challenging. The light used in optical microscopy often damages the biological system under investigation, and the better the resolution we require, the more light is needed. This makes studying the important living processes that occur both rapidly and at the size limit of what can be observed with a light microscope almost impossible without the development of new, innovative solutions.
Often a single microscope can perform only a single imaging modality, so a researcher wanting to image at a number of different resolutions and light intensities would have to move the sample between microscopes. We propose a versatile imaging platform that can switch between imaging modalities. This is useful because it allows us to image a biological system with minimal light until the event we are interested in occurs. When something interesting happens, we want to switch automatically from the low-resolution, low-damage mode to a high-resolution, higher-photon-flux mode, specifically in the area where the interesting event is occurring. To automate this process we will train a neural network to study the low-resolution images as they are produced and, under specific conditions, swap to the high-resolution mode. This will dramatically reduce the amount of photodamage, allowing the biological event to be observed in much greater detail than previously possible.
The example we will use as a proof of concept is the engulfment and killing of microbes by amoebae. This happens in much the same way that our immune cells clear the body of pathogens to protect us from infection. The amoeba cells provide a convenient model for understanding this, as well as a simple test system for microscope development. The engulfment process, known as phagocytosis, occurs when amoebae or immune cells touch their prey and enwrap them. This will act as the trigger that prompts the imaging platform to switch to the higher-resolution modality, specifically around the microbe being eaten. This will prevent photodamage to the sample prior to engulfment, and allow us to focus the precious light precisely when and where it is needed to understand how engulfment and killing occur. This spatially and temporally selective high-resolution imaging will then be combined with novel camera technology to ensure that each pixel in the image is captured at the optimal signal-to-noise ratio, so that the maximum information is obtained from each experiment. Combined, these technologies will dramatically improve our ability to observe a host of important biological events by overcoming the main limitation of current microscopy.
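As a rough illustration of the control logic described above, the sketch below shows how a trained classifier watching the low-resolution stream could trigger a localised switch to a high-resolution acquisition. It is a minimal sketch only: the class, method, and mode names used here (MockMicroscope, acquire, locate_event, "low_res_phase", "high_res_sim") are hypothetical placeholders, not the platform's actual interface.

```python
# Illustrative sketch only: all names below are hypothetical placeholders,
# not the imaging platform's actual API.
import numpy as np


class MockMicroscope:
    """Stand-in for the programmable imaging platform's control layer."""

    def acquire(self, mode: str, roi=None) -> np.ndarray:
        # A real system would drive the hardware here; return synthetic data instead.
        return np.random.rand(64, 64)

    def locate_event(self, frame: np.ndarray):
        # Crude placeholder: a small bounding box around the brightest pixel.
        y, x = np.unravel_index(np.argmax(frame), frame.shape)
        return (max(y - 8, 0), max(x - 8, 0), y + 8, x + 8)


def probability_of_event(frame: np.ndarray) -> float:
    """Placeholder for the trained neural network's event score on a low-res frame."""
    return float(frame.max())  # a real model would run inference here


def imaging_loop(scope: MockMicroscope, n_frames: int = 20, threshold: float = 0.9999):
    """Image gently by default; switch a region to high resolution only when an event is seen."""
    for _ in range(n_frames):
        frame = scope.acquire(mode="low_res_phase")          # minimal photon dose
        if probability_of_event(frame) >= threshold:
            roi = scope.locate_event(frame)                  # region around the detected event
            scope.acquire(mode="high_res_sim", roi=roi)      # higher dose, only where needed


if __name__ == "__main__":
    imaging_loop(MockMicroscope())
```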
Technical Summary
Using a programmable array microscope as the base of an imaging platform, we will train a two-stage neural network to image a sample with a variety of modalities, choosing each modality by weighing the resolution required at that point in time against the increased number of photons needed to achieve that resolution. The low-damage, low-resolution mode will be based on phase imaging, with confocal and structured illumination available for the higher-resolution modes. The microscope will deploy different modalities across a single field of view, so that an amoeba in the proof-of-concept sample will only be imaged at high resolution if it is undergoing phagocytosis; amoebae not undergoing phagocytosis will continue to be imaged at low resolution. The microscope will be coupled with a non-destructive readout camera, which can be used to produce event-based images in which each pixel can have its own frame rate, so that every pixel is captured at the highest possible signal-to-noise ratio. The final outcome of this work will be a low-resolution image stream of amoebae, peppered with high-resolution regions wherever the sample under investigation is undergoing phagocytosis.
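One way to picture the per-pixel frame-rate idea is sketched below, under the assumption that the camera delivers a stack of cumulative non-destructive reads: each pixel is "closed" as soon as its estimated signal-to-noise ratio reaches a target, so bright pixels get short effective exposures and dim pixels integrate for longer. The function name and noise model are illustrative assumptions, not the actual camera pipeline.

```python
# Illustrative sketch under assumptions (not the actual camera pipeline): given a stack
# of cumulative non-destructive reads, close each pixel once its estimated SNR hits a target.
import numpy as np


def per_pixel_exposure(reads: np.ndarray, target_snr: float = 10.0, read_noise: float = 2.0):
    """reads: cumulative non-destructive reads, shape (T, H, W), in detected photons.

    Returns the per-pixel signal estimate and the number of reads each pixel needed,
    i.e. an image in which every pixel effectively has its own frame rate."""
    n_reads, h, w = reads.shape
    signal = np.zeros((h, w))
    reads_used = np.full((h, w), n_reads)
    closed = np.zeros((h, w), dtype=bool)

    for t in range(n_reads):
        cumulative = reads[t]
        # Shot-noise-limited SNR estimate for a Poisson signal plus Gaussian read noise.
        snr = cumulative / np.sqrt(np.maximum(cumulative, 1.0) + read_noise ** 2)
        newly_closed = (snr >= target_snr) & ~closed
        signal[newly_closed] = cumulative[newly_closed]
        reads_used[newly_closed] = t + 1
        closed |= newly_closed

    signal[~closed] = reads[-1][~closed]  # dim pixels simply use the full exposure
    return signal, reads_used


if __name__ == "__main__":
    # Synthetic example: brighter pixels reach the target SNR after fewer reads.
    rng = np.random.default_rng(0)
    rates = rng.uniform(1.0, 50.0, size=(32, 32))                      # photons per read
    reads = np.cumsum(rng.poisson(rates, size=(40, 32, 32)), axis=0)   # cumulative signal
    image, n_reads = per_pixel_exposure(reads.astype(float))
    print("median reads needed per pixel:", np.median(n_reads))
```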
Publications
Kay RR (2024) Making cups and rings: the 'stalled-wave' model for macropinocytosis. Biochemical Society Transactions.
| Description | We have developed an imaging system which can, through the use of AI, change how it images. This means that we can look for rare events in biology without damaging the biological system, and collect high-resolution data only when a specific, rare event occurs. This is useful in studying a wide range of biological events that are important to understanding health. |
| Exploitation Route | This work could be deployed on any microscopy system to reduce photodamage while imaging, and therefore allow for longer imaging times at higher optical resolution. |
| Sectors | Healthcare; Manufacturing, including Industrial Biotechnology |
| Title | Machine-learning system for real-time morphology analysis of amoebas |
| Description | This is a machine-learning pipeline which allows researchers to automate the process of cell segmentation. The key achievement is that the system runs in real time, so the health of the cells can also be estimated in real time; a generic sketch of this kind of pipeline is shown after this entry. |
| Type Of Material | Technology assay or reagent |
| Year Produced | 2024 |
| Provided To Others? | Yes |
| Impact | The code is being used in a number of publications currently in preparation and is being actively used by several other research groups. Once the publications are accepted, more groups are expected to use the pipeline. |
| URL | https://github.com/CadbyLab |
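For illustration only, the following is a generic sketch of the kind of per-frame segmentation and morphology measurement such a pipeline performs, written here with scikit-image. It is not the API of the code at https://github.com/CadbyLab, and the threshold method and shape metrics are assumptions chosen for the example.

```python
# Generic, hypothetical sketch of per-frame cell segmentation and morphology measurement.
# Not the API of the CadbyLab code; the metric choices below are illustrative assumptions.
import numpy as np
from skimage import filters, measure, morphology


def analyse_frame(frame: np.ndarray, min_area: int = 50):
    """Segment cells in one frame and return simple per-cell morphology measurements."""
    threshold = filters.threshold_otsu(frame)
    mask = morphology.remove_small_objects(frame > threshold, min_size=min_area)
    labels = measure.label(mask)
    cells = []
    for region in measure.regionprops(labels):
        cells.append({
            "centroid": region.centroid,
            "area": region.area,
            "eccentricity": region.eccentricity,  # rough shape proxy for cell state
            "circularity": 4 * np.pi * region.area / max(region.perimeter, 1) ** 2,
        })
    return cells


if __name__ == "__main__":
    # Synthetic frame with one bright "cell" on a noisy background.
    rng = np.random.default_rng(1)
    synthetic = rng.normal(0.1, 0.02, (128, 128))
    synthetic[40:60, 40:70] += 0.5
    print(analyse_frame(synthetic))
```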
| Title | Cell tracking |
| Description | The cell-tracking software can track, in real time, host cells that are sensitive to photodamage. It can estimate the health of each cell and report it back. It can also be used to trigger a change in imaging modality in response to a specific event within the cell, such as a morphological or behavioural change. |
| Type Of Technology | Software |
| Year Produced | 2024 |
| Open Source License? | Yes |
| Impact | This has made the process of tracking and measuring morphology in light-sensitive cells much easier and more automated. It has found application in a number of other research groups. |
| URL | https://github.com/CadbyLab/Amoeba-tracking |
