Embedded vision systems - a platform for integrating modern imaging sensors and real-time image processing
Lead Research Organisation:
University of Manchester
Department Name: Electrical and Electronic Engineering
Abstract
Megapixel image sensors are commonplace in modern society and are found in devices such as mobile phones, game consoles, and still and video cameras. The fierce competition between sensor manufacturers operating in this large market segment has driven down costs while enabling and accelerating the development of sensors with novel capabilities. This revolution in sensor technology has made very large arrays with wavelength-limited pixel sizes available in consumer products. At the same time, problematic issues such as low sensitivity and high noise levels have been addressed so successfully that these sensors have become of interest to the more demanding scientific and industrial community for a very diverse range of applications. The resulting high signal-to-noise ratio and high frame-rate capability of emerging sensors enable important imaging tasks that require high dynamic range (HDR). Another exciting development is the addition of a further dimension to 2D imaging by measuring, pixel by pixel, the 'colour' (spectrum) of the incident light. A non-destructive readout of the sensor during exposure allows very large gains in dynamic range, albeit at the expense of having to read out the sensor multiple times for a single HDR image.
The combination of 3D, high resolution, high dynamic range and high frame rate exacerbates an already serious problem in handling and transmitting data, even at a very modest frame rate of 30 frames per second. Many existing professional cameras completely fill the bandwidth offered by the fastest data transfer protocols. Bandwidth issues aside, the task of image processing remains: simply filling large memory buffers with data for offline processing does not suit many applications.
To address this serious issue of data avalanche in our EPSRC-funded work on transparent imaging and diagnostics of highly energetic particle beams, we took inspiration from nature. In human vision, for example, the information from the light-sensitive cells does not go directly to the brain, pixel by pixel; a certain amount of data is already digested in the retina by combining the signals from several visual receptors. A sophisticated front-end image processing approach was the only way to measure the key high-resolution beam characteristics at high frame rates. After three generations of camera designs and ten person-years of effort, we have refined a novel camera architecture that combines coarse- and fine-grained parallel image processing using a heterogeneous hierarchy of image processing elements. With these it becomes routine to transmit only those pixels that vary from frame to frame. We have already implemented various dedicated real-time image processing tasks, such as reducing data flow by passing on only the part of the image that changes between frames in pixel intensity and/or colour, HDR imaging, particle tracking and size measurement, colour interpolation, noise reduction, feature extraction, histogramming, and many others.
The work we propose in this project will allow us to show the market what our systems are capable of and to supply systems to selected groups that express an interest. We will demonstrate how our approach will revolutionize many sophisticated imaging techniques in much the same way as digital still cameras have transformed photography. We will turn our devices into 'plug and play' camera systems that can readily be used by others.
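To make the data-reduction principle concrete, the following is a minimal sketch (in Python with NumPy, purely illustrative and not the project's on-camera firmware; the function name changed_pixels, the threshold value and the frame dimensions are invented for the example) of frame-to-frame change detection, in which only pixels whose intensity changes by more than a noise threshold are passed on for transmission.

import numpy as np

def changed_pixels(previous_frame, current_frame, threshold=8):
    # Return the coordinates and new values of pixels whose intensity changed
    # by more than 'threshold' grey levels between two consecutive frames.
    # The threshold of 8 is an arbitrary example, not a project parameter.
    diff = np.abs(current_frame.astype(np.int32) - previous_frame.astype(np.int32))
    rows, cols = np.nonzero(diff > threshold)
    return rows, cols, current_frame[rows, cols]

# Example: in a 1-Mpixel frame where only a 10 x 10 patch brightens, just
# 100 pixel values (plus their coordinates) need to be transmitted.
prev = np.zeros((1024, 1024), dtype=np.uint16)
curr = prev.copy()
curr[100:110, 200:210] += 50
r, c, v = changed_pixels(prev, curr)
print(len(r), "of", curr.size, "pixels transmitted")

In the camera architecture described above this kind of reduction is performed by the front-end processing elements rather than in software on a host PC, so only the changed pixels ever leave the device.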
This project brings together specialists from academic and commercial backgrounds to prepare our vision systems technology for market and to investigate the best route to commercial exploitation of the created IP and know-how. Together the partners will implement two key applications: spectroscopic imaging, and integration of the device as the brain and eyes of an unmanned vehicle initiative at Manchester. On completion of the project we will be in a strong position to secure support from venture capital or seed funds.
Planned Impact
We and our collaborators are convinced that the proposed innovation has significant commercial potential that would benefit the UK economy. There are several market segments for which this technology is ideal and to which it would bring immediate economic benefit. We highlight a few of these markets below and explain how we think our development can improve key economic factors. The UK has a very strong presence in the camera and machine vision market, supported by two dedicated associations (UKIVA, BMVA) and many dedicated computer vision research groups.
Machine vision: one of the largest and fastest-growing camera markets, where cameras are used to oversee, control and inspect automated production lines. This market segment relies on vision systems that can control machinery to manufacture products fast, 24/7 and without supervision, resulting in high productivity at a cost that is competitive with products from the Far East. Current systems are mostly composed of cameras combined with PCs equipped with expensive image acquisition hardware. A typical production line will have several of these systems, leading to many PC-and-camera installations scattered over the shop floor, each consuming significant amounts of electrical power, space and, above all, cost. Our innovation would reduce both cost and power consumption significantly, giving users an economic advantage.
Photography: professional cameras now feature sensors with up to 50 Mpixel resolution. Such high resolution allows the photographer to zoom in digitally, thereby reducing the number of 'shots' to be taken during expensive photo sessions. Images with high dynamic range are also in high demand; typically a photographer would take many images with varying exposure times and combine them in post-processing. Our technology could ease such procedures, leading to cost savings for our users.
Remote operated equipment: there are several areas of interest where it is undesirable or too expensive to have humans monitor and control equipment. We give a few examples of tasks that once needed humans on site but could now be operated remotely using smart, instrumented cameras of the type described in our case for funding: deep sea exploration, fission/fusion reactor servicing and control, space exploration, military reconnaissance and atmospheric research.
Security: the safety of air travel depends on fast, unobtrusive ways of detecting weapons and explosive materials; many of these systems rely on the detection of specific features in 2D, or sometimes 3D, images.
Human/machine interface: a great deal of current research addresses novel ways for machines to learn through gesture recognition.
Research: the ability to extend microscopic measurements with extra dimensions, such as 3D observation and pixel-by-pixel spectroscopy, constitutes a significant advantage, and many groups are working in this field. Each of these groups would benefit from a flexible platform, such as the one proposed, for the acquisition and processing of the data captured by these new instruments.
People
Roelof Van Silfhout (Principal Investigator)
Publications
Kachatkou A (2013) On the resolution and linearity of lensless in situ X-ray beam diagnostics using pixelated sensors, in Optics Express
Kachatkou A (2013) In situ X-ray beam imaging using an off-axis magnifying coded aperture camera system, in Journal of Synchrotron Radiation
Kachatkou A (2014) In situ micro-focused X-ray beam characterization with a lensless camera using a hybrid pixel detector, in Journal of Synchrotron Radiation
Van Silfhout R (2014) Position and flux stabilization of X-ray beams produced by double-crystal monochromators for EXAFS scans at the titanium K-edge, in Journal of Synchrotron Radiation
Van Silfhout R (2023) A Power Efficient Heterogeneous Embedded Vision System, in Journal of Signal Processing Systems
Description | A novel way of processing highly parallel data streams in real time, with particular application to image processing and understanding. |
Exploitation Route | Direct applications are being marketed through a spin-out company. Further development potential is the subject of ongoing research at Manchester. |
Sectors | Aerospace, Defence and Marine; Digital/Communication/Information Technologies (including Software); Electronics; Healthcare; Security and Diplomacy; Transport |
Description | A spin-out company, PentaBee Ltd, was founded in 2014 with the aim of commercialising the outcome of this grant. |
First Year Of Impact | 2014 |
Sector | Aerospace, Defence and Marine; Construction; Digital/Communication/Information Technologies (including Software); Electronics; Environment; Security and Diplomacy |
Impact Types | Economic |
Description | Diamond Light Source |
Organisation | Diamond Light Source |
Country | United Kingdom |
Sector | Private |
Start Year | 2008 |
Description | Unito |
Organisation | University of Turin |
Country | Italy |
Sector | Academic/University |
PI Contribution | Joint experiments at the ESRF |
Collaborator Contribution | Provision of samples |
Impact | Journal papers |
Start Year | 2012 |
Title | (Instrumentation company) Licence |
Description | Exclusive licence agreement with an instrumentation company that supplies synchrotron radiation sources worldwide with the developed XBI instrument |
IP Reference | |
Protection | Protection not required |
Year Protection Granted | |
Licensed | Yes |
Impact | Employment of staff at licensee, further funding to develop technology for health care purposes |