Event-Camera, Simultaneous Localisation and Mapping for Autonomous Systems and Mobile Devices

Lead Participant: SLAMCORE LIMITED

Abstract

Robotics and autonomous systems use sensors to understand where they are in the real world. The information from these sensors feeds into a model of the environment which the system must build and locate itself within simultaneously. This challenge is known as Simultaneous Localisation and Mapping (SLAM), and it is so fundamental that every truly autonomous product will require a dedicated SLAM system. To date, accurate real-time SLAM systems have required expensive sensors combined with high-power (both computational and battery) hardware. A novel sensor called the Event Camera, developed nearly a decade ago, has the potential to replace existing SLAM approaches, but the underlying algorithms have proven very difficult to design due to the Event Camera's fundamental architecture. It works like the human eye: each pixel responds only when it detects a change in intensity. As a result, the amount of information it generates is multiple orders of magnitude smaller, but structured in a completely different way. The Slamcore team are pioneers of SLAM solutions and have recently demonstrated the first real-time Event-Camera SLAM system. This project will develop Slamcore's technology further to demonstrate how Event-Camera SLAM will enable a revolution in autonomous products.
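The per-pixel change detection described above can be illustrated with a minimal sketch, assuming the common log-intensity threshold model of an event camera (the function name, threshold value, and event tuple layout here are illustrative, not Slamcore's implementation):

```python
import numpy as np

def generate_events(prev_log_I, curr_log_I, threshold=0.2):
    """Emit (x, y, polarity) events wherever the change in
    log-intensity exceeds the contrast threshold, mimicking
    an event camera's asynchronous per-pixel response."""
    diff = curr_log_I - prev_log_I
    # Only pixels whose intensity changed enough fire an event.
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    # Polarity: +1 for brightness increase (ON), -1 for decrease (OFF).
    polarities = np.sign(diff[ys, xs]).astype(int)
    return list(zip(xs.tolist(), ys.tolist(), polarities.tolist()))

# Toy example: a 4x4 scene where only two pixels change.
prev = np.log(np.full((4, 4), 100.0))
curr = prev.copy()
curr[1, 2] += 0.5   # brightness increase -> ON event
curr[3, 0] -= 0.5   # brightness decrease -> OFF event
events = generate_events(prev, curr)
# Most pixels produce no output at all, which is why event
# data is orders of magnitude sparser than full video frames.
```

This sparsity is exactly the property that makes event data both attractive for low-power SLAM and awkward for conventional frame-based algorithms.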

Lead Participant    Project Cost    Grant Offer

SLAMCORE LIMITED    £304,595        £213,216

Participants

TUV SUD LIMITED
INNOVATE UK
