CORTEX - Cognitive Real-Time Sensing System for Autonomous Vehicles
Lead Participant:
JAGUAR LAND ROVER LIMITED
Abstract
In 2016, many people were predicting fully autonomous cars within five years. After initial rapid progress, development has slowed significantly, and the market is now adjusting to this slower pace. This has been seen in more measured predictions from manufacturers and experts, with some companies rolling back autonomy plans and projects. Whilst machine learning and computer vision have enabled vehicles to reach near autonomy in controlled conditions, it is real-world driving in all weather conditions that is proving difficult to achieve.
Although there have been demonstrations of autonomy using a combination of fully mapped environments and data from large numbers of sensors, the environments in which the demonstration vehicles operate are highly constrained and therefore do not represent some of the key environmental challenges that these vehicles will encounter in real-life driving scenarios.
Several challenges have been highlighted in this CCAV call to facilitate full autonomy. However, challenges on "autonomous vehicle features that provide real-world benefits to users to work as part of a wider transport system" cannot succeed if the system cannot robustly understand the environment in which the vehicle will drive, e.g. different weather conditions (sunny, dark, raining, foggy or snowing) or different road surfaces across the world. The project addresses the development of key enabling technologies and sensing techniques for autonomous driving in all-road and all-weather driving conditions.
Radar systems remain relatively unaffected by adverse environmental conditions (dirt, rain, ice, snow, fog, etc.). However, radar does not have the same level of resolution as a camera, and it is often not possible to classify an object based solely on its radar return signature. This is changing as manufacturers add more transmit and receive elements to their radar boards, improving accuracy and resolution to create an 'imaging radar'. CORTEX will use advanced radar processing and beamforming techniques to improve resolution and reduce known issues, e.g. the high side lobes of MIMO arrays. The objects in the resulting radar images will then be classified, and the complete scene will be segmented into different classes. Unlike a camera's output, the resulting identification and labelling of objects will not be affected by weather.
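The trade-off mentioned above, between angular resolution and side-lobe level, can be illustrated with a minimal sketch (not project code; the array geometry, target angle and taper choice here are illustrative assumptions): conventional beamforming on a MIMO virtual uniform linear array, comparing uniform weights against an amplitude taper that suppresses side lobes at the cost of a wider main lobe.

```python
import numpy as np

# Illustrative sketch: with Nt transmitters and Nr receivers, a MIMO radar
# forms a virtual array of Nt*Nr elements (here at half-wavelength spacing).
Nt, Nr = 3, 4
N = Nt * Nr                      # virtual array size
d = 0.5                          # element spacing in wavelengths
angles = np.radians(np.linspace(-90, 90, 721))

def steering(theta):
    """Steering vector of the virtual uniform linear array."""
    n = np.arange(N)
    return np.exp(2j * np.pi * d * n * np.sin(theta))

# Noise-free snapshot from a single target at 20 degrees (assumed scenario)
snapshot = steering(np.radians(20.0))

# Conventional (delay-and-sum) spectrum: uniform weights vs. Hamming taper.
# Tapering lowers side lobes but broadens the main lobe.
w_uniform = np.ones(N) / N
w_hamming = np.hamming(N)
w_hamming /= w_hamming.sum()

p_uniform = np.array([abs(np.vdot(w_uniform * steering(a), snapshot)) for a in angles])
p_hamming = np.array([abs(np.vdot(w_hamming * steering(a), snapshot)) for a in angles])

peak = np.degrees(angles[np.argmax(p_uniform)])
print(f"estimated target angle: {peak:.1f} deg")
```

The same steering-vector machinery extends to 2-D virtual arrays and to adaptive weights; the point of the sketch is only that weight design directly controls the side-lobe behaviour the project aims to mitigate.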
Autonomous driving based on machine learning, using convolutional neural network (CNN) algorithms on camera images, is common. There is, however, a new class of 'Transformer' algorithms, previously applied to speech and simple waveform data, that is now being applied to images. This project will explore Transformer algorithms and their use in object identification and image segmentation.
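The core idea behind applying Transformers to images can be sketched in a few lines (an illustrative toy, not the project's implementation; image size, patch size and dimensions are assumptions): the image is split into patches, each patch becomes a token, and self-attention lets every patch attend to every other, giving a global receptive field that a shallow CNN lacks.

```python
import numpy as np

# Illustrative sketch of two Vision-Transformer building blocks:
# patch embedding and a single self-attention pass.
rng = np.random.default_rng(0)

img = rng.standard_normal((32, 32, 3))    # toy image, H x W x C
P = 8                                     # patch size -> 4x4 = 16 patches
patches = img.reshape(4, P, 4, P, 3).swapaxes(1, 2).reshape(16, P * P * 3)

D = 64                                    # embedding dimension
W_embed = rng.standard_normal((P * P * 3, D)) * 0.02
tokens = patches @ W_embed                # 16 patch tokens, each D-dim

# Single-head self-attention: scores between all pairs of patch tokens,
# softmax-normalised, then used to mix the value vectors.
Wq, Wk, Wv = (rng.standard_normal((D, D)) * 0.02 for _ in range(3))
Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
scores = Q @ K.T / np.sqrt(D)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
out = attn @ V                            # attention-mixed patch tokens

print(tokens.shape, attn.shape, out.shape)
```

A full Vision Transformer stacks many such attention layers with positional embeddings and feed-forward blocks; for segmentation, per-patch outputs are decoded back to pixel labels.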
The project will take real-world driving data from both on-road and off-road environments and compare the classifications obtained from radar and camera images.
Lead Participant | Project Cost | Grant Offer
---|---|---
JAGUAR LAND ROVER LIMITED | £428,790 | £214,395

Participant | Project Cost | Grant Offer
---|---|---
WHITE HORSE RADAR LIMITED | |
MYRTLE SOFTWARE LIMITED | £909,728 | £636,810
UNIVERSITY OF BIRMINGHAM | £1,152,530 | £1,152,530
INNOVATE UK | |

People | ORCID iD
---|---
Stephen Skinner (Project Manager) |