TASCC: Pervasive low-TeraHz and Video Sensing for Car Autonomy and Driver Assistance (PATH CAD)

Lead Research Organisation: Heriot-Watt University
Department Name: Sch of Engineering and Physical Science

Abstract

This project combines novel low-THz (LTHz) sensor development with advanced video analysis, fusion and cross-learning. Using the two streams integrated within the sensing, information and control systems of a modern automobile, we aim to map terrain and identify hazards such as potholes and surface texture changes in all weathers, and to detect and classify other road users (pedestrians, cars, cyclists, etc.).

The coming era of autonomous and assisted driving necessitates new all-weather sensing technology. Advanced interaction between the sensed and processed data, the control systems and the driver can lead to autonomy in decision and control, while ensuring the driver has all the information needed to intervene in critical situations. The aims are to improve road safety through increased situational awareness, and to increase energy efficiency by reducing the pollutant emissions caused by poor control and resource use in both on- and off-road vehicles.

Video cameras remain at the heart of our system for many reasons: low cost, availability, high resolution, a large legacy of processing algorithms to interpret the data, and driver/passenger familiarity with the output. However, it is widely recognized that video and/or other optical sensors such as LiDAR (cf. the Google car) are not sufficient on their own. The same conditions that challenge human drivers, such as heavy rain, fog, spray, snow and dust, limit the capability of electro-optical sensors. We require a new approach.

The key second sensor modality is a low-THz radar system operating within the 0.3-1 THz frequency spectrum. By its very nature radar is robust to the conditions that limit video. However, it is the relatively short wavelength and wide bandwidth of this LTHz radar with respect to existing automotive radar systems that can bring key additional capabilities. This radar has the potential to provide: (i) imagery that is closer to familiar video than that provided by a conventional radar, and hence can begin to exploit the vast legacy of image processing algorithms; (ii) significantly improved across-road image resolution, leading to correspondingly significant improvements in vehicle, pedestrian and other 'actor' (cyclists, animals etc.) detection and classification; (iii) 3D images that can highlight objects and act as an input to the guidance and control system; (iv) analysis of radar image features, such as shadows and image texture, that will contribute to both classification and control.

The project is a collaboration between three academic institutions: the University of Birmingham, with its long-standing excellence in automotive radar research and radar technologies; the University of Edinburgh, with world-class expertise in signal processing and radar imaging; and Heriot-Watt University, with equivalent skill in video analytics, LiDAR and accelerated algorithms. The novel approach will be based on a fusion of video and radar images in a cross-learning cognitive process to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions without dependency on navigation assisting systems.
 
Description In this work, three academic institutions (the University of Birmingham, with its long-standing excellence in automotive radar research and radar technologies; the University of Edinburgh, with world-class expertise in signal processing and radar imaging; and Heriot-Watt University, with equivalent skill in video analytics, LiDAR and accelerated algorithms) are carrying out fundamental and applied research on novel multifunction, multimodal sensing technology by combining, for the first time, low-THz radar and video imagery for a new generation of automotive sensors intended for both driver assistance and autonomous driving. The approach is based on a fusion of video and radar images in a cross-learning cognitive process to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions without dependency on navigation assisting systems.

First, we resolved many issues associated with calibration and registration across these different sensor modalities. We also conducted (and are continuing) several trials using radar, video and LiDAR data to build datasets for our work on both scene mapping and recognition. We have evaluated a number of current techniques for actor (car, pedestrian, etc.) detection and classification from images, and have evaluated our tracking and optical flow software on video sequences. The trackers ranged from simple application of the traditional Kalman filter through to more complex multi-target tracking algorithms using particle filters and Random Finite Set methods.
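As a simple illustration of the baseline end of that tracking spectrum, the sketch below implements a constant-velocity Kalman filter for a single target observed in 2D. All matrices, noise levels and names here are illustrative, not the project's tuned values.

```python
import numpy as np

class KalmanCV:
    """Minimal constant-velocity Kalman filter; state is [x, y, vx, vy]."""

    def __init__(self, dt=0.1):
        # Constant-velocity motion model
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], float)
        # We observe position only, not velocity
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)
        self.Q = np.eye(4) * 0.01   # process noise (illustrative)
        self.R = np.eye(2) * 0.5    # measurement noise (illustrative)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 10.0   # large initial uncertainty

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with position measurement z = [x, y]
        y = np.asarray(z, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x
```

Feeding in measurements of a vehicle moving at constant speed, the estimated position and velocity converge on the true motion; the multi-target particle-filter and Random Finite Set trackers mentioned above generalise this to unknown, varying numbers of actors with measurement-to-track ambiguity.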

At Heriot-Watt we have concentrated on three principal areas.

Image Fusion: radar, LiDAR and optical stereo sensing have complementary strengths and weaknesses. For example, radar has low resolution but can see very well in adverse conditions such as fog, snow and mist; LiDAR has very high resolution, but because it uses laser light it is severely degraded by adverse weather. The fusion work tries to combine information from all sensors and adapt as image quality degrades. So far we have found that Random Finite Set algorithms can be applied successfully to update other vehicles' positions using data from all sensors adaptively and iteratively, but this does not work well for dense scene mapping. We have therefore turned recently to dense regularization techniques that combine 'point clouds' from stereo, LiDAR and radar to produce dense scene maps.
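The adaptive-weighting idea behind that fusion can be sketched as follows: point clouds from each sensor are accumulated into a common grid, with each sensor's contribution scaled by a confidence that would be lowered when its modality is degraded (e.g. down-weighting LiDAR in fog). The weights, grid parameters and function name are hypothetical, not the project's regularization scheme.

```python
import numpy as np

def fuse_point_clouds(clouds, weights, grid_size=100, cell=0.5):
    """Fuse 2D point clouds into one evidence grid.

    clouds:  dict sensor -> (N, 2) array of x, y points in metres
    weights: dict sensor -> confidence in [0, 1]
    """
    grid = np.zeros((grid_size, grid_size))
    for sensor, pts in clouds.items():
        w = weights[sensor]
        # Map metric coordinates to grid cells, origin at the grid centre
        idx = np.floor(pts / cell).astype(int) + grid_size // 2
        ok = (idx >= 0).all(axis=1) & (idx < grid_size).all(axis=1)
        for i, j in idx[ok]:
            grid[i, j] += w               # accumulate weighted evidence
    # Normalise by the total available confidence
    return grid / max(sum(weights.values()), 1e-9)
```

A cell seen by several sensors accumulates more evidence than one seen only by a down-weighted sensor, which is the behaviour the adaptive fusion relies on.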

Object Recognition: Deep Convolutional Neural Networks (DCNNs) have been applied successfully to large-scale natural image data (although there are some caveats about their use). In this project we have used DCNNs to recognize objects in low-THz radar imagery with mixed success. Success rates are very high in a laboratory environment, and we have also been able to generalize over different ranges, receivers and viewing angles using data augmentation and transfer learning. However, further work is required to extend these techniques to the real road environment.
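A minimal sketch of the kind of data augmentation that helps a classifier generalize over viewing angles is given below: each training image is expanded into randomly rotated and flipped copies before being fed to the network. The specific transforms and parameters are illustrative assumptions, not the project's exact augmentation pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img):
    """Return one randomly rotated/flipped copy of a 2D radar image."""
    # Random 90-degree rotation: top-down radar images have no canonical 'up'
    img = np.rot90(img, k=int(rng.integers(0, 4)))
    # Random horizontal flip
    if rng.random() < 0.5:
        img = np.fliplr(img)
    return img.copy()

def augmented_batch(img, n=8):
    """Expand a single training image into n augmented copies."""
    return np.stack([augment(img) for _ in range(n)])
```

Each augmented copy contains exactly the same pixel values in a new orientation, so the classifier is pushed to learn orientation-invariant features rather than memorising a single viewing angle.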

Behaviour prediction: for autonomous cars to function, they must be able to predict the future behaviour of other vehicles in the road network. In the last year we have started to develop prediction strategies based on sensor data, using knowledge of highway rules and statistical and worst-case trajectories for other vehicles, in order to steer a safe path.
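The worst-case element of such a strategy can be sketched very simply: if another vehicle's speed is bounded by v_max, then after t seconds it can in the worst case be anywhere within a disc of radius v_max * t around its current position, and a planned ego waypoint is only safe if it stays outside that disc by some margin. The function name and margin below are hypothetical, not the project's planner.

```python
import math

def waypoint_is_safe(ego_xy, other_xy, v_max, t, margin=2.0):
    """Worst-case reachability check for one other vehicle.

    ego_xy:   planned ego position (x, y) in metres at time t
    other_xy: other vehicle's current position (x, y) in metres
    v_max:    bound on the other vehicle's speed, m/s
    t:        prediction horizon, seconds
    margin:   extra safety clearance, metres (illustrative)
    """
    dist = math.hypot(ego_xy[0] - other_xy[0], ego_xy[1] - other_xy[1])
    # Safe only if outside the worst-case reachable disc plus the margin
    return dist > v_max * t + margin
```

Statistical trajectory predictions (e.g. lane-following priors from highway rules) then tighten this very conservative disc to the manoeuvres the other vehicle is actually likely to make.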
Exploitation Route Driver assistance or full vehicle autonomy depends on fast and accurate interpretation of sensor data. If successful, the project outcomes should be taken forward by our partners, Jaguar Land Rover, in developing premium cars. However, results should also be more widely available through publication and other dissemination.

To date, we have made significant progress on:

Vehicle detection and classification from radar images;
Vehicle behaviour prediction, based on their detection and classification;
Data fusion, using radar, LiDAR and stereo sensors.

These outcomes are valuable for future work on driver assistance and vehicle autonomy in bad weather.
Several papers have been published and more are in progress.

As this was jointly funded by Jaguar Land Rover, they have had access to all our work.

The work has also been linked to work with STMicroelectronics on fast-scanning LiDAR for automotive applications, and has resulted in a new automotive dataset, RADIATE.
Sectors Environment, Transport

URL https://sites.google.com/a/jaguarlandrover.com/tascc
 
Description In addition to published work, extended deliverables have been made available to Jaguar Land Rover.
First Year Of Impact 2020
Sector Transport
 
Description Presentation to EPSRC Theme Meeting on Robotics and Autonomous Systems (January)
Geographic Reach Europe 
Policy Influence Type Gave evidence to a government review
URL https://www.epsrc.ac.uk/newsevents/pubs/epsrc-delivery-plan-2016-17-2019-20/
 
Title Volkswagen Transporter Mobile Laboratory 
Description We have equipped an existing Robotarium VW Transporter van owned by Heriot-Watt University with a Navtech radar, a TI radar, a ZED stereo camera, a Velodyne LiDAR, GPS and IMU equipment. This enables us to collect fully synchronized and time-stamped sensor data over long periods (several hours) when driving the van on normal urban roads, rural roads, dual carriageways and motorways. We are using this facility to create substantial datasets for our work on multi-modal fusion, object recognition and behaviour prediction from a mobile platform. To date, it has been used only within Heriot-Watt University by the TASCC project, but it is available to the collaborating institutions on TASCC and to other researchers at HWU, specifically in the EPSRC-funded Robotics CDT. 
Type Of Material Improvements to research infrastructure 
Year Produced 2019 
Provided To Others? No  
Impact As it has only been available since December 2018, we have collected considerable data but do not yet have substantial processing results. 
 
Title RADIATE: A Radar Dataset for Automotive Perception. 
Description Datasets for autonomous cars are essential for the development and benchmarking of perception systems. However, most existing datasets are captured with camera and LiDAR sensors in good weather conditions. We present the RAdar Dataset In Adverse weaThEr (RADIATE), aiming to facilitate research on object detection, tracking and scene understanding using radar sensing for safe autonomous driving. RADIATE includes 3 hours of annotated radar images with more than 200K labelled road actors in total, on average about 4.6 instances per radar image. It covers 8 different categories of actor in a variety of weather conditions (e.g., sun, night, rain, fog and snow) and driving scenarios (e.g., parked, urban, motorway and suburban), representing different levels of challenge. To the best of our knowledge, this is the first public radar dataset which provides high-resolution radar images on public roads with a large number of road actors labelled. The data collected in adverse weather, e.g., fog and snowfall, is unique. Some baseline results of radar-based object detection and recognition are given to show that the use of radar data is promising for automotive applications in bad weather, where vision and LiDAR can fail. RADIATE also has stereo images, 32-channel LiDAR and GPS data, supporting other applications such as sensor fusion, localisation and mapping. The dataset is publicly available online. 
Type Of Material Data handling & control 
Year Produced 2020 
Provided To Others? Yes  
Impact We have used this dataset for research, and are preparing papers for publication on radar-LiDAR fusion and automotive behaviour prediction from the radar data. The dataset has been downloaded more than 100 times. 
URL https://pro.hw.ac.uk/radiate/
 
Description Development of Algorithms for Optical/Radar Data Fusion and Radar Image Interpretation for autonomous vehicles 
Organisation Jaguar Land Rover Automotive PLC
Department Jaguar Land Rover
Country United Kingdom 
Sector Private 
PI Contribution Autonomous cars or driver assistance systems require robust sensing to cope with adverse weather conditions such as fog, heavy rain or snow. We have been developing algorithms and systems to sense in bad weather, using automotive radar, and to combine radar and optical image interpretation algorithms by sensor fusion. We have equipped a van with a stereo camera, LiDAR and 79GHz radar and have collected extensive data for our research on scene mapping, actor recognition and behaviour prediction for traffic participants.
Collaborator Contribution Birmingham University have been building novel 300GHz automotive radar systems. Edinburgh University have been developing new ways to acquire radar data at high resolution. Jaguar Land Rover have provided a test vehicle for our work.
Impact A number of publications have arisen, with at least one more in preparation. These are listed on the ResearchFish website.
Start Year 2015
 
Description Development of Algorithms for Optical/Radar Data Fusion and Radar Image Interpretation for autonomous vehicles 
Organisation University of Birmingham
Country United Kingdom 
Sector Academic/University 
PI Contribution Autonomous cars or driver assistance systems require robust sensing to cope with adverse weather conditions such as fog, heavy rain or snow. We have been developing algorithms and systems to sense in bad weather, using automotive radar, and to combine radar and optical image interpretation algorithms by sensor fusion. We have equipped a van with a stereo camera, LiDAR and 79GHz radar and have collected extensive data for our research on scene mapping, actor recognition and behaviour prediction for traffic participants.
Collaborator Contribution Birmingham University have been building novel 300GHz automotive radar systems. Edinburgh University have been developing new ways to acquire radar data at high resolution. Jaguar Land Rover have provided a test vehicle for our work.
Impact A number of publications have arisen, with at least one more in preparation. These are listed on the ResearchFish website.
Start Year 2015
 
Description Development of Algorithms for Optical/Radar Data Fusion and Radar Image Interpretation for autonomous vehicles 
Organisation University of Edinburgh
Country United Kingdom 
Sector Academic/University 
PI Contribution Autonomous cars or driver assistance systems require robust sensing to cope with adverse weather conditions such as fog, heavy rain or snow. We have been developing algorithms and systems to sense in bad weather, using automotive radar, and to combine radar and optical image interpretation algorithms by sensor fusion. We have equipped a van with a stereo camera, LiDAR and 79GHz radar and have collected extensive data for our research on scene mapping, actor recognition and behaviour prediction for traffic participants.
Collaborator Contribution Birmingham University have been building novel 300GHz automotive radar systems. Edinburgh University have been developing new ways to acquire radar data at high resolution. Jaguar Land Rover have provided a test vehicle for our work.
Impact A number of publications have arisen, with at least one more in preparation. These are listed on the ResearchFish website.
Start Year 2015
 
Description Deep Dive 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact This consisted of a series of oral presentations and poster displays to an audience of Jaguar Land Rover personnel and other university researchers on the TASCC programme. The intention was to go into considerable technical detail about the several topics covered, including low-THz radar design, Doppler beam sharpening, height imaging using radar, radar-optical fusion, and recognition of actors in radar and other image data. Of these activities, personnel from HWU gave talks on fusion and object recognition.
Year(s) Of Engagement Activity 2018
 
Description Sharefair Exhibition at Motor Museum 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Approximately 150 participants from Jaguar Land Rover (JLR), a number of UK universities and a few media personnel attended an event to showcase the outcomes of the EPSRC/TASCC programme. The main purpose was to showcase the research outcomes as a technology transfer activity within the joint EPSRC/JLR programme. Several members of JLR staff expressed interest in our work, and as a result we have agreed a number of other collaborative activities, including investigation of flood sensing for cars, and methods of approximate computing to save power in future electric cars.
Year(s) Of Engagement Activity 2017
 
Description Visual Saliency and Autonomous Vehicles Workshop 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact A workshop was held in March 2016 as part of the EPSRC Network for Biological and Computer Vision. The title of the workshop was Vision in Human and Machine. I gave a talk on Visual Saliency and Autonomous Vehicles.
Year(s) Of Engagement Activity 2016
URL http://www.viihm.org.uk