TASCC: Pervasive low-TeraHz and Video Sensing for Car Autonomy and Driver Assistance (PATH CAD)

Lead Research Organisation: Heriot-Watt University
Department Name: Sch of Engineering and Physical Science

Abstract

This project combines novel low-THz (LTHz) sensor development with advanced video analysis, fusion and cross-learning. Using the two streams integrated within the sensing, information and control systems of a modern automobile, we aim to map terrain and identify hazards such as potholes and surface texture changes in all weathers, and to detect and classify other road users (pedestrians, cars, cyclists, etc.).

The coming era of autonomous and assisted driving necessitates new all-weather sensing technology. Advanced concepts of interaction between the sensed and processed data, the control systems and the driver can lead to autonomy in decision and control, while securing all the information the driver needs to intervene in critical situations. The aims are to improve road safety through increased situational awareness, and to increase energy efficiency by reducing the emission of pollutants caused by poor control and resource use in both on- and off-road vehicles.

Video cameras remain at the heart of our system for many reasons: low cost, availability, high resolution, a large legacy of processing algorithms to interpret the data, and driver/passenger familiarity with the output. However, it is widely recognised that video and other optical sensors such as LiDAR (cf. the Google car) are not sufficient. The same conditions that challenge human drivers, such as heavy rain, fog, spray, snow and dust, limit the capability of electro-optical sensors. We require a new approach.

The key second sensor modality is a low-THz radar system operating within the 0.3-1 THz band. By its very nature, radar is robust to the conditions that limit video. However, it is the relatively short wavelength and wide bandwidth of this LTHz radar, compared with existing automotive radar systems, that bring key additional capabilities. This radar has the potential to provide: (i) imagery that is closer to familiar video than that provided by conventional radar, and hence can begin to exploit the vast legacy of image processing algorithms; (ii) significantly improved across-road image resolution, leading to correspondingly significant improvements in the detection and classification of vehicles, pedestrians and other 'actors' (cyclists, animals, etc.); (iii) 3D images that can highlight objects and act as an input to the guidance and control system; (iv) analysis of radar image features, such as shadows and image texture, that will contribute to both classification and control.
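To make the resolution claim concrete, here is a back-of-envelope sketch (Python) of range and cross-range resolution for a real-aperture LTHz radar; the bandwidth, aperture width and operating range are illustrative assumptions, not parameters of the project's sensor.

C = 3e8                 # speed of light, m/s
BANDWIDTH = 20e9        # chirp bandwidth, Hz (assumed)
FREQUENCY = 0.3e12      # carrier at the lower edge of the 0.3-1 THz band
APERTURE = 0.10         # real antenna aperture, m (assumed)
RANGE = 30.0            # operating range, m (assumed)

range_res = C / (2 * BANDWIDTH)                  # c/2B, ~0.75 cm here
wavelength = C / FREQUENCY                       # 1 mm at 0.3 THz
cross_range_res = wavelength * RANGE / APERTURE  # lambda*R/D, ~30 cm at 30 m

print(f"range resolution:       {100 * range_res:.2f} cm")
print(f"cross-range resolution: {100 * cross_range_res:.1f} cm at {RANGE:.0f} m")

For comparison, at 79 GHz (current automotive radar, roughly 4 mm wavelength) the same aperture would give a cross-range cell about four times coarser, which is the gain the short LTHz wavelength buys.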

The project is a collaboration between three academic institutions: the University of Birmingham, with its long-standing excellence in automotive radar research and radar technologies; the University of Edinburgh, with world-class expertise in signal processing and radar imaging; and Heriot-Watt University, with equivalent skill in video analytics, LiDAR and accelerated algorithms. The novel approach will be based on a fusion of video and radar images in a cross-learning cognitive process to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions, without dependency on navigation-assisting systems.

Description In this work, the three collaborating academic institutions - the University of Birmingham, the University of Edinburgh and Heriot-Watt University, whose complementary expertise is described above - are carrying out fundamental and applied research on novel multifunction, multimodal sensing technology, combining for the first time low-THz radar and video imagery for a new generation of automotive sensors intended for both driver assistance and autonomous driving. The approach fuses video and radar images in a cross-learning cognitive process to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions, without dependency on navigation-assisting systems.

First, we resolved many issues associated with calibration and registration across these different sensor modes. We also conducted (and are continuing) several trials using radar, video and LiDAR data to build datasets for our work on both scene mapping and recognition. We have evaluated a number of current techniques for actor (car, pedestrian, etc.) detection and classification from images, and have evaluated our tracking and optical flow software on video sequences, ranging from simple application of the traditional Kalman filter to more complex multi-target tracking algorithms using particle filters and Random Finite Set methods.
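As an illustration of the simplest of these trackers, the sketch below (Python) runs one predict/update cycle of a constant-velocity Kalman filter on a single target in image coordinates; the frame rate and noise covariances are illustrative assumptions, not tuned values from the project.

import numpy as np

DT = 0.04  # frame interval in seconds (assumes 25 fps video)

# Constant-velocity model: state [x, y, vx, vy], measurement [x, y].
F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = 0.1 * np.eye(4)  # process noise (assumed)
R = 4.0 * np.eye(2)  # measurement noise, pixels^2 (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; z is a measured [x, y] detection."""
    x, P = F @ x, F @ P @ F.T + Q     # predict
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (z - H @ x)           # update state with the detection
    P = (np.eye(4) - K @ H) @ P       # update covariance
    return x, P

The multi-target particle filter and Random Finite Set trackers generalise this single-target recursion to unknown, time-varying numbers of actors.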

At Heriot-Watt we have concentrated on three principal areas.

Image Fusion: radar, LiDAR and optical stereo sensing have complementary strengths and weaknesses. For example, radar has low resolution but can see very well in adverse conditions such as fog, snow and mist; LiDAR has very high resolution but, as it uses laser light, is severely degraded by adverse weather. The fusion work tries to combine information from all sensors and to adapt as image quality degrades. So far we have found that Random Finite Set algorithms can successfully update other vehicles' positions using data from all sensors adaptively and iteratively, but that this does not work well for dense scene mapping. We have therefore turned recently to dense regularisation techniques that combine 'point clouds' from stereo, LiDAR and radar to produce dense scene maps.
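As a sketch of what such dense regularisation can look like (an illustration under assumed inputs, not the project's actual formulation), the Python fragment below fuses registered per-sensor depth maps by minimising a confidence-weighted data term plus a first-difference smoothness penalty; the function name, sensor weights and regularisation strength are all assumptions.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def fuse_depth_maps(obs, weights, lam=1.0):
    """Fuse sparse, registered depth maps into one dense map (hypothetical helper).

    obs     : list of (H, W) arrays, NaN where a sensor has no return
    weights : per-sensor confidence scalars (e.g. down-weight LiDAR in fog)
    lam     : smoothness strength; solves
              argmin_d  sum_s w_s ||m_s (d - z_s)||^2 + lam ||grad d||^2
    """
    H, W = obs[0].shape
    n = H * W
    diag, rhs = np.zeros(n), np.zeros(n)
    for z, w in zip(obs, weights):
        m = np.isfinite(z).ravel()     # mask of valid returns
        diag[m] += w
        rhs[m] += w * z.ravel()[m]
    # First-difference operators across columns (Dx) and rows (Dy).
    Dx = sp.kron(sp.eye(H), sp.diags([-1, 1], [0, 1], shape=(W - 1, W)))
    Dy = sp.kron(sp.diags([-1, 1], [0, 1], shape=(H - 1, H)), sp.eye(W))
    A = sp.diags(diag) + lam * (Dx.T @ Dx + Dy.T @ Dy)
    return spsolve(A.tocsc(), rhs).reshape(H, W)

Adjusting the per-sensor weights as conditions change is one simple way to realise the adaptive behaviour described above.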

Object Recognition: Deep Convolutional Neural Networks (DCNNs) have been applied successfully to large-scale natural image data (although there are some caveats about their use). In this project we have used DCNNs to recognise objects in low-THz radar imagery, with mixed success. Success rates are very high in a laboratory environment, and we have also been able to generalise over different ranges, receivers and viewing angles using data augmentation and transfer learning. However, further work is required to extend these techniques to the real road environment.
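A minimal transfer-learning sketch (Python/PyTorch) in the spirit described, assuming single-channel radar image chips; the class count, learning rate and augmentation choices are illustrative assumptions, not the project's settings.

import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 5  # e.g. car, pedestrian, cyclist, animal, clutter (assumed)

# Start from an ImageNet-pretrained backbone and adapt it to radar chips.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Augmentation to help generalise over viewing angle (applied per batch
# here for simplicity; per-image augmentation belongs in the data loader).
augment = transforms.Compose([
    transforms.RandomRotation(10),
    transforms.RandomHorizontalFlip(),
])

optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One optimisation step on a batch of (N, 1, H, W) radar chips."""
    optimiser.zero_grad()
    loss = criterion(model(augment(images)), labels)
    loss.backward()
    optimiser.step()
    return loss.item()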

Behaviour prediction: for autonomous cars to function, they must be able to predict the future behaviour of other vehicles in the road network. In the last year we have started to develop prediction strategies based on sensor data, using knowledge of highway rules together with statistical and worst-case trajectories for other vehicles, in order to steer a safe path.
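As an illustration of the worst-case element of such a strategy (a sketch under an assumed bounded-acceleration, straight-line model, not the project's method), the Python fragment below brackets another vehicle's future longitudinal position around its constant-velocity prediction.

import numpy as np

def worst_case_envelope(p0, v0, a_max, horizon, dt=0.1):
    """Bound a vehicle's future 1-D position under |acceleration| <= a_max.

    Returns the time grid, the nominal constant-velocity prediction, and
    the worst-case lower/upper position bounds at each step.
    """
    t = np.arange(0.0, horizon + dt, dt)
    nominal = p0 + v0 * t            # statistical / constant-velocity path
    spread = 0.5 * a_max * t ** 2    # maximum deviation either way
    return t, nominal, nominal - spread, nominal + spread

A planner can then keep the ego vehicle's path outside the returned interval at every time step, trading conservatism against the tightness of the assumed acceleration bound.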
Exploitation Route Driver assistance and full vehicle autonomy both depend on fast and accurate interpretation of sensor data. If successful, the project outcomes should be taken forward by our partner, Jaguar Land Rover, in developing premium cars. Results will also be made more widely available through publication and other dissemination.
Sectors Aerospace, Defence and Marine; Environment; Transport

URL https://sites.google.com/a/jaguarlandrover.com/tascc
 
Description Presentation to EPSRC Theme Meeting on Robotics and Autonomous Systems (January)
Geographic Reach Europe 
Policy Influence Type Gave evidence to a government review
URL https://www.epsrc.ac.uk/newsevents/pubs/epsrc-delivery-plan-2016-17-2019-20/
 
Title Volkswagen Transporter Mobile Laboratory 
Description We have equipped an existing Robotarium VW Transporter van owned by Heriot-Watt University with a Navtech radar, a TI radar, a ZED stereo camera, a Velodyne LiDAR, and GPS and IMU equipment. This enables us to collect fully synchronised, time-stamped sensor data over long periods (several hours) while driving the van on normal urban and rural roads, dual carriageways and motorways; a sketch of the timestamp alignment involved follows this record. We are using this facility to create substantial datasets for our work on multi-modal fusion, object recognition and behaviour prediction from a mobile platform. To date it has been used only within Heriot-Watt University by the TASCC project, but it is available to the collaborating institutions on TASCC and to other researchers at HWU, specifically in the Robotics CDT (EPSRC).
Type Of Material Improvements to research infrastructure 
Year Produced 2019 
Provided To Others? No  
Impact As it has only been available since December 2018, we have collected considerable data but do not yet have substantial processing results. 
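As a sketch of the kind of timestamp alignment such a multi-sensor rig needs (an illustration, not the van's actual logging software), the Python fragment below matches each reference-sensor timestamp to the nearest sample of a second stream within a tolerance.

import numpy as np

def align_streams(t_ref, t_other, tol=0.05):
    """Index of the nearest t_other sample for each t_ref time (hypothetical).

    t_ref, t_other : sorted 1-D timestamp arrays in seconds (>= 2 samples)
    tol            : maximum allowed mismatch; unmatched entries get -1
    """
    idx = np.clip(np.searchsorted(t_other, t_ref), 1, len(t_other) - 1)
    nearest = np.where(t_ref - t_other[idx - 1] < t_other[idx] - t_ref,
                       idx - 1, idx)
    ok = np.abs(t_other[nearest] - t_ref) <= tol
    return np.where(ok, nearest, -1)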
 
Description Deep Dive 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact This consisted of a series of oral presentations and poster displays to an audience of Jaguar Land Rover personnel and other university researchers on the TASCC programme. The intention was to go into considerable technical detail on the topics covered, including low-THz radar design, Doppler beam sharpening, height imaging using radar, radar-optical fusion, and recognition of actors in radar and other image data. Personnel from HWU gave talks on fusion and object recognition.
Year(s) Of Engagement Activity 2018
 
Description Sharefair Exhibition at Motor Museum 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Approximately 150 participants from Jaguar Land Rover (JLR), a number of UK universities and a few media personnel attended an event to showcase the outcomes of the EPSRC/TASCC programme, as a technology transfer activity within the joint EPSRC/JLR programme. Several members of JLR staff expressed interest in our work, and as a result we have agreed a number of further collaborative activities, including investigation of flood sensing for cars and methods of approximate computing to save power in future electric cars.
Year(s) Of Engagement Activity 2017
 
Description Visual Saliency and Autonomous Vehicles Workshop 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact A workshop was held in March 2016 as part of the EPSRC Network for Biological and Computer Vision. The title of the workshop was Vision in Human and Machine. I gave a talk on Visual Saliency and Autonomous Vehicles.
Year(s) Of Engagement Activity 2016
URL http://www.viihm.org.uk