TASCC: Pervasive low-TeraHz and Video Sensing for Car Autonomy and Driver Assistance (PATH CAD)

Lead Research Organisation: Heriot-Watt University
Department Name: School of Engineering and Physical Sciences

Abstract

This project combines novel low-THz (LTHz) sensor development with advanced video analysis, fusion and cross-learning. Using the two streams integrated within the sensing, information and control systems of a modern automobile, we aim to map terrain and identify hazards such as potholes and surface texture changes in all weathers, and to detect and classify other road users (pedestrians, cars, cyclists, etc.).

The coming era of autonomous and assisted driving demands new all-weather sensing technology. Advanced interaction between the sensed and processed data, the control systems and the driver can deliver autonomy in decision and control while ensuring the driver has all the information needed to intervene in critical situations. The aims are to improve road safety through increased situational awareness, and to increase energy efficiency by reducing pollutant emissions caused by poor control and resource use in both on- and off-road vehicles.

Video cameras remain at the heart of our system, for many reasons: low cost, availability, high resolution, a large legacy of processing algorithms to interpret the data, and driver/passenger familiarity with the output. However, it is widely recognised that video and other optical sensors such as LiDAR (cf. the Google car) are not sufficient. The same conditions that challenge human drivers, such as heavy rain, fog, spray, snow and dust, limit the capability of electro-optical sensors. We require a new approach.

The key second sensor modality is a low-THz radar system operating within the 0.3-1 THz frequency spectrum. By its very nature radar is robust to the conditions that limit video, but it is the relatively short wavelength and wide bandwidth of this LTHz radar, relative to existing automotive radar systems, that bring key additional capabilities. This radar has the potential to provide: (i) imagery that is closer to familiar video than that provided by a conventional radar, and hence can begin to exploit the vast legacy of image processing algorithms; (ii) significantly improved across-road image resolution, leading to correspondingly significant improvements in the detection and classification of vehicles, pedestrians and other 'actors' (cyclists, animals etc.); (iii) 3D images that can highlight objects and act as an input to the guidance and control system; (iv) radar image features, such as shadows and image texture, that will contribute to both classification and control.
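To make the wavelength and bandwidth argument concrete, the minimal sketch below (in Python) computes the two standard radar resolution figures. The carrier frequency, bandwidth, aperture size and range used here are illustrative assumptions for the 0.3-1 THz band, not PATH CAD design parameters.

# Illustrative resolution figures for a wideband low-THz radar.
# All parameters below are assumptions for illustration, not design values.

C = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    # Range resolution of a wideband radar: delta_R = c / (2 * B).
    return C / (2.0 * bandwidth_hz)

def cross_range_resolution(freq_hz, aperture_m, range_m):
    # Real-aperture cross-range resolution: approximately R * lambda / D.
    wavelength = C / freq_hz
    return range_m * wavelength / aperture_m

# Assumed example: 300 GHz carrier, 20 GHz bandwidth, 10 cm antenna, 20 m range.
print("range resolution: %.2f cm" % (range_resolution(20e9) * 100))                     # 0.75 cm
print("cross-range at 20 m: %.2f cm" % (cross_range_resolution(300e9, 0.1, 20) * 100))  # 20.00 cm

With these assumed numbers, sub-centimetre range cells and a cross-range resolution of roughly 20 cm at 20 m follow directly from delta_R = c/2B and the lambda/D beamwidth, which is the basis of the across-road resolution claim in point (ii).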

The project is a collaboration between three academic institutions: the University of Birmingham, with its long-standing excellence in automotive radar research and radar technologies; the University of Edinburgh, with world-class expertise in signal processing and radar imaging; and Heriot-Watt University, with equivalent skill in video analytics, LiDAR and accelerated algorithms. The novel approach will be based on a fusion of video and radar images in a cross-learning cognitive process, improving the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions without dependency on navigation-assisting systems.

 
Description In this work, the three institutions named above - the University of Birmingham (automotive radar research and radar technologies), the University of Edinburgh (signal processing and radar imaging) and Heriot-Watt University (video analytics, LiDAR and accelerated algorithms) - are carrying out fundamental and applied research on novel multifunction, multimodal sensing technology, combining low-THz radar and video imagery for the first time in a new generation of automotive sensors intended for both driver assistance and autonomous driving. The approach is based on a fusion of video and radar images in a cross-learning cognitive process to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions without dependency on navigation-assisting systems.

The project is at an early stage. To date, the key work at Heriot-Watt University has been in video analytics, that is, the processing and interpretation of video data for the purposes of fusion, scene mapping and actor recognition. First, we have resolved many of the issues associated with calibration and registration across these different sensor modes, and we have conducted (and are continuing) several trials using radar, video and LiDAR data to build a dataset for future work in both scene mapping and recognition. Second, we have evaluated a number of current techniques for actor (car, pedestrian, etc.) detection from images. Deep Neural Networks (DNNs) have shown very promising results in applications where large, labelled datasets are available, and could potentially be used for the video streams at least; however, there is little work on their application to forward-looking radar data, and it remains to be seen whether they are effective in our context. We have also evaluated examples of object recognition using "hand-crafted" feature sets. Finally, we have evaluated our tracking and optical flow software on these video sequences, ranging from simple application of the traditional Kalman filter (sketched below) through to more complex multi-target tracking algorithms using particle filters and Random Finite Set methods.
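As an illustration of that simplest baseline, here is a minimal single-target, constant-velocity Kalman filter run over per-frame detections in pixel coordinates. The frame rate, noise covariances and initialisation are assumptions chosen for the sketch, not the project's tuned configuration; the particle-filter and Random Finite Set trackers are substantially more involved.

# Minimal constant-velocity Kalman filter over per-frame detections.
# Matrices and noise levels are illustrative assumptions only.
import numpy as np

dt = 1 / 25  # assumed video frame period (25 fps)

# State x = [u, v, du, dv]: pixel position and velocity.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)   # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # we observe position only
Q = np.eye(4) * 1e-2                         # assumed process noise
R = np.eye(2) * 4.0                          # assumed measurement noise (px^2)

def kf_step(x, P, z):
    # One predict/update cycle given a detection z = [u, v].
    x = F @ x                           # predict state
    P = F @ P @ F.T + Q                 # predict covariance
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y                       # update state
    P = (np.eye(4) - K @ H) @ P         # update covariance
    return x, P

# Example: track initialised from a first detection at pixel (320, 240).
x = np.array([320.0, 240.0, 0.0, 0.0])
P = np.eye(4) * 10.0
for z in [np.array([322.0, 241.0]), np.array([324.5, 242.1])]:
    x, P = kf_step(x, P, z)
print(x[:2])  # filtered position estimate

In practice the measurements z would come from the detector (e.g. a DNN over video frames), and data association between detections and tracks is required before any multi-target extension; that is precisely where the particle-filter and Random Finite Set methods come in.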
Exploitation Route Driver assistance and full vehicle autonomy both depend on fast and accurate interpretation of sensor data. If successful, the project outcomes should be taken forward by our partner, Jaguar Land Rover, in developing premium cars. Results will also be made more widely available through publication and other dissemination.
Sectors Aerospace, Defence and Marine; Environment; Transport

URL https://portal.axillium.com/TASCC/programme
 
Description Presentation to EPSRC Theme Meeting on Robotics and Autonomous Systems (January)
Geographic Reach Europe 
Policy Influence Type Gave evidence to a government review
URL https://www.epsrc.ac.uk/newsevents/pubs/epsrc-delivery-plan-2016-17-2019-20/
 
Description Sharefair Exhibition at Motor Museum 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Approximately 150 participants from Jaguar Land Rover (JLR), a number of UK universities and a few media personnel attended an event showcasing the outcomes of the EPSRC/TASCC programme, as a technology transfer activity within the joint EPSRC/JLR programme. Several members of JLR staff expressed interest in our work and, as a result, we have agreed a number of further collaborative activities, including an investigation of flood sensing for cars and methods of approximate computing to save power in future electric cars.
Year(s) Of Engagement Activity 2017
 
Description Visual Saliency and Autonomous Vehicles Workshop 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact A workshop entitled "Vision in Human and Machine" was held in March 2016 as part of the EPSRC Network for Biological and Computer Vision. I gave a talk on Visual Saliency and Autonomous Vehicles.
Year(s) Of Engagement Activity 2016
URL http://www.viihm.org.uk