TASCC: Pervasive low-TeraHz and Video Sensing for Car Autonomy and Driver Assistance (PATH CAD)

Lead Research Organisation: University of Birmingham
Department Name: Electronic, Electrical and Computer Eng

Abstract

This project combines novel low-THz (LTHz) sensor development with advanced video analysis, fusion and cross-learning. Using the two streams integrated within the sensing, information and control systems of a modern automobile, we aim to map terrain and identify hazards such as potholes and surface texture changes in all weathers, and to detect and classify other road users (pedestrians, cars, cyclists, etc.).

The coming era of autonomous and assisted driving necessitates new all-weather technology. Advanced concepts of interaction between the sensed and processed data, the control systems and the driver can lead to autonomy in decision and control, while securing all the information the driver needs to intervene in critical situations. The aims are to improve road safety through increased situational awareness, and to increase energy efficiency by reducing the emission of pollutants caused by poor control and resource use in both on- and off-road vehicles.

Video cameras remain at the heart of our system for many reasons: low cost, availability, high resolution, a large legacy of processing algorithms to interpret the data, and driver/passenger familiarity with the output. However, it is widely recognized that video and/or other optical sensors such as LIDAR (c.f. Google car) are not sufficient. The same conditions that challenge human drivers, such as heavy rain, fog, spray, snow and dust, limit the capability of electro-optical sensors. We require a new approach.

The key second sensor modality is a low-THz radar system operating within the 0.3-1 THz frequency spectrum. By its very nature, radar is robust to the conditions that limit video. However, it is the relatively short wavelength and wide bandwidth of this LTHz radar with respect to existing automotive radar systems that can bring key additional capabilities. This radar has the potential to provide: (i) imagery that is closer to familiar video than that provided by conventional radar, and hence can begin to exploit the vast legacy of image processing algorithms; (ii) significantly improved across-road image resolution, leading to correspondingly significant improvements in vehicle, pedestrian and other 'actor' (cyclists, animals etc.) detection and classification; (iii) 3D images that can highlight objects and act as an input to the guidance and control system; (iv) analysis of radar image features, such as shadows and image texture, that will contribute to both classification and control.

The project is a collaboration between three academic institutions: the University of Birmingham, with its long-standing excellence in automotive radar research and radar technologies; the University of Edinburgh, with world-class expertise in signal processing and radar imaging; and Heriot-Watt University, with equivalent skill in video analytics, LiDAR and accelerated algorithms. The novel approach will be based on a fusion of video and radar images in a cross-learning cognitive process to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions without dependency on navigation-assisting systems.

Planned Impact

Within the European community, the total number of fatalities in road traffic accidents decreased by 61% between 1990 and 2012 (source: Eurostat Yearbook 2015), a welcome statistic driven by technology, including automatic braking and traction control systems, GPS, navigation and road hazard warning, and mechanical aids such as the seat belt and air bag. However, 90% of accidents are still caused by human error, so the next step towards greater safety and autonomous capability is through dynamic extra-sensory perception. Advanced prototype automobiles are, of course, equipped with radar, LiDAR and video sensors, each of which has complementary strengths, but their impact will always be limited: electro-optical sensors cannot function effectively in bad weather, and current radar systems operate only in two dimensions and do not give sufficient spatial resolution for situational awareness. The key impact of this proposal, which marries novel THz sensor development to advanced video analysis, fusion and cross-learning, is to enable detection and classification of road users, and resolution of detailed 3D structure and surface properties, for dynamic situational awareness in all weather conditions. This is essential for future deployment, because such systems must operate in unfavourable as well as fair conditions. The project has the potential to improve road safety in conventional automobiles, and also to enable autonomous situational awareness for both on- and off-road vehicles. Thus there are many potential beneficiaries of this research.

A significant beneficiary of this research will, of course, be the industrial sponsor Jaguar Land Rover (JLR). The main benefit that will accrue inside the partnership of JLR and this consortium is to provide JLR with a new world-leading sensing system that will ultimately enhance safety, efficiency and road capacity. To enable the launch of a new generation of truly "autonomous" vehicles operating in all weathers and all terrains, an innovative sensing system must be developed in a relatively compressed timeframe. The ready availability of equipment and expertise within JLR and this university consortium will enable rapid progress, within the timeframe and cost, and lay the foundation for a novel sensing system with extended capabilities that industry can exploit. Thus there is clear potential for exploitation by JLR itself. Additionally, there are opportunities for other companies, such as SMEs, that are active in taking innovative sensor/signal processing concepts to market.

The other industrial beneficiaries of this research include the large UK defence sector with interests in radar and sensing. This includes companies such as BAE Systems, Selex ES, QinetiQ, Thales UK and Roke Manor Research, as well as DSTL and the MoD itself. All-weather and all-terrain operation is particularly important in this sector. These companies actively exploit significant innovation in the civilian sector through the use of commercial off-the-shelf (COTS) technology, which has an economy of scale that is hard to achieve in the defence sector. The sensing system invented in this project would provide a unique set of capabilities for both autonomous and non-autonomous military vehicles. Commercial exploitation in the civilian sector will lead to integrated circuits and LTHz components that can be exploited by the defence sector for sensing and vehicle control.
 
Description 1. Developed multi-static interferometric and polarimetric radars operating at 300 GHz and 79 GHz in collaboration with ELVA-1, under our supervision and to UoB specifications.
2. Developed low-THz radar calibration methodologies, which were required to achieve high-resolution radar image quality.
3. Developed non-coherent radar target height reconstruction by time-difference-of-arrival (TDOA) methods using trilateration and backprojection, and developed a compressed-sensing approach. These methods were validated with our multistatic radars operating at 79 GHz and 300 GHz.
4. Developed a Doppler Beam Sharpening approach to provide synthetic refinement of radar azimuth resolution.
5. Developed synthetic aperture and compressed-sensing radar image enhancement algorithms (led by the University of Edinburgh).
6. Equipped a test vehicle with multiple test radar and optical systems for imagery and propagation channel characterization.
7. Created a unique radar video database from the test vehicle (79 GHz and 300 GHz), the first of its kind. Used as a testing kit and training data set for transfer and cross-learning and for target classification by CNNs and DNNs. The data set is currently available to the consortium partners, but once IP protection is lifted it will be shared in the public domain through the UoB repository.
8. Determined and experimentally confirmed the limitations of optical (stereo camera and LIDAR - Velodyne) and radar sensors due to attenuation in media with obscurants. Created a "fog chamber" and devised methods of characterizing fog density in order to characterize attenuation.
9. Demonstrated landmark fusion of video and THz-imagery and dense fusion of lidar and video with Heriot-Watt University and University of Edinburgh.
10. Developed and applied D-CNNs for classification of objects in radar imagery, including the application of transfer learning from other radar image data sets. Neither had previously been attempted for sub-THz wideband high-resolution radar imagery.
11. Investigated and produced the first theoretical and experimental output of novel manifold-based cross-learning techniques (led by the University of Edinburgh) between camera and radar sensors to enhance bad-weather imaging.
12. Developed a statistical radar image segmentation approach for surface ID classification for path planning using high resolution radar imagery.
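The non-coherent height reconstruction of item 3 can be illustrated with a minimal trilateration sketch: with two receivers at known, vertically separated heights and a slant-range measurement of the same target from each, the target height follows in closed form. This is only an illustrative toy, assuming an idealised two-receiver geometry with noiseless ranges; the receiver heights and target position below are invented numbers, not the project's actual configuration.

```python
import math

def height_by_trilateration(r1, r2, h1, h2):
    """Recover target height z and ground distance d from two slant
    ranges r1, r2 measured by receivers at heights h1, h2 (same ground
    position). From r_i^2 = d^2 + (z - h_i)^2, subtracting the two
    equations eliminates d and yields z in closed form."""
    z = (h1**2 - h2**2 - (r1**2 - r2**2)) / (2.0 * (h1 - h2))
    d = math.sqrt(r1**2 - (z - h1)**2)
    return z, d

# Synthetic check: target at 1.2 m height, 10 m down-range,
# receivers mounted at 0.5 m and 0.9 m (all hypothetical values).
r1 = math.sqrt(10.0**2 + (1.2 - 0.5)**2)
r2 = math.sqrt(10.0**2 + (1.2 - 0.9)**2)
z, d = height_by_trilateration(r1, r2, 0.5, 0.9)
print(round(z, 3), round(d, 3))  # → 1.2 10.0
```

In practice the ranges come from TDOA measurements corrupted by noise, so the project's backprojection and compressed-sensing formulations solve the over-determined problem across many receivers rather than this two-receiver closed form.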
Exploitation Route Publications of all the outcomes approved by the funder's IP protection teams are available online in top-ranking open-access journals and conference proceedings (or are in the process of being written/finalised at this stage). A unique radar video database (79 GHz and 300 GHz), the first of its kind, has been produced. It is currently only available to the consortium partners, but after IP protection it will be shared through a repository and available for anyone to utilise further. A demonstration using the THz/video/lidar sensor suite was developed for the ShareFair event organised by JLR and EPSRC for the wider TASCC consortia and JLR ADAS teams. The results have already been used to secure further funding with JLR and SMEs - see "Further Funding" - resulting in two Innovate UK awards.
Sectors Aerospace, Defence and Marine,Communities and Social Services/Policy,Digital/Communication/Information Technologies (including Software),Education,Electronics,Environment,Government, Democracy and Justice,Transport

 
Description In this project three leading universities - the University of Birmingham, the University of Edinburgh and Heriot-Watt University - have produced very significant research outcomes, first of all research publications and novel techniques. Probably the most notable achievements are the techniques developed for the use of sub-THz radar, Doppler Beam Sharpening and forward-looking SAR to enable very high-resolution imagery suitable for image segmentation and classification of objects within the scene. The project has addressed a wide scope of objectives, from analysis of the performance of sub-THz radar and electro-optical sensors in adverse weather conditions, through development of a cross-learning approach for video and radar data fusion, to, finally, development of image segmentation to enable automatic path planning based solely on high-resolution radar imagery. These works (through publications, developed techniques, methodologies and collected data, available through repositories) have given the team a leading position in automotive high-resolution sensing within the wider academic and industrial automotive radar community. There have been several prestigious invitations for keynote talks, for example the keynote at the Automotive Forum at EUMW 2021 (Gashinova), which typically features only industrial presenters from leading primes such as Bosch, Infineon, IMEC, etc. There have also been multiple invitations to present at focus sessions on sub-THz radar, and to date we have been asked by TPCs to organize workshops and focus sessions at the top IRS, EUMW and IEEE Radar conferences. The latest are a focus session at IRS on THz radar and a workshop proposal submitted to EUMW 2023 on THz characterization. Additionally, thanks to the established reputation in the area, I am now a regular domain chair for the automotive and civil applications tracks on the TPCs of EUMW and IEEE Radar conferences.
We are now recognized as a leading non-industrial automotive research group in the UK, specifically leading in automotive sub-THz radar development and in imaging automotive radar based on the principles of MIMO-SAR. As a result of our achievements in PathCAD we have established trust and strong research relationships with JLR, HORIBA MIRA and a number of SMEs, which led to two large-scale follow-on Innovate UK funded projects (CoreTEX and Cosmos, both completed in 2022). We are in the process of establishing a strategic partnership with ZF Automotive. Another EPSRC-funded follow-on project, on marine sensing, now benefits from expertise acquired in PathCAD, as well as from the models, techniques and bespoke equipment developed within PathCAD. Together with procured automotive radar toolkits, these are used in innovative work on high-resolution imagery for small agile marine platforms through the use of mm-wave and sub-THz sensors and MIMO-DBS techniques, which further advance the SAR processing developed in PathCAD. Based on our established reputation in sub-THz sensing, we receive regular invitations to participate in bids (programme grants and CDTs) by other universities. Several of our former PhD students and staff associated with the project have now moved to industry (JLR, Airbus, Continental, QinetiQ). Thanks to EPSRC, the expertise and experience accumulated through this project allow us to advance scientific methods, create skills and establish competitive leadership in high-resolution sub-THz sensing, with immediate application in the automotive, maritime and space domains.
First Year Of Impact 2019
Sector Aerospace, Defence and Marine,Digital/Communication/Information Technologies (including Software),Education,Electronics,Transport
Impact Types Societal

 
Description Invited to take part in NATO AVT-ET-194 Autonomous Mobility M&S Technical Committee
Geographic Reach Multiple continents/international 
Policy Influence Type Membership of a guideline committee
 
Description THz advisory board
Geographic Reach National 
Policy Influence Type Participation in a guidance/advisory committee
 
Description EPSRC strategic equipment - THz VNA and test facilities
Amount £1,143,093 (GBP)
Funding ID EP/P020615/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 09/2017 
End 08/2022
 
Description Innovate UK
Amount £2,700,000 (GBP)
Funding ID 104256 Cosmos 
Organisation Innovate UK 
Sector Public
Country United Kingdom
Start 01/2019 
End 12/2020
 
Description Innovate UK CCAV3
Amount £3,800,000 (GBP)
Funding ID Project number: 3348: Ref: TS/R017417/1 
Organisation Innovate UK 
Sector Public
Country United Kingdom
Start 07/2018 
End 01/2021
 
Description Multi-Dimensional ISAR Imagery From Space To Space
Amount £393,000 (GBP)
Funding ID DSTLX1000163770 
Organisation Ministry of Defence (MOD) 
Sector Public
Country United Kingdom
Start 10/2021 
End 04/2023
 
Title DBS for radar imaging for autonomous platforms 
Description To address azimuth refinement, an algorithm was created and validated in laboratory conditions to provide imagery through a synthetic azimuth refinement technique, Doppler Beam Sharpening. This has never been reported before for automotive imaging radar, and a paper is currently submitted to the IET RSN special issue on Car Autonomy. 
Type Of Material Computer model/algorithm 
Year Produced 2018 
Provided To Others? No  
Impact This advance will define a new mode of automotive radar operation to obtain high-resolution imagery and justify the benefits of the THz spectrum for automotive use, due to the extremely high improvement factor at such frequencies. 
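The core idea behind Doppler Beam Sharpening can be sketched briefly: for a platform moving at speed v, a stationary scatterer at azimuth θ returns a Doppler shift f_d = (2v/λ)·cosθ, so range-Doppler cells can be remapped to range-azimuth at a resolution much finer than the real antenna beamwidth. The sketch below is an illustrative toy under that standard forward-looking geometry; the carrier frequency and platform speed are assumed example values, not the project's parameters.

```python
import math

def doppler_to_azimuth(f_d, v, wavelength):
    """Map the Doppler shift of a stationary scatterer, observed from a
    platform moving at speed v (m/s), to its azimuth angle in radians.
    Forward-looking geometry: f_d = (2 * v / wavelength) * cos(theta)."""
    cos_theta = f_d * wavelength / (2.0 * v)
    # Clamp against floating-point overshoot before acos.
    return math.acos(max(-1.0, min(1.0, cos_theta)))

# Illustrative numbers: 300 GHz carrier (wavelength ~1 mm), 10 m/s platform.
wavelength = 3e8 / 300e9      # metres
v = 10.0                      # m/s
# Forward model: Doppler produced by a scatterer at 30 degrees azimuth.
f_d = (2 * v / wavelength) * math.cos(math.radians(30))
theta = math.degrees(doppler_to_azimuth(f_d, v, wavelength))
print(round(theta, 1))  # → 30.0
```

The very short wavelength at low-THz frequencies makes f_d large for a given v, which is why the azimuth improvement factor of DBS is especially high in this band.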
 
Title Database of 150 GHz Radar images 
Description Radar images of road features which are key to enabling autonomous driving, in terms of understanding the feasibility of robust recognition 
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
Impact University of Edinburgh and Heriot-Watt University, UK 
 
Title Multi-sensor data acquisition for attenuation studies 
Description A database of low-THz radar, lidar and stereo camera imagery collected in adverse weather conditions. Specifically an experimental fog chamber has been developed for measurement. 
Type Of Material Database/Collection of data 
Year Produced 2017 
Provided To Others? No  
Impact For the first time, the effect of precipitation on THz imagery has been experimentally confirmed and contrasted with that of electro-optical devices. Preliminary results were published at the IET Radar Conference in Belfast 
 
Title Radar videos (79 GHz and 300 GHz) from the moving car 
Description We have collected the stream of radar frames (radar videos) at 79 GHz and 300 GHz when mounted on a car in real road conditions. 
Type Of Material Database/Collection of data 
Year Produced 2017 
Provided To Others? No  
Impact This will be an invaluable tool for testing CNNs and DNNs on road imagery for road-object classification, similar to optical video datasets such as KITTI. There is no similar reported data set, and once IP protected it will become available to the wider public. 
 
Title Repository of labelled objects in high resolution radar images 
Description The use of the high-resolution radar proposed in this project led to a collection of images of targets of interest, which are used for classification of such objects in automotive radar data. It forms a training data set for CNN-based classifiers, which will enable the use of neural-network-based classifiers for path planning and collision avoidance. Data were collected in both controlled and uncontrolled environments. 
Type Of Material Database/Collection of data 
Year Produced 2019 
Provided To Others? Yes  
Impact It is ready to be released into the public domain within the next month. 
 
Description Innovate UK Award TS/S007172/1 CO-existence Simulation MOdeling of Radars for Self-driving - COSMOS - Partnership with JLR, Myrtle AI, White Horse Radar, NXP Glasgow 
Organisation Horiba
Department HORIBA MIRA
Country United Kingdom 
Sector Private 
PI Contribution UoB will lead the development of simulation approaches and models
Collaborator Contribution JLR will provide analysis of the L4/L5 scenarios, gap analysis, hardware and software, and the car. HORIBA MIRA is responsible for development of the simulation platform and for testing and validation
Impact Just started
Start Year 2019
 
Description Innovate UK Award TS/S007172/1 CO-existence Simulation MOdeling of Radars for Self-driving - COSMOS - Partnership with JLR, Myrtle AI, White Horse Radar, NXP Glasgow 
Organisation Jaguar Land Rover Automotive PLC
Department Jaguar Land Rover
Country United Kingdom 
Sector Private 
PI Contribution UoB will lead the development of simulation approaches and models
Collaborator Contribution JLR will provide analysis of the L4/L5 scenarios, gap analysis, hardware and software, and the car. HORIBA MIRA is responsible for development of the simulation platform and for testing and validation
Impact Just started
Start Year 2019
 
Description Innovate UK award on TS/R017417/1 "COgnitive REal-time SENsing SystEm for autonomous vehicle control (CORE SENSE)" - collaborative project led by large industry, include 2 SMEs and subcontractors. 
Organisation Jaguar Land Rover Automotive PLC
Department Jaguar Land Rover
Country United Kingdom 
Sector Private 
PI Contribution This project aims to develop new techniques to extract, fuse and interpret sensor data in real time in order to provide situational awareness with novel cognitive processing developed within the project. Due to our developments in novel cognitive sensing technology, the project will culminate in the demonstration of a JLR vehicle driving autonomously. The University of Birmingham will lead the development of a close-to-market 79 GHz cognitive radar, contributing expertise in radar beamforming, imaging and cognitive aspects, as well as experimentation for testing and validation.
Collaborator Contribution JLR lead the project and will provide inputs into the interface with the control system of the vehicle and innovative path planning algorithms, and will lead the demo WP. They will organize testing at the track in Gaydon. Their subcontracting will enable other partners to use the close-to-market chipset developed for automotive sensing.
Impact Project is at the set up stage, provisional start date is 1 July 2018
Start Year 2018
 
Description Innovate UK award on TS/R017417/1 "COgnitive REal-time SENsing SystEm for autonomous vehicle control (CORE SENSE)" - collaborative project led by large industry, include 2 SMEs and subcontractors. 
Organisation Myrtle Software Ltd
Country United Kingdom 
Sector Private 
PI Contribution This project aims to develop new techniques to extract, fuse and interpret sensor data in real time in order to provide situational awareness with novel cognitive processing developed within the project. Due to our developments in novel cognitive sensing technology, the project will culminate in the demonstration of a JLR vehicle driving autonomously. The University of Birmingham will lead the development of a close-to-market 79 GHz cognitive radar, contributing expertise in radar beamforming, imaging and cognitive aspects, as well as experimentation for testing and validation.
Collaborator Contribution JLR lead the project and will provide inputs into the interface with the control system of the vehicle and innovative path planning algorithms, and will lead the demo WP. They will organize testing at the track in Gaydon. Their subcontracting will enable other partners to use the close-to-market chipset developed for automotive sensing.
Impact Project is at the set up stage, provisional start date is 1 July 2018
Start Year 2018
 
Title A system for use in a vehicle 
Description A system for use in a vehicle 10 for determining the type of terrain 24 ahead of the vehicle is provided. The system comprises: a processor configured to receive sensor output data from a plurality of vehicle-mounted sensors 12, 22, including at least one radar sensor 22 and at least one acoustic sensor 12, each for receiving a reflected signal 20 from the terrain 24 ahead of the vehicle 10; and a data memory configured to store pre-determined data relating sensor output data, for the or each acoustic sensor 12 and the or each radar sensor 22, to a terrain type. The processor is configured to compare the sensor output data with the pre-determined data to determine an indication of the terrain type corresponding to the sensor output data. 
IP Reference GB2523092 
Protection Patent application published
Year Protection Granted 2015
Licensed Commercial In Confidence
Impact Currently is being commercialized at JLR
 
Title A system for use in a vehicle 
Description A system for use in a vehicle for determining an indication of the type of terrain in the vicinity of the vehicle is provided. The system comprises: means configured to receive sensor output data from at least one vehicle-mounted sensor 12, 14, 16, 22 which is configured to receive a reflected signal 20 from the terrain 24; means configured to calculate at least two parameters from the sensor output data; means configured to convert the at least two parameters to a data point for a cluster model comprising a plurality of clusters of pre-determined data points, wherein each cluster corresponds to a different terrain type; and means configured to define to which one of the clusters the data point belongs, so as to determine an indication of the terrain type. 
IP Reference GB2523091 
Protection Patent application published
Year Protection Granted 2015
Licensed Commercial In Confidence
Impact Currently is being commercialized
 
Title System for use in a vehicle 
Description A system, for profiling terrain 24 ahead of a vehicle 10, comprises receiving means such as antenna (58, fig 3) configured to receive sensor output data from a plurality of vehicle-mounted sensors, including at least one radar sensor 22 and at least one acoustic sensor 12, 16. Output data from the radar sensors 22 is used in a processor (32) to generate an image of the terrain 24 ahead of the vehicle 10, including any obstructions in the vehicle's path. Output data from the acoustic sensors 12, 16 is then used to enhance clarity of the image that has been detected using the radar sensors 22. Output data from the radar sensor 22 is converted from Frequency Domain (FD) data into Time Domain (TD) data using an Inverse Fourier Transform (IFT) algorithm. A turntable moves the sensors 12, 16, 22 angularly so that the sensors 12, 16, 22 transmit outputs at a plurality of different azimuthal angles. An HMI is connected to the processor (32) and alerts a driver to the presence and/or location of an obstruction. Reference is also made to a method and to a vehicle. 
IP Reference GB2523097 
Protection Patent application published
Year Protection Granted 2015
Licensed Yes
Impact A system, for profiling terrain 24 ahead of a vehicle is currently at the commercial stage of development by JLR.
 
Title Multi-receiver interferometric 300 GHz radar developed with ELVA-1 
Description Multi-receiver interferometric 300 GHz radar developed with ELVA-1 for 3D image reconstruction 
Type Of Technology Systems, Materials & Instrumental Engineering 
Year Produced 2016 
Impact This will be used for 3D imaging in this project 
 
Title Multi-receiver interferometric and polarimetric 79 GHz radar developed in collaboration with ELVA mm-wave division 
Description This is a unique custom-made multi-feature imaging radar system, which allows flexibility in FMCW waveform design. It is reconfigurable to enable multiple features and capabilities, such as improved classification of surfaces and targets by measuring the full polarimetric matrix, and allows object dimension estimation by means of two separated receivers. 
Type Of Technology Systems, Materials & Instrumental Engineering 
Year Produced 2017 
Impact Created radar video from a car, which will be used for CNN (convolutional neural network) and DNN (deep neural network) design and testing (first of all as a training data set, analogous to the likes of KITTI for classification in optical streams). This will eventually lead to the transfer/cross-learning proposed in the project. Once IP protected, the database will be made available to the wider community. It has already been demonstrated to the TASCC (https://www.epsrc.ac.uk/newsevents/news/jlrannouncesautonomousvehicalresearchprogramme/) consortia and JLR at the ShareFair event at the British Motor Museum 
 
Description Demonstration of technology at TASCC ShareFair in British Motor Museum 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Interactive demonstration of the technology developed during two years of the project to the TASCC (Towards Autonomy, Smart and Connected Control) consortia and Jaguar Land Rover. The demo included real-time radar, lidar and video acquisition with real-time processing to demonstrate the performance of the proposed sensor suite in detecting and classifying real targets. The radar sensor suite was mounted on a test vehicle, with a number of displays in the hall to relay visuals, streamed radar imagery and processed data to the audience.
Year(s) Of Engagement Activity 2018
 
Description Opening lecture to 6th formers for the Engineering Education Scheme 
Form Of Engagement Activity Participation in an open day or visit at my research institution
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Schools
Results and Impact It was the opening lecture of the EES scheme, aiming to attract students to STEM disciplines.
Year(s) Of Engagement Activity 2017
 
Description Prof M. Gashinova is chair of the MODEST group of the UK Radar Community (EMSIG) 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact The scope of this focus group within the EMSIG network is to bring together academics, industrialists, and potential end-users working on development of radar technologies intended to provide sensing and ultimately situational awareness at short ranges, from tens of cm to hundreds of meters.
Key players in global race for autonomy are actively seeking technologies that can deliver situational awareness and mission planning through imaging, detection, tracking, positioning, and classification by fusing diverse technologies, including machine learning and adaptive computing platforms to monitor and to respond intelligently to the changing environment.
The group will focus on identifying current challenges and gaps, defining prospective directions of research, successful strategies for collaboration, complementarity of expertise within the group, and knowledge exchange. One of the intentions of the group's activities is to form a recognized expert structure which can be involved in policymaking and regulatory activities within the UK and internationally.
Year(s) Of Engagement Activity 2018
 
Description article in the online technical newspaper - "the engineer" 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact In the article "Jaguar Land Rover leads £11m autonomous cars project" the whole programme was outlined and a brief interview with the project lead (Dr M. Gashinova) was given.
Year(s) Of Engagement Activity 2015
URL https://www.theengineer.co.uk/jaguar-land-rover-leads-11m-autonomous-cars-project/