Securing Connected and Autonomous Vehicles Against Adversarial Attacks: Time Series Anomaly Detection
Lead Research Organisation: UNIVERSITY COLLEGE LONDON
Department Name: Computer Science
Abstract
My first-year research project presents a systematic review of the current state of anomaly detection in connected and autonomous vehicles (CAVs), focusing on the methods used, how these models are trained, and how they are evaluated. My initial database search yielded 1911 articles, of which 226 were included in the review after rigorous screening and assessment. The review revealed that the most commonly used AI algorithms for anomaly detection were long short-term memory, convolutional neural network, autoencoder, one-class support vector machine, and support vector machine. Furthermore, anomaly detection models were predominantly trained using real-world data. However, attacks and faults were rarely drawn from real-world data; instead, they were typically randomly injected or simulated. Finally, the top five evaluation metrics were recall, accuracy, precision, F1-score, and false positive rate, with accuracy the most common standalone measure when looking at the selection of metrics used in each paper.
The review also presents several recommendations. First, there is a need to incorporate multiple evaluation metrics to provide a comprehensive assessment of anomaly detection models. Second, data descriptions need to be uniform across papers to facilitate meaningful comparisons and benchmarking between studies. Third, the data collection date needs to be reported to provide temporal context in a rapidly evolving field. Fourth, the review revealed that only a small proportion of studies open-source their models; there should be a drive to share models publicly to facilitate collaboration in the research community. Finally, the vehicle's autonomy level should be specified to clarify the level of complexity involved.
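As an illustration of the first recommendation, the short Python sketch below reports all five of the most commonly used metrics from a single confusion matrix; the counts are made-up placeholders rather than results from any reviewed study.

```python
# Minimal sketch: report the five metrics the review found most common
# (recall, accuracy, precision, F1-score, false positive rate) together,
# rather than accuracy alone. Counts are illustrative placeholders.
def detection_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    recall = tp / (tp + fn)                      # attacks caught
    precision = tp / (tp + fp)                   # alerts that were real attacks
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall correctness
    f1 = 2 * precision * recall / (precision + recall)
    fpr = fp / (fp + tn)                         # normal behaviour falsely flagged
    return {"recall": recall, "accuracy": accuracy, "precision": precision,
            "f1_score": f1, "false_positive_rate": fpr}

print(detection_metrics(tp=90, fp=20, tn=870, fn=10))
```

Reporting the full set in this way makes it harder for a high accuracy score to hide a poor recall or a high false positive rate, which is the motivation behind the recommendation.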
Building on my systematic review of anomaly detection for connected and autonomous vehicles (CAVs), my PhD thesis will aim to create a time series anomaly detection model for CAVs. In particular, my project will focus on the security of CAVs rather than safety, securing the vehicle against adversarial attacks as opposed to faults. Adversarial attacks in the image domain have received the most attention and research, whereas time series adversarial attacks remain under-explored. To check for anomalies, the detection system monitors sensor inputs and the controller area network (CAN) bus log using machine learning models. Anomaly detection systems are designed to find irregularities introduced by attackers: once the model has learned how the CAV typically behaves, it can alert the user to a probable system problem when unusual behaviour appears.
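As a rough illustration of this detection pattern, the sketch below trains a small LSTM autoencoder on attack-free windows of sensor/CAN signals and flags windows whose reconstruction error exceeds a simple threshold. It assumes PyTorch is available; the window length, signal count and 3-sigma threshold rule are illustrative assumptions, not methods taken from the reviewed studies.

```python
# Minimal sketch of a reconstruction-based time series anomaly detector
# for CAN/sensor data (assumes PyTorch; data and threshold are illustrative).
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    """Encode a window of sensor/CAN readings and reconstruct it."""
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.output = nn.Linear(hidden, n_features)

    def forward(self, x):                                # x: (batch, time, features)
        _, (h, _) = self.encoder(x)                      # summarise the window
        z = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)   # repeat latent state over time
        dec, _ = self.decoder(z)
        return self.output(dec)                          # reconstructed window

def train(model, normal_windows, epochs=20, lr=1e-3):
    """Fit on attack-free driving data so reconstruction error models 'normal'."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(normal_windows), normal_windows)
        loss.backward()
        opt.step()
    return model

def flag_anomalies(model, windows, threshold):
    """Flag windows whose reconstruction error exceeds the threshold."""
    with torch.no_grad():
        err = ((model(windows) - windows) ** 2).mean(dim=(1, 2))
    return err > threshold

# Usage with synthetic stand-in data (real CAN logs would replace this):
normal = torch.randn(256, 50, 4)                         # 256 windows, 50 steps, 4 signals
model = train(LSTMAutoencoder(n_features=4), normal)
with torch.no_grad():
    base_err = ((model(normal) - normal) ** 2).mean(dim=(1, 2))
threshold = base_err.mean() + 3 * base_err.std()         # simple 3-sigma rule
alerts = flag_anomalies(model, torch.randn(16, 50, 4), threshold)
```

Training only on attack-free data mirrors the common semi-supervised set-up for this problem: the model learns what "normal" driving looks like, and anything it reconstructs poorly is treated as a potential attack.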
Planned Impact
The EPSRC Centre for Doctoral Training in Cybersecurity will train over 55 experts in multi-disciplinary aspects of cybersecurity, from engineering to crime science and public policy.
Short-term impacts are associated with the research outputs of the 55+ research projects that will be undertaken as part of the doctoral studies of CDT students. Each project will tackle an important cybersecurity problem and propose and evaluate solutions, interventions and policy options. Students will publish their findings in international peer-reviewed journals and also disseminate them through blog posts and material geared towards decision makers and experts in adjacent fields. Through industry placements relating to their projects, all students will have the opportunity to implement and evaluate their ideas within real-world organisations, achieving short-term impact in solving cybersecurity problems.
In the longer term, graduates of the CDT will assume leading positions within industry, government, law enforcement, the third sector and academia, increasing the UK's capacity to be a leader in cybersecurity. From those leadership positions they will assess options and formulate effective interventions to tackle cybercrime, secure the UK's infrastructure, establish norms of cooperation between industry and government to secure IT systems, and become leading researchers and scholars, further increasing the UK's capacity in cybersecurity in the years to come. The last impact is likely to be significant given that many higher education training programmes currently lack the capacity to provide cybersecurity training at undergraduate or graduate level, particularly in non-technical fields.
The full details of our plan to achieve impact can be found in the "Pathways to Impact" document.
Organisations
| People | ORCID iD |
|---|---|
| John Solaas (Student) | |
Studentship Projects
| Project Reference | Relationship | Related To | Start | End | Student Name |
|---|---|---|---|---|---|
| EP/S022503/1 | | | 31/03/2019 | 23/11/2028 | |
| 2726649 | Studentship | EP/S022503/1 | 25/09/2022 | 29/09/2027 | John Solaas |