Explainable Deep Learning for Temporal Situational Understanding

Lead Research Organisation: Cardiff University
Department Name: Computer Science

Abstract

Huge advances have been made in machine learning in recent years due to breakthroughs in deep neural networks, known as Deep Learning (DL). However, a key problem with DL approaches is that they are generally seen as "black boxes": while they may work well in particular applications, it is usually unclear how they work, leading to challenges in improving their performance when they fail, and to issues of user trust. There is consequently great interest in researching techniques that improve the interpretability of DL approaches, allowing DL systems to generate explanations of how they reached a decision. To be useful, such explanations need to be expressed in human-understandable terms, for example by identifying the image features that were significant in a classification decision, or by providing a brief textual description. They should also allow a human to take appropriate action upon receipt of the explanation: for example, satisfying transparency requirements for accountable decision-making or, in the event of a misclassification, providing useful information on how to improve the future performance of the DL system. The goal of this PhD is to make progress in this challenging area of DL, with a particular focus on situational understanding problems in which the DL system is intended to assist a human decision maker in domains such as emergency response, security and policing.
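To make the idea of feature-level explanations concrete, here is a minimal sketch of one widely used technique, gradient-based saliency, which highlights the input pixels that most influenced a classification decision. It is written in PyTorch; the small classifier and the random input image are illustrative placeholders, not part of the project itself.

import torch
import torch.nn as nn

# Hypothetical small image classifier, standing in for any trained DL model.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
)
model.eval()

# Dummy RGB image; requires_grad lets us attribute the decision to pixels.
image = torch.rand(1, 3, 32, 32, requires_grad=True)

logits = model(image)
predicted_class = logits.argmax(dim=1).item()

# Backpropagate the predicted class score to the input pixels.
logits[0, predicted_class].backward()

# Saliency map: gradient magnitude per pixel, taking the max over channels.
saliency = image.grad.abs().max(dim=1).values  # shape: (1, 32, 32)
print(saliency.shape)

Overlaying such a saliency map on the original image gives the kind of explanation described above: a human-readable indication of which image regions drove the decision.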
Situational understanding requires three key elements in terms of machine learning, all of which need to be explainable: (1) learning of temporal relationships, including prediction of likely future states (i.e., given the current situation, what is likely to happen next); (2) learning at multiple hierarchical scales, from detection of low-level objects to identification of high-level relationships; and (3) learning from the fusion of multiple data streams of different modalities (e.g., imagery, text, GPS). As an example, consider the problem of managing a large-scale event in a city centre, where streams of CCTV imagery, social media text and real-time location data may be used to predict potential overcrowding and consequent disruption. The objective of this PhD is to focus in particular on element (1): explainability of learned temporal relationships.
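As an illustration of element (1), the following minimal sketch, again in PyTorch, shows a recurrent model that predicts the next state of a data stream and then attributes that prediction back to individual time steps. The model, dimensions and data are assumed placeholders, not the project's actual method.

import torch
import torch.nn as nn

class NextStatePredictor(nn.Module):
    """Toy sequence model: predicts the next observation from past ones."""
    def __init__(self, n_features=4, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):
        out, _ = self.lstm(x)           # out: (batch, time, hidden)
        return self.head(out[:, -1])    # predict next state from last step

model = NextStatePredictor()
model.eval()

# Dummy stream: 1 sequence, 20 time steps, 4 features (e.g. crowd counts).
sequence = torch.rand(1, 20, 4, requires_grad=True)

prediction = model(sequence)

# Attribute the forecast back to the input: the gradient magnitude per
# time step is a simple temporal "explanation" of the prediction.
prediction.sum().backward()
relevance_per_step = sequence.grad.abs().sum(dim=2).squeeze(0)  # shape: (20,)
print("Most influential time step:", relevance_per_step.argmax().item())

The per-step relevance scores indicate which past observations most influenced the forecast, which is the kind of temporal explanation a human decision maker could inspect.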
This PhD will be carried out within the Distributed Analytics and Information Sciences International Technology Alliance (DAIS ITA), a collaboration between Cardiff University, IBM, Airbus, BAE Systems, University College London, the University of California, Los Angeles, and other UK and US partners; see https://dais-ita.org/. At Cardiff University, the PhD will be supervised by members of the Crime and Security Research Institute, the School of Computer Science and Informatics, and the School of Engineering's Sensors, Signals and Imaging Group.

Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
EP/R513003/1       -             -             01/10/2018  30/09/2023  -
2123459            Studentship   EP/R513003/1  01/10/2018  01/04/2022  Liam Hiley