Adaptive Verification of Autonomous Systems

Lead Research Organisation: University of Bristol
Department Name: Computer Science

Abstract

With the increasing degree of autonomy in systems, especially those incorporating machine
learning (ML), it is becoming crucial to develop methodologies for gaining confidence in the
safety of such systems and in their functional and behavioural correctness. One of the
challenges in developing autonomous systems is that full coverage cannot be achieved
during physical or virtual testing. As a consequence, the expected behaviour established for
these systems at design time may be violated during operation. External connectivity and ML
change the game entirely: the operating scope is unbounded, interactions are unknown, and
learned functions are unpredictable. New Design for Operation methods are needed to deal
with connectivity at scale and non-deterministic functions, and to provide failure 'Mitigation
Plans'. These Mitigation Plans may need to be designed (e.g. by retraining the ML
component) and deployed proactively and/or reactively. Through-life behaviour monitoring
may be required to handle unwanted states, fast-evolving threats, and uncertainties that
cannot be known in advance.
Continuous monitoring of the operating state can then be followed by assessing, devising
and deploying a Mitigation Plan that yields a 'Good Outcome'; each new Mitigation Plan
must be verified prior to its deployment. Overall, the challenge is to maintain safe operation
in the face of imminent and potential threats.
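The monitor-assess-mitigate cycle outlined above can be sketched roughly as follows. This is a minimal illustration, not the project's method: the `MitigationPlan` class, the scalar "operating state", the safe-range check, and the clamp action are all assumptions introduced here for clarity.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class MitigationPlan:
    """A candidate response to an unwanted operating state (illustrative)."""
    name: str
    action: Callable[[float], float]  # maps current state to a corrected state


def verify(plan: MitigationPlan, hazardous_states: List[float],
           safe_range: Tuple[float, float]) -> bool:
    """Verify a plan before deployment: its action must return every
    known hazardous state to the safe operating range."""
    lo, hi = safe_range
    return all(lo <= plan.action(s) <= hi for s in hazardous_states)


def monitor_step(state: float, safe_range: Tuple[float, float],
                 plans: List[MitigationPlan]) -> float:
    """One cycle of through-life monitoring: detect an unwanted state,
    then deploy the first mitigation plan that passes verification."""
    lo, hi = safe_range
    if lo <= state <= hi:
        return state  # expected behaviour at design time, nothing to do
    for plan in plans:
        if verify(plan, [state], safe_range):
            return plan.action(state)  # deploy the verified plan
    raise RuntimeError(f"no verified mitigation plan for state {state!r}")


# Usage: clamping to the safe range is a trivially verifiable mitigation.
clamp = MitigationPlan("clamp", lambda s: min(max(s, 0.0), 1.0))
print(monitor_step(1.7, (0.0, 1.0), [clamp]))  # prints 1.0
```

The key point the sketch captures is that verification happens at runtime, before each deployment, rather than only at design time.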

Publications


Studentship Projects

Project Reference   Relationship   Related To      Start        End          Student Name
EP/S513763/1                                       01/10/2018   24/04/2024
2343461             Studentship    EP/S513763/1    01/10/2019   31/12/2023   Abanoub Ghobrial