DATA-CENTRIC: Developing AccounTAble Computational ENgineering Through Robust InferenCe

Lead Research Organisation: University College London
Department Name: Mathematics

Abstract

DATA-CENTRIC will fundamentally transform modern computational engineering through the development of algorithms that are accountable. This means algorithms capable of quantifying the uncertainty arising from computation itself, delivering simulations that are more transparent and traceable and, at the same time, more efficient. Crucial decisions in science, engineering, healthcare and public policy rely on established methodologies such as the Finite Element Method and the Stochastic Finite Element Method. However, the models that inform such decisions suffer from an inevitable loss of accuracy due to sources of uncertainty that include, but are not limited to: a) the time and cost constraints of running modern high-fidelity computer models, b) the simplifying approximations necessary to translate mathematical models into computational models, and c) the limited numerical precision inherent to any computer system. Therefore, there is a continuous risk of relying on unverified computational evidence, and the path from modelling to decision-making can be obscured, inadvertently or otherwise, by the lack of accountability.
DATA-CENTRIC will solve this problem through Probabilistic Numerics, a framework that will enable decision-makers to monitor, diagnose and control the quality of computer simulations. Probabilistic Numerics treats computation as a statistical problem, thus enriching computation with a probabilistic measure of numerical error. This idea is gathering momentum, especially in the UK. However, theoretical developments are still in their early stages and, except for a few examples, the framework has not been applied to solve large-scale industrial problems. Consequently, it has not yet been adopted by industry. DATA-CENTRIC will bridge this gap. The proposed approach will provide radically new insights into the Finite Element Method and the Stochastic Finite Element Method. In particular, it will produce new solutions to industrial problems in Biomechanics and Robust Design. This has the potential to transform personalised medicine and high-value manufacturing and will open the door to new industrial applications.

Planned Impact

Within the ongoing data revolution, computer models have become omnipresent and affect virtually every aspect of economic, social and scientific activity. It is now the norm to make critical decisions on the basis of computations that rely on complex models and vast amounts of data. In this context, the proposed research will benefit any decision-maker who relies on complex models to inform decisions. Given the wide range of applications of engineering computations, in particular deterministic and stochastic Finite Element models, this Fellowship will actively engage with industry to deliver impact in Biomechanics and Robust Design. The impact in Biomechanics has the potential to advance the quest for truly personalised medicine by using clinical and imaging data to produce accountable computational models of biological tissue and organs. This will have scientific impact, as it will integrate and quantify the high degree of variability in geometric and material properties alongside numerical uncertainty. It will also have societal impact, since accountable models will better inform decisions taken by surgeons and manufacturing companies for diagnostic visualisation, surgical planning and the design of implants and surgical equipment. The impact in Robust Design, particularly in the aerospace industry, will allow for the quantification of uncertainty that propagates through the analysis from the design stage to the manufacturing stage, thus delivering transparency in the transition from research to innovation.
The Fellowship will not be restricted to the aforementioned industries. The theoretical developments and industrial experience will be disseminated, through a one-day workshop, to small, medium and large-sized companies, government bodies and charities. Tailored tutorials will be delivered on-site to the R&D departments of industrial partners. A three-day Study Group with industry will ensure that the wider community of industrial partners and academics can engage in discussion, develop solutions within the framework of the proposed research and identify best practice. A webinar will be recorded so that the research is accessible to industrial and academic stakeholders, as well as the general public.

Publications


Gong Z (2021) History matching with subset simulation in International Journal for Uncertainty Quantification

Hristov P (2019) Adaptive Gaussian process emulators for efficient reliability analysis in Applied Mathematical Modelling

 
Description One of the central topics within this award is rare-event simulation (RES), which is used to estimate probabilities of failure for complex physical systems. Our aim is to expand and improve RES so that it can also be used for uncertainty quantification (UQ), particularly in industrial settings. We have expanded the scope of RES by (a) proposing new algorithms that, by exploiting the sampling efficiency inherent to RES, make it possible to solve fundamental UQ problems such as model calibration and multi-fidelity modelling, and (b) engaging with industry to solve realistic UQ problems.
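
To make the core idea concrete, the following is a minimal, illustrative sketch of subset simulation, the standard rare-event estimator that underpins much of this work. The limit-state function, proposal scale and sample sizes are placeholders chosen for readability; they are not taken from any of the papers listed below.

import numpy as np

def g(x):
    # Hypothetical limit state: failure when g(x) <= 0, i.e. when the
    # sum of the standard-normal inputs exceeds 6 (roughly a 1e-5 event).
    return 6.0 - x.sum(axis=-1)

def subset_simulation(g, dim, n=2000, p0=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))            # level-0 Monte Carlo samples
    y = g(x)
    prob = 1.0
    for _ in range(50):                          # safety cap on the number of levels
        thresh = np.quantile(y, p0)              # intermediate failure threshold
        if thresh <= 0.0:                        # the true failure domain is reached
            return prob * np.mean(y <= 0.0)
        prob *= p0
        keep = y <= thresh                       # seeds for conditional sampling
        x, y = x[keep], y[keep]
        while len(x) < n:                        # modified Metropolis moves
            i = rng.integers(len(x))
            cand = x[i] + 0.8 * rng.standard_normal(dim)
            ratio = np.exp(0.5 * (x[i] ** 2 - cand ** 2))   # N(0,1) density ratio
            prop = np.where(rng.random(dim) < np.minimum(1.0, ratio), cand, x[i])
            yp = g(prop)
            if yp <= thresh:                     # accept a move within the level
                x, y = np.vstack([x, prop]), np.append(y, yp)
            else:                                # otherwise repeat the seed
                x, y = np.vstack([x, x[i]]), np.append(y, y[i])
    return prob * np.mean(y <= 0.0)

print(subset_simulation(g, dim=2))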

Firstly, we have published a paper on how to mitigate the cost of simulating rare events when the computational cost of the underlying model is prohibitively high. We have also provided strategies for selecting the model input configurations so that the cost mitigation is robust and efficient. The paper is: "Adaptive Gaussian process emulators for efficient reliability analysis" (2019) Applied Mathematical Modelling, 71, pp 138-151.
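
As a hedged illustration of this type of approach (not a reproduction of the paper's algorithm), the sketch below fits a Gaussian process emulator to a handful of runs of a stand-in limit-state function, adds training points where the sign of the prediction is most uncertain, and then estimates the failure probability on the emulator. The limit-state function, stopping rule and sample sizes are assumptions made purely for the example.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def limit_state(x):
    # Hypothetical "expensive" model: failure when limit_state(x) <= 0.
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

rng = np.random.default_rng(1)
pool = rng.standard_normal((5000, 2))            # Monte Carlo candidate population
train = pool[:10].copy()                         # small initial design
y = limit_state(train)

for _ in range(30):                              # adaptive enrichment loop
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(train, y)
    mu, sd = gp.predict(pool, return_std=True)
    u = np.abs(mu) / np.maximum(sd, 1e-12)       # U-type learning function
    if u.min() > 2.0:                            # sign of the prediction is settled
        break
    best = np.argmin(u)                          # most ambiguous candidate
    train = np.vstack([train, pool[best]])
    y = np.append(y, limit_state(pool[best:best + 1]))

gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(train, y)
pf = np.mean(gp.predict(pool) <= 0.0)            # failure probability on the emulator
print(f"estimated P(failure) ~ {pf:.4f} using {len(train)} model runs")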

A second paper, focused on the nascent field of probabilistic numerics, was published as a contributed discussion of a probabilistic numerical algorithm. The reference is: Contributed Discussion of "A Bayesian Conjugate Gradient Method" (2019) Bayesian Analysis, 14 (3), pp 937-1012. Note that this contributed discussion is published alongside the paper it discusses (doi:10.1214/19-BA1145). The topic of this paper links directly with one of the objectives of the award, which is bridging the gap between the use of RES as a pure sampling device and its use as a UQ method.
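
For readers less familiar with probabilistic numerics, the following is a minimal sketch of the underlying idea of a probabilistic linear solver, a simplified relative of the Bayesian conjugate gradient method rather than the published algorithm itself: place a Gaussian prior on the solution of Ax = b, condition on a few projections of the system, and read off a posterior covariance over the solution reflecting the directions the truncated solve has not yet explored. The matrix, prior and search directions are invented for illustration; with this naive prior the posterior spread is conservative, and choosing and calibrating the prior is a central question in this literature.

import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 8
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)                          # symmetric positive definite system
b = rng.standard_normal(n)

x0 = np.zeros(n)                                 # prior mean for the solution
Sigma0 = np.eye(n)                               # prior covariance for the solution
r0 = b - A @ x0

# Orthonormal Krylov-type search directions built from the initial residual.
S = np.empty((n, m))
v = r0.copy()
for j in range(m):
    for i in range(j):
        v = v - (S[:, i] @ v) * S[:, i]          # Gram-Schmidt orthogonalisation
    S[:, j] = v / np.linalg.norm(v)
    v = A @ S[:, j]

# Gaussian conditioning on the linear observations S^T A x = S^T b.
K = S.T @ A @ Sigma0 @ A.T @ S
x_post = x0 + Sigma0 @ A.T @ S @ np.linalg.solve(K, S.T @ r0)
Sigma_post = Sigma0 - Sigma0 @ A.T @ S @ np.linalg.solve(K, S.T @ A @ Sigma0)

x_true = np.linalg.solve(A, b)
print("actual error norm        :", np.linalg.norm(x_true - x_post))
print("posterior spread (trace) :", np.sqrt(np.trace(Sigma_post)))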

To expand the range of application of RES within UQ, we have proposed a new calibration framework. The algorithm enhances the capabilities of History Matching, a framework for calibrating expensive computer models. The paper has been published as: "History matching with subset simulation" (2021) International Journal for Uncertainty Quantification, Volume 11, Issue 5.
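
To give a flavour of what the framework computes, the sketch below evaluates the standard implausibility measure at the heart of history matching on a toy simulator and rules out candidate inputs using the conventional cut-off of three. The observation, variances and simulator are all invented for the example; the paper's contribution is to reach the (often very small) non-implausible region with subset simulation rather than the brute-force sampling used here.

import numpy as np

def model(theta):
    # Hypothetical cheap stand-in for an expensive simulator.
    return theta[:, 0] ** 2 + np.sin(theta[:, 1])

z_obs = 1.3            # observed system value
var_obs = 0.05 ** 2    # observation-error variance
var_disc = 0.1 ** 2    # model-discrepancy variance (assumed known here)

rng = np.random.default_rng(0)
theta = rng.uniform(-3.0, 3.0, size=(100_000, 2))     # candidate input settings

# Implausibility: standardised distance between prediction and observation.
impl = np.abs(model(theta) - z_obs) / np.sqrt(var_obs + var_disc)
non_implausible = theta[impl <= 3.0]                  # conventional cut-off of 3

print(f"non-implausible fraction: {len(non_implausible) / len(theta):.4f}")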

We have also expanded the range of application of RES by proposing a new framework for multi-fidelity simulation. Through this work, we solved a problem in the robust design of a wind turbine blade. The paper was co-authored with researchers from General Electric. The details of the paper are: "Robust optimisation of computationally expensive models using adaptive multi-fidelity emulation" (2021) Applied Mathematical Modelling, 100, pp 92-106.
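
For illustration only, the sketch below shows a much-simplified additive multi-fidelity emulator, not the paper's adaptive scheme: a cheap low-fidelity model is corrected by a Gaussian process fitted to a handful of high-fidelity runs, and the corrected emulator is then used to evaluate a robust mean-plus-two-standard-deviations objective under input scatter. Both models and all settings are assumptions made for the example.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_lo(x):          # cheap, biased approximation
    return np.sin(3 * x) + 0.3 * x

def f_hi(x):          # expensive "truth", only sparsely evaluated
    return np.sin(3 * x) + 0.3 * x ** 2

rng = np.random.default_rng(0)
x_hi = np.linspace(0.0, 2.0, 6)[:, None]              # few high-fidelity runs
gp = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True)
gp.fit(x_hi, f_hi(x_hi[:, 0]) - f_lo(x_hi[:, 0]))      # GP on the discrepancy

def f_mf(x):          # multi-fidelity prediction: low fidelity plus correction
    return f_lo(x) + gp.predict(x[:, None])

# Robust objective: mean + 2*std of the response under input (manufacturing) scatter.
designs = np.linspace(0.1, 1.9, 50)
robust = []
for d in designs:
    xs = d + 0.05 * rng.standard_normal(500)
    ys = f_mf(xs)
    robust.append(ys.mean() + 2.0 * ys.std())
print("robust optimum at design =", designs[int(np.argmin(robust))])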

Last but not least, we have also proposed a new RES framework for models with probabilistic output. This is particularly important for the estimation of probabilities of failure of complex systems. The algorithm quantifies the numerical uncertainty that arises from using a low-fidelity model and can deliver substantial computational savings in industrial applications. The paper is in its final stages of writing and will be submitted shortly. (Update from March 2023: The paper has been submitted and has gone through two rounds of replies to reviewers. It is currently under review.)
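
As a toy illustration of the underlying issue, and emphatically not the algorithm in the paper under review: when the model's output is probabilistic, for instance the predictive distribution of an emulator, the failure probability itself becomes a random quantity. The numbers and the crude, fully correlated error model below are assumptions made purely to show the effect.

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((20_000, 2))

# Hypothetical emulator of a limit state g(x), with failure when g(x) <= 0:
# a predictive mean and a constant predictive standard deviation.
mu = 2.5 - x.sum(axis=1)
sd = 0.3

pf_draws = []
for _ in range(200):
    eps = rng.standard_normal()        # one fully correlated draw of emulator error
    pf_draws.append(np.mean(mu + sd * eps <= 0.0))
pf_draws = np.array(pf_draws)

print(f"P(failure): mean {pf_draws.mean():.4f}, "
      f"95% interval [{np.quantile(pf_draws, 0.025):.4f}, "
      f"{np.quantile(pf_draws, 0.975):.4f}]")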

To complement the above paper, software has been produced. It provides a detailed implementation of the results contained therein, constitutes the first implementation of probabilistic subset simulation, and is now openly available to other researchers and to industry.
Exploitation Route The rare-event simulation framework that we are developing can be seamlessly applied to solve three problems: (1) model parameter updating (Bayesian inference); (2) calibration of expensive computer models; (3) estimation of probabilities of failure. These three problems are fundamental in science and industry. The practical applications range from designing more resilient engineering systems, to building computational models that are better calibrated against experimental data and hence more reliable, to making robust inferences in statistical modelling.

I have now joined the Clinical Operational Research Unit at UCL, where I have started working with hospitals. My colleagues and I will develop models of the respiratory system, which will inevitably be computationally expensive. The techniques developed through the research on this grant will be fundamental in calibrating these models. The available data come from the intensive care units of UCL Hospital and Great Ormond Street Hospital. Being able to calibrate these models will give clinicians robust tools for making decisions based on evidence.
Sectors Aerospace, Defence and Marine

Energy

Healthcare

Manufacturing, including Industrial Biotechnology

URL https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/S001476/2
 
Description We have proposed a new framework for multi-fidelity simulation. Through this work, we solved a problem in the robust design of a wind turbine blade. We published a paper, co-authored with researchers from General Electric ("Robust optimisation of computationally expensive models using adaptive multi-fidelity emulation" (2021) Applied Mathematical Modelling, 100, pp 92-106). We expect to collaborate further with GE so that our algorithm can be integrated into their in-house software.
First Year Of Impact 2021
Sector Aerospace, Defence and Marine
Impact Types Economic

 
Description DSTL - PDRA Statistical Analysis
Amount £94,234 (GBP)
Organisation Defence Science & Technology Laboratory (DSTL) 
Sector Public
Country United Kingdom
Start 01/2019 
End 11/2022
 
Title psus 
Description The software accompanies the paper, currently under review, "Subset simulation for probabilistic computer models". It is a result of the work done by postdoctoral researcher Petar Hristov and PI Alejandro Diaz, for the EPSRC-funded project DATA-CENTRIC: Developing AccounTAble Computational ENgineering Through Robust InferenCe. 
Type Of Technology Software 
Year Produced 2022 
Open Source License? Yes  
Impact This software is the first implementation of probabilistic subset simulation. It enables rare-event simulation based on probabilistic numerical principles. This means that a measure of uncertainty stemming solely from computation can now be attached to reliability analysis, optimisation and model calibration. 
URL https://github.com/PeterHristov/psus