Fast Bayesian Inference at Extreme Scale
Lead Research Organisation:
University of Bristol
Department Name: Mathematics
Abstract
This research aims to advance the state of the art in AI methodologies that furnish both accurate predictions and high-quality uncertainty representations for those predictions. The statistical family of Bayesian inference algorithms is well placed to deliver both requirements, but more needs to be done to extend these algorithms to 'extreme scale' whilst maintaining high accuracy and well-calibrated measures of uncertainty.
Two notable families of Bayesian inference algorithms are Gaussian processes (GPs) and Bayesian neural networks (BNNs). GPs provide well-principled uncertainty representations but in their exact form do not scale well, although there has been good recent progress in this direction using novel distributed linear algebra implementations at the million-data-point level. This research is geared toward applications with on the order of a million data points or more, which is where an exact GP currently struggles. To deal with these scaling difficulties, various approximations to GPs have been developed, all involving a trade-off between quality of prediction and speed of implementation.
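The scaling difficulty of exact GPs can be made concrete with a minimal textbook implementation (this is an illustrative sketch, not the project's code; the kernel, hyperparameter values, and function names are all assumptions). The Cholesky factorisation of the n-by-n kernel matrix costs O(n^3) time and O(n^2) memory, which is exactly the bottleneck at the million-data-point scale:

```python
import numpy as np

def gp_posterior(X_train, y_train, X_test, lengthscale=1.0, noise=1e-2):
    """Exact GP regression with an RBF kernel (textbook sketch)."""
    def rbf(A, B):
        # Squared Euclidean distances, then RBF covariance.
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-0.5 * sq / lengthscale**2)

    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)          # O(n^3) time, O(n^2) memory bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_s = rbf(X_train, X_test)
    mean = K_s.T @ alpha               # posterior predictive mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf(X_test, X_test) - v.T @ v)  # posterior variance
    return mean, var

# Usage: fit noisy sine data and predict at x = 0.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
mu, var = gp_posterior(X, y, np.array([[0.0]]))
```

Approximate GPs (inducing points, random features, and so on) trade accuracy of this posterior for a lower cost than the cubic factorisation above.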
Setting aside scaling difficulties, the effective uncertainty representations provided by GPs are extremely advantageous. The situation is reversed for deep neural networks: if we set aside any requirement for uncertainty measures, deep neural networks scale very well to extreme scale, but reliable uncertainty representations are much harder to come by with deep learning, even with Bayesian neural networks (BNNs).
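To illustrate why BNN uncertainty is typically approximate rather than exact, one common strategy is to average many stochastic forward passes (e.g. Monte Carlo dropout) and read uncertainty off the spread of predictions. The toy network below is entirely hypothetical (random weights, made-up sizes) and shows only the mechanism, not the project's method:

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_forward(x, n_samples=100, dropout=0.5):
    """Toy one-hidden-layer net with dropout kept on at test time.

    Each forward pass samples a different dropout mask, so repeated
    passes yield a distribution over predictions; its mean and standard
    deviation serve as an approximate predictive mean and uncertainty.
    """
    W1 = rng.normal(size=(1, 32))
    W2 = rng.normal(size=(32, 1)) / 32
    preds = []
    for _ in range(n_samples):
        h = np.maximum(x @ W1, 0)                # ReLU hidden layer
        mask = rng.random(h.shape) > dropout     # test-time dropout mask
        preds.append((h * mask / (1 - dropout)) @ W2)
    preds = np.array(preds)
    return preds.mean(0), preds.std(0)           # mean and spread

mu, sigma = stochastic_forward(np.array([[1.0]]))
```

Such Monte Carlo schemes scale like a standard network, but the quality of the resulting uncertainty estimates is harder to guarantee than a GP posterior, which is the gap this research targets.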
Given this background, the research aim is to extend the state of the art in large-scale Bayesian inference algorithms, aiming for: (1) improvements in speed on data sets of size one million plus (with good scalability characteristics beyond); (2) improvements in reliability, both in terms of prediction accuracy and the provision of well-calibrated uncertainty representations; and (3) user-friendly tuning to strike an optimised balance between model performance and training/testing costs as the data-set size increases.
People

Name | ORCID iD
---|---
Anthony Stephenson (Student) |
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name
---|---|---|---|---|---
EP/V519650/1 | | | 30/09/2020 | 29/09/2027 |
2448776 | Studentship | EP/V519650/1 | 30/09/2020 | 29/09/2024 | Anthony Stephenson |