CHAI - EPSRC AI Hub for Causality in Healthcare AI with Real Data

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Engineering

Abstract

The current AI paradigm at best reveals correlations between model input and output variables. This falls short of addressing health and healthcare challenges where knowing the causal relationship between interventions and outcomes is both necessary and desirable. In addition, biases and vulnerabilities arise in AI systems because models may pick up unwanted, spurious correlations from historical data, widening already existing health inequalities.
Causal AI is the key to unlocking robust, responsible and trustworthy AI and to transforming challenging tasks such as the early prediction, diagnosis and prevention of disease.
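As a minimal illustration of the spurious-correlation problem described above, the following sketch uses entirely simulated, hypothetical data (the variables and numbers are invented for this example, not drawn from any CHAI work): a confounder such as disease severity can make a treatment with no true effect look strongly "harmful" in a naive comparison, while adjusting for the confounder removes the artefact.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical simulation: disease severity confounds treatment and outcome.
severity = rng.normal(size=n)
treated = (severity + rng.normal(size=n) > 0).astype(float)   # sicker patients are treated more often
outcome = 2.0 * severity + rng.normal(size=n)                 # treatment has NO true effect on outcome

# Naive estimate: regress outcome on treatment alone -> large spurious "effect".
X_naive = np.column_stack([np.ones(n), treated])
naive = np.linalg.lstsq(X_naive, outcome, rcond=None)[0][1]

# Adjusted estimate: also condition on the confounder -> effect near zero.
X_adj = np.column_stack([np.ones(n), treated, severity])
adjusted = np.linalg.lstsq(X_adj, outcome, rcond=None)[0][1]

print(f"naive effect: {naive:+.2f}  adjusted effect: {adjusted:+.2f}")
```

The naive estimate is large and positive purely because severity drives both treatment assignment and outcome; conditioning on the confounder recovers the true (null) effect.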

The Causality in Healthcare AI with Real Data (CHAI) Hub will bring together academia, industry, healthcare, and policy stakeholders to co-create the next generation of world-leading artificial intelligence solutions that can predict the outcomes of interventions and help choose personalised treatments, thus transforming health and healthcare. The CHAI Hub will develop novel methods to identify and account for causal relationships in complex data. The Hub will be built by the community, for the community, amassing experts and stakeholders from across the UK to
1) push the boundaries of AI innovation;
2) develop cutting-edge solutions that drive desperately needed efficiency in resource-constrained healthcare systems; and
3) cement the UK's standing as a next-gen AI superpower.

The complexity of data in heterogeneous and distributed environments such as healthcare exacerbates the risks of bias and vulnerability and introduces additional challenges that must be addressed. Modern clinical investigations need to combine structured and unstructured data sources (e.g. patient health records and medical imaging exams), which current AI cannot integrate effectively. These gaps in current AI technology must be closed in order to develop algorithms that can help us better understand disease mechanisms, predict outcomes and estimate the effects of treatments. This is essential if we want to ensure the safe and responsible use of AI in personalised decision-making.

Causal AI has the potential to unearth novel insights from observational data, formalise treatment effects, assess outcome likelihood, and estimate 'what-if' scenarios. Incorporating causal principles is critical for delivering on the National AI Strategy to ensure that AI is technically and clinically safe, transparent, fair and explainable.
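To make the 'what-if' estimation concrete, here is a small illustrative sketch of backdoor adjustment (the g-formula) over a single binary confounder. All counts are invented for illustration and do not come from any real dataset; the point is only the mechanics of moving from the observed conditional P(Y|T) to the interventional P(Y|do(T)).

```python
import numpy as np

# Hypothetical toy counts indexed [z, t, y]: confounder Z (e.g. comorbidity),
# treatment T, binary outcome Y. Numbers are invented purely for illustration.
counts = np.array([
    [[180, 20], [ 40,  60]],   # Z=0
    [[ 60, 40], [120, 180]],   # Z=1
], dtype=float)

p_z = counts.sum(axis=(1, 2)) / counts.sum()          # P(Z=z)
p_y1_given_tz = counts[..., 1] / counts.sum(axis=2)   # P(Y=1 | Z=z, T=t), shape [z, t]

# Naive (observational) contrast: P(Y=1|T=1) - P(Y=1|T=0).
p_y1_given_t = counts[..., 1].sum(axis=0) / counts.sum(axis=(0, 2))
naive = p_y1_given_t[1] - p_y1_given_t[0]

# Backdoor adjustment: P(Y=1 | do(T=t)) = sum_z P(Y=1 | T=t, Z=z) P(Z=z)
p_do = (p_y1_given_tz * p_z[:, None]).sum(axis=0)
ate = p_do[1] - p_do[0]

print(f"naive contrast: {naive:+.3f}  adjusted ATE: {ate:+.3f}")
```

The naive contrast and the adjusted average treatment effect differ because the confounder Z is unevenly distributed across treatment groups; the adjustment reweights the stratum-specific effects by the population distribution of Z.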

The CHAI Hub will be formed by a founding consortium of powerhouses in AI, healthcare, and data science throughout the UK in a hub-spoke model with geographic reach and diversity. The hub will be based in Edinburgh's Bayes Centre (leveraging world-class expertise in AI, data-driven innovation in health applications, a robust health data ecosystem, entrepreneurship, and translation). Regional spokes will be in Manchester (expertise in both methods and translation of AI through the Institute for Data Science and AI, and Pankhurst Institute), London (hosted at KCL, representing also UCL and Imperial, leveraging London's rapidly growing AI ecosystem) and Exeter (leveraging strengths in philosophy of causal inference and ethics of AI).

The hub will develop a UK-wide multidisciplinary network for causal AI. Through extended collaborations with industry, policymakers and other stakeholders, we will expand the hub to deliver next-gen causal AI where it is needed most. We will work together to co-create, moving beyond co-ideation and co-design to co-implementation and co-evaluation where appropriate, to ensure fit-for-purpose solutions.

Our programme will be flexible; it will embed trusted, responsible innovation and environmental sustainability considerations, ensure that equality, diversity and inclusion principles are reflected in all activities, and ensure that knowledge generated through CHAI continues to have real-world impact beyond the initial 60 months.

Organisations

People

ORCID iD

Sotirios Tsaftaris (Principal Investigator) http://orcid.org/0000-0002-8795-9294
Niccolo Tempini (Co-Investigator) http://orcid.org/0000-0002-5100-5376
Sjoerd Beentjes (Co-Investigator) http://orcid.org/0000-0002-7998-4262
Niels Peek (Co-Investigator) http://orcid.org/0000-0002-6393-9969
Anita Grigoriadis (Co-Investigator)
Bruce Guthrie (Co-Investigator) http://orcid.org/0000-0003-4191-4880
Sohan Seth (Co-Investigator)
Hui Guo (Co-Investigator)
Ava Khamseh (Co-Investigator) http://orcid.org/0000-0001-5203-2205
Yulan He (Co-Investigator)
Eliana Vasquez Osorio (Co-Investigator) http://orcid.org/0000-0003-0741-994X
Henry Gouk (Co-Investigator) http://orcid.org/0000-0002-0924-2933
Stephan Guttinger (Co-Investigator) http://orcid.org/0000-0001-9448-973X
William Whiteley (Co-Investigator) http://orcid.org/0000-0002-4816-8991
Karla Diaz-Ordaz (Co-Investigator) http://orcid.org/0000-0003-3155-1561
Ewen Harrison (Co-Investigator) http://orcid.org/0000-0002-5018-3066
Ginny Russell (Co-Investigator) http://orcid.org/0000-0002-6440-1167
Ian Simpson (Co-Investigator) http://orcid.org/0000-0003-0495-7187
John Kenneth Baillie (Co-Investigator) http://orcid.org/0000-0001-5258-793X
Elliot Crowley (Co-Investigator)
Javier Escudero Rodriguez (Co-Investigator) http://orcid.org/0000-0002-2105-8725
Ben Glocker (Co-Investigator)
Matthew Sperrin (Co-Investigator)
Kianoush Nazarpour (Co-Investigator) http://orcid.org/0000-0003-4217-0254
Yingzhen Li (Co-Investigator) http://orcid.org/0000-0001-6938-2375
Subramanian Ramamoorthy (Co-Investigator)
Hana Chockler (Co-Investigator)
Daniel Alexander (Co-Investigator)
Sabina Leonelli (Co-Investigator)
Ricardo Silva (Co-Investigator)
Neil Chue Hong (Researcher) http://orcid.org/0000-0002-8876-7606

Publications
