Causal and Counterfactual Reasoning for Fairness

Lead Research Organisation: University of Oxford
Department Name: Statistics

Abstract

The question of whether an institution makes decisions for fair reasons increasingly needs to be tackled in the world today. Are people hired on the basis of their talents rather than their gender? Are university admissions based on applicants' academic ability, or on their schooling and socio-economic background? Does the judicial system punish people more harshly solely on the basis of their race? Questions like these have been asked for many years, and now that algorithms make recommendations for such decisions it is even more important to have sound statistical methods for detecting bias that do not rely on being able to ask the decision-maker for the reasons behind particular decisions. The initial aim of the project will be to develop such tests using causal and counterfactual reasoning, with a view to researching how we can construct algorithms that make fair decisions.

The field of causal inference can provide us with principled tools to think about questions of fairness and will form a key foundation for the project. Causal inference allows us to mathematically formalise and, in some cases, answer counterfactual questions such as: would someone's chances of getting a job interview change if their gender were different? Questions such as this form the basis for some simple, non-statistical tests of fairness. For example, a classic test for bias in the job market is to send out multiple applications with two CVs that are identical apart from the name, one name suggesting the CV belongs to a man and the other to a woman. If vastly different numbers of interview requests are received for the two CVs, we can be confident that this is due to some kind of bias in the hiring procedure. These studies are trying to get at exactly the counterfactual question we have just posed: would someone's chances of getting a job interview change if their gender were different? This example demonstrates how counterfactual questions capture an intuitive definition of fairness.

Now suppose that for the above example we were to run the experiment and collect data on the number of responses for each CV; we would need some way to quantify whether the numbers of responses are "vastly different". To do this, we would use a statistical hypothesis test, which asks whether there is evidence that the difference is notable or whether it could simply be due to chance. The question we are trying to answer here is whether there is evidence that the distribution of responses differs between the two CVs. In this project the aim will be to use kernel methods to produce tests of whether the data show evidence that the distribution differs under the counterfactual questions we pose using the causal framework. We use kernel methods because they allow us to test this question in more complex settings without making any assumptions about the underlying distributions.
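To make the idea concrete, below is a minimal sketch of one standard kernel two-sample test: the Maximum Mean Discrepancy (MMD) with a permutation-based p-value. The RBF kernel, the fixed bandwidth, and the Gaussian toy data are illustrative assumptions, not the project's actual methodology.

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between sample arrays x and y."""
    sq_dists = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2 * bandwidth ** 2))

def mmd_statistic(x, y, bandwidth=1.0):
    """Biased estimate of the squared Maximum Mean Discrepancy."""
    kxx = rbf_kernel(x, x, bandwidth).mean()
    kyy = rbf_kernel(y, y, bandwidth).mean()
    kxy = rbf_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2 * kxy

def mmd_permutation_test(x, y, n_permutations=500, bandwidth=1.0, seed=0):
    """Permutation p-value for H0: x and y come from the same distribution."""
    rng = np.random.default_rng(seed)
    observed = mmd_statistic(x, y, bandwidth)
    pooled = np.vstack([x, y])
    n, count = len(x), 0
    for _ in range(n_permutations):
        perm = rng.permutation(len(pooled))
        # Under H0 the group labels are exchangeable, so relabelling
        # the pooled sample simulates the null distribution of the MMD.
        if mmd_statistic(pooled[perm[:n]], pooled[perm[n:]], bandwidth) >= observed:
            count += 1
    return (count + 1) / (n_permutations + 1)

# Toy check: identical distributions vs. a mean-shifted alternative.
rng = np.random.default_rng(1)
same = mmd_permutation_test(rng.normal(0, 1, (100, 1)), rng.normal(0, 1, (100, 1)))
diff = mmd_permutation_test(rng.normal(0, 1, (100, 1)), rng.normal(1, 1, (100, 1)))
```

The appeal of the MMD here is exactly the point made above: it compares whole distributions through a kernel, so no parametric assumptions about the response distributions are needed.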

The majority of this research will be the theoretical development of these tests. However, it will also be important to validate them computationally to ensure they work correctly. This could be done on real-world data sets, or by simulating data that emulates bias and checking whether the tests detect it. Following the experiments, it may be beneficial to release the code base so that these tests can be used in the wider world to check for unfair decision making and to promote fairer decision making generally.
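The simulation-based check described above can be sketched as follows. The callback rates (9% vs 6%), sample sizes, and the use of a simple permutation test on the difference in callback rates are all invented for illustration; the project's actual tests would be the kernel-based ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation: identical CVs whose callback rates differ only
# by the name's implied gender (9% vs 6%), emulating bias in hiring.
n = 5000
callbacks_a = rng.binomial(1, 0.09, n)  # CVs with name A
callbacks_b = rng.binomial(1, 0.06, n)  # CVs with name B

observed = callbacks_a.mean() - callbacks_b.mean()

# Permutation test: under H0 (no bias) the name labels are exchangeable,
# so shuffling them simulates the null distribution of the rate difference.
pooled = np.concatenate([callbacks_a, callbacks_b])
n_perm, count = 2000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    if abs(perm[:n].mean() - perm[n:].mean()) >= abs(observed):
        count += 1
p_value = (count + 1) / (n_perm + 1)
```

A small p-value indicates the test has picked up the bias that was deliberately built into the simulation, which is precisely the kind of sanity check the computational experiments would perform.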

This project will fall under the EPSRC Research Area of 'Statistics and Applied Probability'.

Publications


Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
EP/N509711/1                                   01/10/2016  30/09/2021
2444561            Studentship   EP/N509711/1  01/10/2020  31/03/2025  Jake Fawkes
EP/R513295/1                                   01/10/2018  30/09/2023
2444561            Studentship   EP/R513295/1  01/10/2020  31/03/2025  Jake Fawkes
EP/V520202/1                                   01/10/2020  31/10/2025
2444561            Studentship   EP/V520202/1  01/10/2020  31/03/2025  Jake Fawkes