PLEAD: Provenance-driven and Legally-grounded Explanations for Automated Decisions
Lead Research Organisation:
University of Southampton
Department Name: Southampton Law School
People
Sophie Stalla-Bourdillon (Principal Investigator)
Publications
Alam MS (2022) Repurposing of existing antibiotics for the treatment of diabetes mellitus, in In Silico Pharmacology
Canal G (2020) Building Trust in Human-Machine Partnerships, in Computer Law & Security Review
Huynh T (2021) Addressing Regulatory Requirements on Explanations for Automated Decisions with Provenance - A Case Study, in Digital Government: Research and Practice
Huynh TD (2022) Explainability-by-Design: A Methodology to Support Explanations in Decision-Making Systems
Tsakalakis N (2022) A taxonomy of explanations to support Explainability-by-Design
Tsakalakis N (2022) Data Protection and Privacy - Enforcing Rights in a Changing World
Tsakalakis N (2021) The dual function of explanations: Why it is useful to compute explanations, in Computer Law & Security Review
Tsakalakis N (2020) Explanations for AI: Computable or Not?
Title | PLEAD Explanation Assistant |
Description | Unhappy when "computer says no"? We're developing a provenance-based approach that provides the public with explanations for automated decisions while taking into account their legal rights. |
Type Of Art | Film/Video/Animation |
Year Produced | 2021 |
Impact | The video was published on Vimeo and was used to advertise the industry showcase webinars |
URL | https://vimeo.com/553273420 |
Description | 1. Explainability-by-Design is a socio-technological methodology characterised by proactive measures that embed explanation-generation capabilities in the design of automated decision-making pipelines, rather than bolting on explanation capabilities after the fact through reactive measures. It is organised in three phases containing concrete steps to guide organisations in identifying, designing, and deploying explanation capabilities for each of their applications to address clearly defined regulatory and business requirements. 2. An explanation taxonomy was developed to systematically characterise explanations derived from explanation requirements. The taxonomy follows a multi-dimensional approach, with each dimension describing a different component of an explanation, in order to produce valid and meaningful explanations. It is policy-agnostic and regulation-agnostic, and can accommodate explanation requirements stemming from compliance, policy, or business needs. 3. The Explainability-by-Design methodology has been applied to two automated decision-making scenarios: credit card applications and school placement. In particular, requirements for explanations were identified and categorised using the above taxonomy in the context of the GDPR and other applicable regulations. The technical design steps of the methodology were then employed to implement those requirements, resulting in an explanation service deployed as an online demonstrator of the methodology (an illustrative provenance-recording sketch follows this record). 4. Two user studies were run with the staff of Experian and Southampton City Council to compare current practice with the PLEAD explanations generated for the two scenarios above, respectively. The studies confirmed a current gap in transparency and in decision monitoring and auditing. Participants deemed the PLEAD explanations a potential answer to this gap while improving the personalisation and user-friendliness of the delivery. |
Exploitation Route | The taxonomy of explanations and its associated vocabulary are available for the community to extend and build systems upon. The methodology serves as guidance for businesses and organisations to enhance their systems with explainability capabilities. |
Sectors | Aerospace, Defence and Marine; Communities and Social Services/Policy; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Financial Services and Management Consultancy; Healthcare; Government, Democracy and Justice; Retail |
URL | https://plead-project.org/ |
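To make the provenance-driven approach above concrete, here is a minimal, hypothetical sketch of how a decision-making pipeline could record decision provenance using the W3C PROV data model via the open-source `prov` Python package. The scenario, namespace, identifiers (ex:application-123, ex:decision-456, etc.) and attributes are illustrative assumptions, not the PLEAD Explanation Assistant's actual implementation or vocabulary.

# A minimal, illustrative sketch (not PLEAD's actual code): recording the provenance
# of a hypothetical automated credit-card decision with the W3C PROV data model,
# using the open-source `prov` Python package (pip install prov).
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace('ex', 'https://example.org/plead-sketch/')  # hypothetical namespace

# The data used by the decision, the automated activity, and the responsible agent.
application = doc.entity('ex:application-123',
                         {'prov:type': 'ex:CreditCardApplication',
                          'ex:declaredIncome': 32000})
scorer = doc.agent('ex:decision-service', {'prov:type': 'prov:SoftwareAgent'})
assessment = doc.activity('ex:automated-assessment-1')

# The decision itself, linked back to the data and process it came from.
decision = doc.entity('ex:decision-456',
                      {'prov:type': 'ex:CreditDecision', 'ex:outcome': 'declined'})
doc.used(assessment, application)
doc.wasAssociatedWith(assessment, scorer)
doc.wasGeneratedBy(decision, assessment)
doc.wasDerivedFrom(decision, application)

# The recorded provenance can then be stored, queried, and turned into explanations.
print(doc.get_provn())   # human-readable PROV-N view of the record
print(doc.serialize())   # PROV-JSON serialisation for storage or exchange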
Description | During a pilot project with Deep Blue C Technology Ltd and the UK Royal Navy's Maritime Warfare Centre, we applied the explanation generation and provenance querying capability developed in PLEAD to war games. We demonstrated to the defence partners how PLEAD explanations and queries could help automate some post-hoc analysis of complex war games, which is still done manually. This led to a follow-on funding application in 2023 to develop a testbed for human-AI teaming building on those capabilities. The proposed work will be done in collaboration with Johns Hopkins University's Applied Physics Laboratory, the Naval Information Warfare Center (NIWC), and the US Naval Postgraduate School. This activity aims to apply the PLEAD explanations and supporting technology developed during the project to enable automated and semi-automated evaluation of team performance, allowing new AI-supported functionality to be quickly tested with human stakeholders in war games. |
First Year Of Impact | 2022 |
Sector | Aerospace, Defence and Marine |
Description | Inputs to the ICO's Guidance on Explaining decisions made with AI |
Geographic Reach | National |
Policy Influence Type | Citation in other policy documents |
URL | https://ico.org.uk/for-organisations/guide-to-data-protection/key-data-protection-themes/explaining-... |
Description | TAS Agile Defence Call |
Amount | £125,627 (GBP) |
Organisation | Defence Science & Technology Laboratory (DSTL) |
Sector | Public |
Country | United Kingdom |
Start | 05/2023 |
End | 05/2024 |
Title | PLEAD Explanation Assistant Demonstrator |
Description | The demonstrator showcases the provenance-based explanations generated for the two scenarios investigated by PLEAD: credit card application and school allocation. A user can play the role of a consumer applying for a credit card, or of a parent applying for a school place for their child. In either case, the user receives a simulated decision at the end of the process, together with legally-driven explanations for that decision generated by the PLEAD Explanation Assistant from the decision's recorded provenance. We also expose the provenance recorded for each decision to show the data that underlies the explanations presented to the users. (A toy sketch of turning recorded provenance into an explanation follows this record.) |
Type Of Technology | Webtool/Application |
Year Produced | 2021 |
Impact | The demonstrator is available online and is cited in the ICO's guidance on "Explaining Decisions Made with AI" (https://ico.org.uk/for-organisations/guide-to-data-protection/key-dp-themes/explaining-decisions-made-with-artificial-intelligence/part-2-explaining-ai-in-practice/task-2-collect/). We have used the demonstrator (and shared it widely) at various events to showcase the kind of explanations that can be produced from provenance by the PLEAD Explanation Assistant. The online demonstrator will remain available for the foreseeable future after the project finishes and can be accessed by anyone over the web. |
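As a companion to the recording sketch earlier, here is a minimal, hypothetical illustration of the other half of the pipeline: walking back through recorded derivations from a decision to the data it was derived from, and rendering that trace as a short textual explanation. The graph, identifiers and wording are invented for illustration and do not reflect the PLEAD Explanation Assistant's actual query logic or explanation templates.

# Toy sketch (illustrative only): turning a recorded provenance trace into a short
# textual explanation by walking derivation edges backwards from a decision.
# The graph below stands in for provenance that a real system would query from storage.

# decision/data item -> items it was derived from (hypothetical identifiers)
derived_from = {
    "ex:decision-456": ["ex:credit-score-789", "ex:application-123"],
    "ex:credit-score-789": ["ex:application-123", "ex:bureau-record-042"],
}

labels = {  # human-readable names for the items above (also hypothetical)
    "ex:decision-456": "the credit card decision",
    "ex:credit-score-789": "an automatically computed credit score",
    "ex:application-123": "the information you provided in your application",
    "ex:bureau-record-042": "your credit bureau record",
}


def explain(item: str, depth: int = 0) -> list[str]:
    """Collect one explanation line per derivation edge, depth-first."""
    lines = []
    for source in derived_from.get(item, []):
        lines.append(f"{'  ' * depth}- {labels[item]} was derived from {labels[source]}.")
        lines.extend(explain(source, depth + 1))
    return lines


if __name__ == "__main__":
    print("Why did I get this decision?")
    print("\n".join(explain("ex:decision-456")))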
Description | A Provenance Tutorial, at ODSC conference |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | This was a provenance tutorial delivered at the Open Data Science Conference in London in November 2019. While the live talk was attended by 50 participants in the room, the video recording was made available to the conference's thousands of participants. |
Year(s) Of Engagement Activity | 2019 |
URL | https://www.youtube.com/watch?v=v8noJFOYFEk |
Description | Explanations for AI: Computable or Not? |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | Organisation of a workshop at the Web Science 2020 conference. About 100 participants attended, with invited speakers arguing for and against computational explanations. The interactive discussion sparked questions about the rights of individuals, good practices for AI explainability, and solutions for better explainability. |
Year(s) Of Engagement Activity | 2020 |
Description | Explanations on the Web: A Provenance-based Approach. Keynote at WEBIST 2020 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | Keynote at the WebIST conference. Due to the pandemic, the event was held virtually. |
Year(s) Of Engagement Activity | 2020 |
URL | https://vimeo.com/478043709 |
Description | Explanations on the Web: a Provenance-based Approach, Expert Seminar Series, University of Warwick |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | The Cyber Security GRP hosts seminars throughout the academic year, open to all University staff. Presenters include high-profile external speakers and internal staff. Each event focuses on a topic of the presenter's choice related to their current research interests, followed by a Q&A. |
Year(s) Of Engagement Activity | 2021 |
URL | https://warwick.ac.uk/research/priorities/cyber-security/seminarseries/ |
Description | Industry showcase webinars on 15/6 and 22/6/2021 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | We ran two public webinars targeting representatives of business, government, local government, communities, and regulators interested in explainable AI and the outputs of PLEAD research. The objectives of the events were to share the findings and future potential of PLEAD with relevant audiences and potential users in industry and government; to build a network of industry contacts with an interest in the outputs of PLEAD; and to gain an understanding of how to translate PLEAD findings into practical solutions for business and policy. After showcasing our approach to supporting explainability in decision-making systems and demonstrating provenance-driven explanations for credit card applications, we had lively discussions with the audience at both events. |
Year(s) Of Engagement Activity | 2021 |
URL | https://vimeo.com/571299673 |
Description | Invited talk on Explainability by Design |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Industry/Business |
Results and Impact | Some 120 people attended the talk on Explainability by Design. During the session, questions pertained to (1) provenance and knowledge representation, (2) the applicability of the methodology to machine learning, and (3) the evaluation of the methodology, the size of the explanations, and the cost associated with designing them. After the session, further interest was expressed by representatives from various sectors: finance, automotive, and law enforcement. There was also interest in potential executive education. |
Year(s) Of Engagement Activity | 2022 |
URL | https://odsc.com/blog/explainability-by-design-a-methodology-to-support-explanations-in-decision-mak... |
Description | King's Festival of Disruptive Thinking: Computer Says No on 3/11/2021 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | This was a public engagement event, part of the Festival of Disruptive Thinking organised by King's College London, designed to facilitate conversations between researchers and the public about automated decision making. The participants were presented with two case studies in which automated decisions took place: school placement and financial services. They were also shown a demo of explanations facilitated by the PLEAD research. The audience was invited to vote on a series of questions and to express their ideas and concerns about automated decision making. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.kcl.ac.uk/events/computer-says-no |
Description | Meeting on 9/12/2021 with Naval Information Warfare Center Pacific (NIWC), San Diego, US |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | We presented the provenance-driven explanation capability developed in PLEAD to senior researchers from the US Navy at NIWC, with the aim of identifying potential exploitation of the technology in future joint work. The ensuing discussion identified potential uses of provenance in (1) virtual war gaming exercises and (2) creating "model card" narratives for machine learning models. NIWC then introduced us to the UK Royal Navy's Maritime Warfare Centre (MWC) at HMS Collingwood, Gosport, to explore a collaboration with them with respect to (1). This led to a funding award by King's College London (£12,464) for the team to work with MWC and its contractor, Deep Blue C Ltd, to instrument provenance in their wargaming platform and to demonstrate narratives generated from war gaming exercises. |
Year(s) Of Engagement Activity | 2021 |
Description | PLEAD project presentation at the Datum Future online event organised by Experian |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | Presentation delivered by Niko Tsakalakis and Luc Moreau at the Datum Future event organised by Experian. Experian is a PLEAD project collaborator; participants were technologists and data scientists from the financial and IT industries. |
Year(s) Of Engagement Activity | 2020 |
Description | Provenance-based Explanations for Automated Decisions: A presentation to the Regulators for AI, at the Competition and Markets Authority |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Policymakers/politicians |
Results and Impact | Following our work on provenance-based explanations, carried out in collaboration with the ICO, we were invited to present the results to the regulators for AI. In the talk, we introduced the PLEAD project. |
Year(s) Of Engagement Activity | 2019 |
Description | Provenance-based Explanations for Automated Decisions: Provenance Seminar at University of Rioja |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | A talk delivered as part of a series of talks on provenance at the University of Rioja, in the context of the public PhD defence of Carlos Sáenz-Adán. |
Year(s) Of Engagement Activity | 2019 |
Description | Provenance: a fundamental data governance tool. A case study for data science pipelines and their explanations, a keynote |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | Keynote delivered at the Open Data Science Conference, a major event for the data science community. |
Year(s) Of Engagement Activity | 2020 |
URL | https://odsc.com/ |
Description | SDTaP (PETRAS, Innovate UK and EPSRC) Networking Lunch and Global State of IoT Governance Workshop - Tuesday 11 February 2020 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | Lightning talk delivered by Luc Moreau |
Year(s) Of Engagement Activity | 2020 |
Description | Southampton Science and Engineering Festival: Computer says no |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | This was a citizens' jury event, part of the Southampton Science and Engineering Festival 2021, designed to facilitate conversations between researchers and the public about automated decision making. Twelve members of the public attended. They were presented with three case studies in which automated decisions took place: employment, school placement, and financial services. The audience was invited to vote on a series of questions and to express their ideas and concerns about automated decision making. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.eventbrite.co.uk/e/computer-says-no-are-you-happy-with-that-verdict-registration-1390660... |
Description | The PROV-JSONLD Serialization, at presentation at the IPAW workshop. |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | This was the presentation of a paper accepted at the IPAW workshop. The event was held virtually due to the pandemic. |
Year(s) Of Engagement Activity | 2020 |
URL | https://www.youtube.com/watch?v=7oERpRFW9u4 |
Description | The dual function of explanations: why it is useful to compute explanations? |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | Presentation of our paper "The dual function of explanations: why it is useful to compute explanations" at CPDP 2021. The presentation was attended by more than 100 people and sparked discussions on the current state of AI decision explainability and data subject rights, the solutions presented by PLEAD for better explainability of AI decisions, and the tools for GDPR compliance. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.cpdpconferences.org/schedule |
Description | To automate or not to automate compliance? |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | Invited talk during the Second Meeting of the NL4XAI Network Training Program "Law and Ethics in AI and NLP". Legal automation is usually viewed with suspicion by lawyers and others, because substituting the law enforcement process with a merely technological or self-executing enforcement process is inherently problematic, as opacity inevitably creeps in. In this talk, Sophie explored the challenges and opportunities related to compliance automation and discussed the potential of computable explanations when decision-making is partially or fully automated. |
Year(s) Of Engagement Activity | 2020 |
URL | https://nl4xai.eu/ |
Description | Workshop in CPDP LatAm 2021 'Please Explain: the role of explainability and the future of automated compliance' |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | The workshop 'Please Explain: the role of explainability and the future of automated compliance' brought together an interdisciplinary mix of experts from artificial intelligence, computer science and law to explore present challenges for AI decision-making in Latin America and beyond. In the open discussion, the 100+ attendees had the opportunity to explore synergies between the Latin American and European regimes for regulating AI and automated decision-making, and to learn about emerging technologies for the explainability of AI decisions. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.youtube.com/watch?v=Ew2E01r1Nr0 |
Description | explAIn Technical Workshop: Exploring the links between Explainable AI, Causality and Persuasion |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | A show-and-tell workshop in which participants presented their approaches to explanations, followed by questions and discussion of each approach. Dong Huynh presented the PLEAD approach to explanations for automated decisions. The session led to an invitation to present more detailed work at the XAI Seminar series at Imperial College (http://xaiseminars.doc.ic.ac.uk). Given the wide variety of experts involved, we also discussed organising a Dagstuhl workshop on the topic. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.doc.ic.ac.uk/~afr114/explainAI21/index.html |