iSee: Intelligent Sharing of Explanation Experience by Users for Users

Lead Research Organisation: Robert Gordon University
Department Name: School of Comp Sci & Digital Media

Abstract

The iSee Project will show how users of Artificial Intelligence (AI) can capture, share and re-use their experiences of AI explanations with other users who have similar explanation needs.
To clarify, we use the phrase 'explanation strategy' to refer collectively to the algorithms and visualisation methods used to explain the predictions of models built by Machine Learning (ML). Such strategies can be foundational, of the kind found in the research literature. However, user needs are often multi-faceted, and real-world applications and different users can require composite strategies formed by combining the building blocks provided by one or more foundational strategies.
We hypothesise that an end-user's explanation experience (like much other problem-solving experience) must contain the implicit knowledge that was required to meet their explanation need, such as the preferred strategy (foundational or composite) and, for composites, the manner of combination. We will provide the platform needed to capture such experiences by enabling users to interact with, experiment with, and evaluate explanations. Once captured, experiences can be reused, on the premise that similar user needs can be met with similar explanation strategies. Reused experiences reinforce strategies that work in given circumstances, while others expose cases where a suitable strategy has yet to be discovered.
Our proposal describes in detail how we will develop an ontology for describing a library of explanation strategies; develop measures to evaluate their applicability and suitability; and design a representation to capture experiences of using explanation strategies. We explain how the case-based reasoning (CBR) paradigm can be used to discover composites and thereafter reuse them through algorithms that implement the main steps of a CBR cycle (retrieve, re-use, revise and retain); and why CBR is well placed to promote best practice in explainable AI. We include a number of high-impact use cases, where we work with real-world users to co-design the representations and algorithms described above and to evaluate and validate our approach. Our proposal also gives one possible route by which companies could certify compliance with explainable AI regulations and guidelines.
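The retrieve step of the CBR cycle described above can be illustrated with a minimal sketch. The case structure and the Jaccard similarity measure below are illustrative assumptions, not the project's actual representation or retrieval algorithm:

```python
# Minimal sketch of the CBR "retrieve" step for explanation experiences.
# Case attributes and the similarity measure are hypothetical stand-ins.

def similarity(need_a, need_b):
    """Jaccard similarity between two sets of explanation-need attributes."""
    a, b = set(need_a), set(need_b)
    return len(a & b) / len(a | b)

def retrieve(case_base, query_need):
    """Return the stored case whose explanation need is most similar."""
    return max(case_base, key=lambda case: similarity(case["need"], query_need))

case_base = [
    {"need": {"clinician", "image-model", "why"}, "strategy": "saliency-map"},
    {"need": {"auditor", "tabular-model", "why-not"}, "strategy": "counterfactual"},
]

best = retrieve(case_base, {"clinician", "image-model", "why-not"})
print(best["strategy"])  # the most similar stored experience is reused
```

On the premise stated above, the retrieved strategy would then be reused for the new need, revised if it proves unsuitable, and retained as a new experience.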

Publications

 
Description A comprehensive ontology, iSeeOnto, has been created to capture data relevant to describe an explainable AI experience.
Published/view friendly version: https://w3id.org/iSeeOnto/explanationexperience
Public GitHub repo: https://github.com/isee4xai/iSeeOnto
Exploitation Route Several use cases are being planned with stakeholders applying AI in radiology and in telecom cyber-security
Sectors Digital/Communication/Information Technologies (including Software), Healthcare

URL https://isee4xai.com/category/news/
 
Description Current discussions with Total Energy are exploring how disagreement between alternative attribution explainers can be consolidated to avoid loss of trust in the XAI system. An alignment strategy has been developed. This is currently being evaluated with a view to integration with Total's anomaly detection system.
First Year Of Impact 2022
Sector Energy
Impact Types Policy & public services

 
Title DisCERN 
Description DisCERN is a case-based counterfactual explainer. Counterfactuals are formed by replacing feature values in the query with those of its nearest unlike neighbour (NUN) until the predicted class changes. We show how widely adopted feature-relevance explainers (e.g. LIME, SHAP) can inform DisCERN to identify the minimum subset of "actionable features".
Type Of Material Computer model/algorithm 
Year Produced 2021 
Provided To Others? Yes  
Impact DisCERN is a novel approach to counterfactual discovery. It is currently the only XAI counterfactual approach to correct code vulnerabilities. 
URL https://ieeexplore.ieee.org/abstract/document/9643154
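The NUN-substitution idea behind DisCERN can be sketched as follows. The toy classifier, data and relevance weights are stand-ins for illustration only; the published algorithm uses LIME/SHAP relevance scores from a trained model:

```python
# Simplified sketch of NUN-based counterfactual discovery (DisCERN-style).
# Classifier, instances and relevance weights are hypothetical toy values.

def predict(x):
    """Toy classifier: class 1 if the feature sum exceeds a threshold."""
    return int(sum(x) > 10)

def counterfactual(query, nun, relevance):
    """Copy NUN feature values into the query, most relevant feature first,
    stopping as soon as the predicted class flips."""
    cf = list(query)
    for i in sorted(range(len(query)), key=lambda i: -relevance[i]):
        cf[i] = nun[i]
        if predict(cf) != predict(query):
            return cf
    return cf

query = [2, 3, 1]   # predicted class 0
nun   = [6, 5, 4]   # nearest unlike neighbour, predicted class 1
cf = counterfactual(query, nun, relevance=[0.7, 0.2, 0.1])
print(cf)  # only the two most relevant features needed changing
```

Stopping at the first class flip is what keeps the set of changed ("actionable") features minimal in this sketch.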
 
Title iSee Ontology 
Description The iSeeOnto ontologies are being developed as part of the iSee project.
Type Of Material Computer model/algorithm 
Year Produced 2021 
Provided To Others? Yes  
Impact Formalisation of XAI methods and explanation experiences to promote reuse.
URL https://github.com/isee4xai/iSeeOnto
 
Title iSee API 
Description The iSee API is the backbone of the iSee Platform: it manages the core integrations with other iSee services, implements the logic behind the iSee Cockpit, and maintains the database layer.
Type Of Technology Software 
Year Produced 2023 
Open Source License? Yes  
Impact This effort facilitates the reusability of the iSee API and lays the groundwork for potential extensions in the future. 
URL https://github.com/isee4xai/iSeeAPI
 
Title iSee Cockpit 
Description The iSee Cockpit is a web-based dashboard through which both design users and end users interact with the iSee Platform. It is in development as an integral component of the larger iSee project.
Type Of Technology Software 
Year Produced 2023 
Open Source License? Yes  
Impact Enables reusability of the web-based dashboard for managing XAI methods and explanation experiences 
URL https://github.com/isee4xai/iSeeCockpit
 
Title iSee Dialog Manager 
Description The iSee Dialogue Manager is the back-end that implements the interactive test environment of the iSee platform. It is a Behaviour Tree engine: each dialogue model is represented and executed as a Behaviour Tree.
Type Of Technology Software 
Year Produced 2023 
Open Source License? Yes  
Impact The community can reuse navigation related to Behaviour Trees in dialogues. 
URL https://github.com/isee4xai/iSeeDialogManager
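A minimal sketch of how a dialogue modelled as a Behaviour Tree might be executed. The node types, names and dialogue steps below are illustrative; the actual iSee Dialogue Manager implements a richer Behaviour Tree engine:

```python
# Minimal Behaviour Tree sketch for executing a dialogue model.
# Node and step names are hypothetical illustrations.

SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Action:
    """Leaf node: runs a callable against the dialogue context."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
    def tick(self, ctx):
        return SUCCESS if self.fn(ctx) else FAILURE

class Sequence:
    """Composite node: ticks children in order, failing on the first failure."""
    def __init__(self, children):
        self.children = children
    def tick(self, ctx):
        for child in self.children:
            if child.tick(ctx) == FAILURE:
                return FAILURE
        return SUCCESS

dialogue = Sequence([
    Action("greet", lambda ctx: ctx.setdefault("greeted", True)),
    Action("ask_need", lambda ctx: ctx.setdefault("need", "why") is not None),
    Action("show_explanation", lambda ctx: ctx.get("need") is not None),
])

ctx = {}
result = dialogue.tick(ctx)
print(result)  # the sequence completes once every step succeeds
```

The Sequence/Action split is what makes such dialogue fragments reusable: a sub-tree can be grafted into another dialogue without changing its internals.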
 
Title iSee Onto API 
Description The iSee Onto API serves as a middleware connecting the iSee Ontology and iSee API. It utilises Apache Jena Fuseki server and SPARQL to retrieve information from the iSee Ontology. 
Type Of Technology Software 
Year Produced 2023 
Open Source License? Yes  
Impact The solution architecture and SPARQL queries developed in this work are reusable in future projects. The work demonstrates a user-friendly ontology query API that improves the readability and ease of use of iSeeOnto.
URL https://github.com/isee4xai/iSeeOntoAPI
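The kind of SPARQL retrieval the middleware performs can be sketched as follows. The class IRI in the query is a hypothetical placeholder, not the actual iSeeOnto vocabulary, and the endpoint URL is an example:

```python
# Sketch of composing a SPARQL query of the kind the iSee Onto API might
# send to a Fuseki endpoint. The class IRI below is a hypothetical
# placeholder, not a term from the real iSeeOnto vocabulary.

def build_strategy_query(graph="https://w3id.org/iSeeOnto/explanationexperience"):
    return f"""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?strategy ?label
    FROM <{graph}>
    WHERE {{
        ?strategy a <urn:example:ExplanationStrategy> ;  # hypothetical class
                  rdfs:label ?label .
    }}
    """

q = build_strategy_query()

# A Fuseki server would accept this over HTTP, e.g. (example endpoint):
# urllib.request.urlopen("http://localhost:3030/isee/query",
#     data=urllib.parse.urlencode({"query": q}).encode())
```

Keeping query construction in one place like this is what lets the middleware present a readable API while hiding SPARQL from its callers.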
 
Description Invited Talk - Reliability in AI workshop at 25th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Invited talk on the role of case-based reasoning for explainable AI, given at the Reliability in AI workshop at the 25th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems.

Talk abstract:
Role of case-based reasoning for explainable AI
A right to obtain an explanation of the decision reached by a machine learning model is now an EU regulation. Different stakeholders may have different background knowledge, competencies and goals, thus requiring different kinds of interpretations and explanations. In this talk I will present an overview of explainable AI (XAI) methods with particular focus on the role of case-based reasoning (CBR) for XAI. Specifically, we will look at recent work in post-hoc exemplar-based explanations that use CBR for factual, near-factual and counterfactual explanations.
An alternative role of CBR involves reasoning with end-users' explanation experiences to enable the sharing and reusing of experiences by users for users.
Here I will present our initial work towards creating the iSee XAI experience reuse platform (https://isee4xai.com/) where our aim is to capture and reuse explanation experiences.
Year(s) Of Engagement Activity 2022
 
Description Invited Talk at NTNU, Trondheim 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented a seminar talk titled "iSee Project - Building the AI you trust"
This was part of the AI Seminar hosted by the Norwegian Open AI Lab and NorwAI - Norwegian Research Center for AI Innovation
Year(s) Of Engagement Activity 2022
URL https://www.ntnu.edu/kalender/detaljer/-/event/c4d5fddd-8314-317c-be96-71d482e35038
 
Description Organise the XCBR Workshop and Challenge 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Organised the XCBR Workshop that presented peer-reviewed preliminary work in XAI methods related to CBR.
XCBR Challenge invited research teams to contribute XAI methods to the iSee Explainer Library.
Year(s) Of Engagement Activity 2022
URL https://isee4xai.com/xcbr_challenge_2022/
 
Description Present CloodCBR at ICCBR 2022 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented a paper titled "Adapting Semantic Similarity Methods for Case-Based Reasoning in the Cloud"
Year(s) Of Engagement Activity 2022
URL https://iccbr2022.loria.fr/schedule/
 
Description Present DisCERN Algorithm at ICCBR 2023 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented the DisCERN algorithm for counterfactual discovery with an analysis of trade-off between sparsity and proximity
Year(s) Of Engagement Activity 2022
URL https://iccbr2022.loria.fr/schedule/
 
Description Present iSee Cockpit Demo at SGAI - XAI Workshop 2022 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented a demonstration of the iSee Cockpit tools for the industry and XAI practitioners.
Year(s) Of Engagement Activity 2022
URL http://www.bcs-sgai.org/ai2022/?section=workshops
 
Description Present iSeeOnto at SGAI - XAI Workshop 2022 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented a talk titled "Modelling explanation strategies and experiences with iSeeOnto".
Year(s) Of Engagement Activity 2022
URL http://www.bcs-sgai.org/ai2022/?section=workshops
 
Description SICSA XAI Workshop 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Postgraduate students
Results and Impact The SICSA XAI workshop accepted papers from universities in the SICSA organisation. The workshop included invited talks and paper presentations, creating a forum to share research on methods for explaining AI and ML systems. It fostered connections among SICSA researchers interested in Explainable AI by highlighting and documenting promising approaches and encouraging further work. https://sites.google.com/view/sicsa-xai-workshop/call-for-papers
Year(s) Of Engagement Activity 2021
URL https://www.sicsa.ac.uk/events/ai-research-theme-sicsa-workshop-on-explainable-artificial-intelligen...
 
Description Talk at the IEEE ICTAI Conference 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented our DisCERN algorithm for counterfactual discovery.
Year(s) Of Engagement Activity 2021
URL https://ictai.computer.org/
 
Description Talk at the SGAI Workshop on AI and Cybersecurity 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Presented our work on counterfactual discovery for code vulnerability detection and correction.
Year(s) Of Engagement Activity 2021
URL https://sites.google.com/view/ai-cybersec-2021/programme?authuser=0
 
Description Talk at the SICSA XAI workshop 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact Presented our work on discovering counterfactuals to explain the grade received by a student for a module, based on Campus Moodle interactions
Year(s) Of Engagement Activity 2021
URL http://ceur-ws.org/Vol-2894/
 
Description Workshop at the TFNetworkAutmn21 Conference 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact The iSee Workshop at the Tommy Flowers Network Conference held two half-day sessions.
The topics discussed were "What do users want when explaining an AI system?" and "An Introduction to the iSee Project"
There were two co-creation, co-design activities
1. Co-design the iSee cockpit for different types of users and use cases
2. Understand the requirements for evaluating if an explanation experience is successful.
Participants consisted of BT employees from different departments and academic researchers.
Outcomes of the co-creation, co-design activities were consolidated and included in the next iteration of iSee cockpit and evaluation strategy development.
Year(s) Of Engagement Activity 2021
URL http://tommyflowersnetwork.blogspot.com/2021/08/isee-workshop-teams-up-with.html