Not just 'how', but 'why' - Exploring the effects of explainability on software learnability

Lead Research Organisation: University College London
Department Name: UCL Interaction Centre

Abstract

Tutorials are tools that help users learn how to use software. A number of different approaches can be adopted in tutorial design, often in combination: step-by-step instructions, demonstrations, gamification, and exploratory learning. However, current approaches to tutorial design tend to focus on memorisation strategies, which may be insufficient to support user learning for complex software.
Helping users to develop accurate mental models may address this issue. Research from the domain of Explainable Artificial Intelligence (XAI) may provide relevant directions for investigation.
This proposal suggests exploring how explainability could affect software learnability. This will be done via a digital variation of the question-asking protocol, comparing the effects of four different tutorials (control, baseline tutorial, explainability features only, and explainable tutorial) on user learning and performance.
Methods used will be mostly quantitative, since the primary aim of this project is to investigate the effects of explainability on software learnability, using measures such as time spent exploring other resources and the number of queries asked. However, some qualitative methods will be used when testing the final variable, 'software learnability', because it will be important to assess the types of queries asked, providing greater insight into the kinds of information users believe they are missing.
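As an illustration of the quantitative comparison described above, the sketch below groups per-participant sessions by the four tutorial conditions and summarises the two named measures (exploration time and query count). The `Session` structure, function names, and all data values are hypothetical, introduced here only to make the design concrete; they are not part of the proposal.

```python
from dataclasses import dataclass
from statistics import mean

# The four tutorial conditions named in the proposal.
CONDITIONS = ["control", "baseline", "explainability_only", "explainable_tutorial"]

@dataclass
class Session:
    """One participant session; fields mirror the proposal's measures
    (time spent exploring other resources, number of queries asked)."""
    condition: str
    exploration_seconds: float
    query_count: int

def summarise(sessions):
    """Group sessions by condition and report the mean of each measure."""
    out = {}
    for cond in CONDITIONS:
        subset = [s for s in sessions if s.condition == cond]
        if subset:
            out[cond] = {
                "mean_exploration_s": mean(s.exploration_seconds for s in subset),
                "mean_queries": mean(s.query_count for s in subset),
            }
    return out

# Hypothetical data for illustration only.
sessions = [
    Session("control", 120.0, 7),
    Session("control", 90.0, 5),
    Session("explainable_tutorial", 45.0, 2),
]
print(summarise(sessions))
```

In the actual study, per-condition means like these would feed into inferential tests across the four groups; the qualitative coding of query types would sit alongside these counts.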

Publications


Studentship Projects

Project Reference | Relationship  | Related To   | Start      | End        | Student Name
EP/W522077/1      |               |              | 01/10/2021 | 31/03/2027 |
2715760           | Studentship   | EP/W522077/1 | 01/10/2022 | 30/09/2026 | Qing Xia