A Platform for Responsive Conversational Agents to Enhance Engagement and Disclosure (PRoCEED)

Lead Research Organisation: University of East Anglia
Department Name: Computing Sciences

Abstract

The way in which individuals interact with technology is rapidly evolving, as users increasingly expect fast, reliable and accurate information. In order to deliver systems capable of meeting these expectations, both businesses and government departments are turning to conversational agents (or chatbots). These conversational agents are capable of interacting and engaging with users, answering their queries and even providing advice and guidance as required. This research considers how this technology can be optimised to provide a more effective method of communication, while also focusing on the implicit trust that a user places in a conversational agent.

As part of this research we will investigate the nature of sensitive information and how the context of the information can play a role in its perceived sensitivity. This will be achieved using a range of experiments to better understand the public's perceptions of personal information, and how those perceptions relate to the classification of the information.

In order to fully understand the use of conversational agents, it is essential to understand both the nature of personal, sensitive information and the perceived trustworthiness of the agents themselves. We will examine how different facets of a conversational agent's humanity, personality and appearance can be used to affect an individual's perceptions of, and trust in, that agent.

We will focus on the use of conversational agents across three key sectors: healthcare, defence and security, and technology. These three areas have been selected because they are significant users of conversational agents, all deal with potentially sensitive and personal information, and all are areas of significant public spending. Our research will establish how these interactions between humans and computers can be optimised to deliver a bespoke conversational agent tailored to the expectations and needs of the individual. This in turn will increase trust and confidence in these digital services.

Planned Impact

The proposed research will generate outputs that bring together trust, identity, privacy and security with human-computer interaction. Ultimately, this research aims to deliver a new way of measuring how individuals interact with conversational agents, which in turn will provide greater insight into, and understanding of, their operation. This work has the potential to drive change across a range of user communities by providing a different way of thinking about interaction with conversational agents. It will result in a number of key changes:
1. Provide a deeper understanding of the nature of sensitive information and the effect that context has on its sensitivity.
2. Bring together social scientists and computer scientists to consider this as a broader multidisciplinary research problem, and increase understanding of how social science methods can drive forward the usability and effectiveness of current technologies.
3. Define the situational nature of information disclosure to illustrate how the context of a discussion can inform the sensitivity of the data.

This project will aim to deliver impact to four main beneficiary groups: end-users, owners of conversational agents, designers and developers of conversational agents, and academia.

End-users: This is a significant societal impact, as there is an increasing reliance on technology and a push towards digital services. This can often be alienating or difficult for a wide section of the population, and as such this work will focus on increasing trust, usability and acceptance of conversational agents. This will be achieved by developing a platform capable of dynamically identifying the conversational agent that is best suited to a specific user. This will, in effect, deliver a bespoke experience for each user, ensuring that they feel secure in giving honest and complete disclosures. For example, in our work with healthcare it is clear that honesty is imperative. Prof. Helen Dawes will provide excellent access to this community to deliver vital impact in the field.

Owners of conversational agents: The shift towards conversational agents allows organisations (commercial or government) to engage with more users than ever before. Our research will benefit a range of stakeholders in this group by defining methods to increase trust in and acceptance of these new technologies, as well as improving the quality and depth of interactions between users and conversational agents. For example, we will work with our project partners such as Crest (see letter of support) and Prison Voicemail (see letter of support) to engage with a range of stakeholders.

Designers and developers of conversational agents: This work is intended to help drive forward the current generation of conversational agents. At present, the accepted way of deploying these agents is a 'one size fits all' approach. Our work will deliver significant impact to this community by creating a platform capable of adapting the personality of the conversational agent to maximise engagement and disclosure. This impact will be achieved using the team members' links to the technology community and through our project partner in this field, Velmai Ltd (see letter of support). We aim to enable organisations across a range of sectors in the digital economy to better engage with their clients and customers, as well as substantially improving the user experience with these agents.
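As an illustrative aside only (not the project's published design), the kind of persona adaptation described above could be sketched as a simple rule-based selector. The profile fields, persona names and scoring weights in the Python sketch below are hypothetical assumptions made for illustration.

# Illustrative sketch only: a hypothetical rule-based selector that picks a
# chatbot persona from a user profile. Field names, personas and weights are
# invented and do not reflect the project's actual platform.
from dataclasses import dataclass

@dataclass
class UserProfile:
    age: int
    prefers_formal_tone: bool
    topic_sensitivity: float  # 0.0 (neutral) to 1.0 (highly sensitive)

PERSONAS = {
    "friendly_informal": {"formality": 0.2, "empathy": 0.8},
    "professional_clinical": {"formality": 0.9, "empathy": 0.5},
    "neutral_balanced": {"formality": 0.5, "empathy": 0.5},
}

def select_persona(profile: UserProfile) -> str:
    """Score each persona against the profile and return the best match."""
    target_formality = 0.8 if profile.prefers_formal_tone else 0.3
    # Assumption for illustration: more sensitive topics call for more empathy.
    target_empathy = 0.5 + 0.5 * profile.topic_sensitivity

    def score(traits):
        # Higher score = closer match to the target traits.
        return -(abs(traits["formality"] - target_formality)
                 + abs(traits["empathy"] - target_empathy))

    return max(PERSONAS, key=lambda name: score(PERSONAS[name]))

print(select_persona(UserProfile(age=34, prefers_formal_tone=False,
                                 topic_sensitivity=0.9)))
# -> "friendly_informal"

In practice such a mapping would be derived from experimental data rather than hard-coded weights; the sketch only illustrates the matching step between a user profile and a persona.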

Academia: Our inter-university consortium will support the training of highly skilled researchers and facilitate the generation and transfer of new methods of research synthesis. We will engage with our project partner Crest, which has strong links throughout the academic community, to ensure impact is delivered to a wide range of potential users.
 
Description A user's demographic information will impact the type of chatbot that they are comfortable engaging with, and this can be used to maximise honesty and openness
Exploitation Route This platform provides the basis for further investigation, and will enable the consideration of the wider impacts and issues surrounding data sharing with conversational agents
Sectors Digital/Communication/Information Technologies (including Software), Healthcare

 
Description The findings have been used to develop two board games that explore the ideas of privacy and trust in chatbots, alongside the wider issues of information sharing. These were demonstrated at the Norwich Science Festival in February 2023. The aim of these activities is to increase understanding of the impacts of these issues.
First Year Of Impact 2022
Sector Education
Impact Types Societal

 
Title Wizard of Oz experimental framework 
Description The software enables the user to create and run Wizard-of-Oz-style experiments to validate the responses to chatbots. 
Type Of Technology Webtool/Application 
Year Produced 2021 
Impact The software is currently being curated for release as open-source software. 
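A minimal sketch of the Wizard-of-Oz pattern that such a tool supports is given below, assuming a single console stands in for the webtool's separate participant and wizard views; the function name, message format and log file are illustrative assumptions, not the released software.

# Minimal Wizard-of-Oz loop sketch (illustrative only, not the project's code):
# the participant believes they are chatting with a bot, while a hidden human
# "wizard" types each reply. A transcript is kept for later analysis.
import json
import time

def run_wizard_of_oz_session(log_path="session_log.json"):
    transcript = []
    print("Participant view: you are now chatting with our assistant.")
    while True:
        user_msg = input("Participant> ")
        if user_msg.strip().lower() == "quit":
            break
        # In a deployed webtool the wizard would respond from a separate
        # interface; here the same console stands in for that second screen.
        wizard_msg = input("Wizard (hidden)> ")
        transcript.append({
            "time": time.time(),
            "participant": user_msg,
            "wizard": wizard_msg,
        })
        print(f"Assistant: {wizard_msg}")
    with open(log_path, "w") as f:
        json.dump(transcript, f, indent=2)

if __name__ == "__main__":
    run_wizard_of_oz_session()

Running the script alternates participant and wizard turns at the console and writes the transcript to session_log.json for later analysis.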
 
Description Board games expo at Norwich Science Festival 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Schools
Results and Impact 150-200 people attended a workshop demonstrating board games in science. Two of these games were developed as part of this project. The games were designed to start conversations about the importance of privacy and trust, and the scenarios in which data sharing is valuable.
Year(s) Of Engagement Activity 2023
 
Description Presentation at Norwich Science Festival 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact 50-60 members of the general public attended the talk, which was also broadcast on YouTube. The talk focused on the use of chatbots within technology, and specifically their use to elicit sensitive information. There were many follow-up questions and the audience engaged well. Following this, the university has received an increased number of requests from the media for information on chatbots and related subjects.
Year(s) Of Engagement Activity 2021
 
Description Presentation on the project and research to School of Psychology 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Undergraduate students
Results and Impact 30-40 students from the School of Psychology attended a lecture about security, privacy and trust with a particular focus on this project. The aim of the talk was to engage students with the range of career paths and research in this field.
Year(s) Of Engagement Activity 2021