Designing Conversational Assistants to Reduce Gender Bias

Lead Research Organisation: University of Strathclyde
Department Name: Faculty of Humanities and Social Science


Publications

Description There is growing concern that artificial intelligence conversational agents (e.g., Siri, Alexa) reinforce gender stereotypes. Because little is known about social perceptions of conversational agents' voices, we investigated (1) the dimensions that underpin perceptions of these synthetic voices and (2) the role that acoustic parameters play in these perceptions. Study 1 (N = 504) found that perceptions of synthetic voices are underpinned by Valence and Dominance components similar to those previously reported for natural human stimuli, and that the Dominance component was strongly and negatively related to voice pitch. Study 2 (N = 160) found that experimentally manipulating pitch in synthetic voices directly influenced dominance-related, but not valence-related, perceptions. Further work we have conducted has demonstrated that reported willingness to verbally abuse conversational assistants is highly correlated with scores on both the Valence and Dominance components (Study 3) and that perceptions of dominance-related traits differ markedly between male and female assistants' voices (Study 4). Collectively, these results suggest that greater consideration of the role that voice pitch plays in dominance-related perceptions when designing conversational agents may be an effective method for controlling stereotypic perceptions of their voices and the downstream consequences of those perceptions.
Exploitation Route Our findings highlight the key role voice pitch plays in stereotypic perceptions of conversational agents. They also suggest that greater consideration of the role that voice pitch plays in dominance-related perceptions when designing conversational agents may be an effective method for controlling both stereotypic perceptions of their voices and the downstream consequences of those perceptions.
Sectors Digital/Communication/Information Technologies (including Software), Education, Culture, Heritage, Museums and Collections

 
Title Do people perceive male and female artificial intelligence (AI) conversational agents' voices differently? 
Description Trait ratings of AI conversational assistants' voices. 
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact Forms the basis of a preprint. 
URL https://osf.io/dxzk7/
 
Title The role of valence, dominance, and pitch in social perceptions of artificial intelligence (AI) conversational agents' voices 
Description Data from first two studies conducted for "The role of valence, dominance, and pitch in social perceptions of artificial intelligence (AI) conversational agents' voices" 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact First dataset to document the dimensions and acoustic correlates underpinning social perceptions of conversational agents' voices. 
URL https://osf.io/4zgrf/
 
Description Designing conversational assistants to reduce gender bias 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other audiences
Results and Impact Blog post published on the UKRI website.
Year(s) Of Engagement Activity 2022
URL https://www.ukri.org/blog/designing-conversational-assistants-to-reduce-gender-bias/
 
Description Gendering AI: The case of conversational assistants 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Interactive online debate held as part of the Edinburgh Science Festival.
Year(s) Of Engagement Activity 2021