Emotional AI: Comparative Considerations for UK and Japan across Commercial, Political and Security Sectors

Lead Research Organisation: Bangor University
Department Name: Sch of Music & Media

Abstract

CONTEXT
Emotional AI comprises affective computing and AI techniques to sense human emotional life. Using weak AI rather than strong AI, machines read and react to emotions via text, images, voice, computer vision and biometric sensing.

Emotional AI is an emergent phenomenon appearing across diverse devices and life contexts. Examples include: city sensing, such as ads that analyse faces for emotional reactions; analysis of sentiment on social media (used for policing, political campaigning and marketing); wearables that track the moods of workers; and human-machine interaction (e.g. home assistants). A leading technology analyst, Gartner, predicts that by 2022 personal devices will know more about a person's emotional state than their family does.
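To make the sensing methods above concrete, here is a minimal, illustrative Python sketch of the simplest text-based case: a toy lexicon-based emotion scorer of the kind that underlies basic social media sentiment analysis. The word lists, labels and function names are invented for illustration and do not describe any specific system discussed in this project.

    from collections import Counter
    import re

    # Hypothetical, hand-picked lexicon for illustration only.
    EMOTION_LEXICON = {
        "joy": {"happy", "delighted", "love", "great"},
        "anger": {"angry", "furious", "hate", "outrage"},
        "fear": {"afraid", "worried", "scared", "anxious"},
    }

    def score_emotions(text: str) -> Counter:
        """Count lexicon hits per emotion category in a piece of text."""
        words = re.findall(r"[a-z']+", text.lower())
        counts = Counter()
        for emotion, vocabulary in EMOTION_LEXICON.items():
            counts[emotion] = sum(1 for word in words if word in vocabulary)
        return counts

    if __name__ == "__main__":
        post = "I'm worried and a little angry about ads that read my face."
        print(score_emotions(post))  # e.g. Counter({'anger': 1, 'fear': 1, 'joy': 0})

Real commercial systems replace the word lists with trained models and add further channels (voice, facial coding, biometrics), but the basic step of mapping observed signals to emotion labels is the same, and it is this mapping, and the data it generates, that raises the questions discussed below.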

Emotional AI raises important social questions, especially about how data on online feeling, bodily affects, physiological expression, attention and intention should be used. Harms include: privacy violations; unauthorised uses of data about emotions; visibility of private thoughts; hidden influence; lack of scope to opt out of monitoring of emotional behaviour; and impacts on people's self-conception of their emotional identity.

AIMS/OBJECTIVES
To bring together, in interdisciplinary fora, 40 UK and Japanese academics and representatives of industry, NGOs and regulators, to better understand the constitution, and the commercial/civic/security implications, of a world where emotional AI is increasingly central. Through this network, we will identify pressing cross-cultural research projects on emotional AI that are sensitive to culture, policy, ethics, technological trends and industrial context.

We will hold workshops in Japan (2 x 2-day) and London (1 x 1-day). Each involves interdisciplinary academics, NGOs, policy-makers, industry and technologists (15-20 high-value participants per workshop, drawn from Japan and the UK):
- Japan Workshop 1 (Emotional AI - Commerce & Civic Life) will focus on the emergence of emotional AI, and its commercial and civic dimensions.
- Japan Workshop 2 (Emotional AI - Security & Civic Life) will examine the role of emotional (and wider) AI, in security, predictive policing and surveillance.
- UK Workshop 3 (Emotional AI & Cross-Cultural Considerations) will brief UK academics/stakeholders (incl. start-ups) on what has been learnt from our interactions in Japan, providing insights and cross-cultural considerations of emotional AI and its application to commercial, civic and security issues. This workshop will be held at Digital Catapult, the UK's leading agency for early adoption of advanced digital technologies.

Through the 3 workshops, we will meet stakeholders in Japan/UK who can be brought in on grant ideas.

Following the workshops, we will develop grant bids to UK funders. To achieve this, the UK-based Principal Investigator (PI) and Co-Investigator (Co-I) will spend 26 days in Japan; the UK-based Early Career Researcher (ECR) will spend 21 days in Japan; 2 other UK-based academics will each spend 6 days in Japan; and the Japan-based International Co-I (ICo-I) will spend 19 days in the UK.

Deliverables:
- 3 workshops (2 in Japan, 1 in UK): Japan workshops will have simultaneous bilingual translation.
- New grant applications to AHRC, ESRC, EPSRC or Innovate UK with Japan ICo-Is and UK/Japan end users.
- Bilingual website to inform/attract participants/stakeholders.
- Bilingual blog entries written by workshop participants in response to provocations posed by workshop leaders.
- A written report on insights/recommendations.

APPLICATIONS/BENEFITS
We will:
- Raise UK/Japanese stakeholder awareness (technology industry, policy-makers, NGOs) of the need to understand, and be involved in, cross-cultural (UK-Japan) interdisciplinary research on emotional AI and its commercial/civic/security implications.
- Establish a deep collaborative UK-Japan interdisciplinary, multi-end user network on emotional AI, building cross-national research capacity.

Planned Impact

This project's key output is new grant applications. In devising these, we will embed co-created research questions, project ideas and impact pathways via end user participation in the 3 workshops. The team is experienced in such activities via the PI's and Co-Is' AHRC, ESRC, EPSRC and Arts Council grants, and the ICo-I's grants from the Japan Society for the Promotion of Science (JSPS), the Canada Arts Council (CAC) and the Quebec Council for the Arts (QAC).

Ethical growth of the emotional AI sector requires meaningful engagement with culturally specific social, normative, industrial and legal issues to promote beneficial over harmful applications. By raising these issues in our new network and embedding them in our grant applications, this project will positively impact the following stakeholders:
- Diverse companies involved in emotional AI (incl. CEOs, privacy officers and Data Protection Officers). UK engagement is assured by the PI's wide network in this area. Companies active in emotional AI in the UK range from the largest, e.g. IBM, to start-ups; in Japan they include Honda and Fujitsu.
- Policy makers who regulate data. In the UK this includes: the Information Commissioner's Office (ICO, the national data protection authority), Ofcom (the media regulator) and advertising self-regulators (Committee of Advertising Practice, Internet Advertising Bureau). In Japan this includes the Personal Information Protection Commission (its data protection authority).
- Legislatures: From the UK, this includes the House of Lords Committee on AI, the Joint Committee on Human Rights, the House of Commons Digital, Culture, Media and Sport Committee (Fake News Inquiry) and the new Centre for Data Ethics & Innovation (set up in 2018 to enable ethical innovation in AI). From Japan this includes the New Energy and Industrial Technology Development Organisation, which created its national AI strategy.
- NGOs: These include privacy-oriented groups. From the UK: Open Rights Group (the PI is an advisor), Privacy International, Doteveryone, Don't Spy on Me and Big Brother Watch. From Japan: Privacy International Japan and local activists. We also include groups tackling disinformation online, e.g. Full Fact (UK) and the FactCheck Initiative (Japan).

Japan-based end users from industry, policy-making and NGOs will be engaged as follows:
- The team will identify Japan-based end users in the 6 months before the first workshops, inviting them to the two 2-day workshops in Japan: one on 'Emotional AI - Commerce & Civic Life', which will present the big picture, and one on 'Emotional AI - Security and Civic Life', which will address emotional AI applications to security and citizenship. All end users from Japan will be invited to participate by providing position statements for the workshops, to be translated and published on the project's blog, with extracts published in the final report. Attendees' travel and subsistence for 3 nights per workshop will be reimbursed, to encourage participation in listening to research ideas (day 1) and developing research grant ideas (day 2).
- We will employ simultaneous translators for each workshop in Japan, and we will have the project website translated into Japanese to maximise participation and extend its reach among Japan's stakeholders.

UK-based end users from industry, policy-making, NGOs and legislatures will be engaged via the final workshop (one day), held at Digital Catapult, London, to maximise attendance. We will invite diverse UK end users (drawing on the PI's and Co-Is' extensive contacts from past research). For example, the PI attracted the biggest stakeholders in emotional AI, including IBM, the ICO and European regulators, to a past workshop on emotional AI ethics at Digital Catapult.

This project will benefit technology standards developers with global reach. The PI has excellent contacts at the standards groups W3C, the Internet Society and IEEE (e.g. the PI is on an IEEE working group on values-based design for autonomous computing). The PI will share insights from the project via the final report and Skype meetings.
 
Description We generated conversation among UK and Japanese academics from multiple disciplines, as well as a range of end users from industry, the arts, NGOs and regulators, about emotional AI technologies in the cross-cultural context of the UK and Japan. We generated specific research questions on the constitution, and the commercial, civic and security implications, of a world where emotional AI increasingly plays a central role. These included the following insights:
- Emotional AI: it is not just about emotions, but affect, states, intention and empathy.
- Current methods and models of emotion are questionable.
- Among industry delegates, there was a desire for regulation and ethical guidance to provide business certainty.
- Emotional AI will augment other technologies/practices.
- There are multiple ethical concerns in UK and Japan, but different focus points.
- Japanese personal and social experience of emotional AI differs from that in the UK.
- Japan-UK social, political and legal contexts are substantively different (although arrangements with the EU General Data Protection Regulation exist).
- Ethical "toolkits" that include law need to go beyond just compliance.
- Profiling of bodies in policing has a long history, and emotional AI could be the next focal point.
- Understanding of nuances and the specificities of the cultural context is key.
- As in the UK, public responses in Japan to the impacts of new technology on civic life vary greatly.
- We need to consider respect as well as privacy (in both Japan and the UK).
- We need to consider "metaphysical/spiritual" dimensions of cities (to avoid the cybernetic, "cold" conceptions of efficiency dominating discourse).
- On emotional AI and disinformation: Consider low political participation among youth in Japan.
We honed these into a multi-institutional UK-Japan large grant application, submitted to the ESRC and the Japan Science and Technology Fund, which was awarded in January 2020.
Exploitation Route Creating more UK-Japan funding bids.
Sectors Aerospace, Defence and Marine; Communities and Social Services/Policy; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Environment; Healthcare; Government, Democracy and Justice; Security and Diplomacy; Transport

URL http://emotionalai.bangor.ac.uk/index.php.en
 
Description This award was given to run seminars in Japan and the UK for academics, policymakers, industry and the security/defence sectors. We sought to gain insight into the consequences of emotional AI for these sectors, where social harms may lie, and where positive uses of these technologies may occur. Participants came from: New School for Social Research in New York, Ritsumeikan Asia Pacific University (APU), Bangor University, University of Edinburgh, Northumbria University, University of Cambridge, Keele University, Kyoto University, Freie Universität Berlin, Meiji University, University of Sheffield, Rikkyo University, University of British Columbia, University of South Carolina, Chuo University, the Japanese Ministry of Defence, Digital Catapult, Sensum, Nvidia, Sensing Feeling, the Centre for Data Ethics and Innovation, Privacy International, Dyson School of Design Engineering, Internet of Things Privacy Forum, Coventry University, ITD-GBS Tokyo, Doshisha University, the UK Cabinet Office and Ernst and Young, along with independent artists. As a networking grant, this project was about building relationships and capacity for a larger grant. Both aims were achieved, with the subsequent grant generating rich and diverse impact. Here, however, the key impact is the connection with the UK's Centre for Data Ethics and Innovation (CDEI, UK Government), which learned more about the consequences (good and bad) of applications of emotional AI. This resulted in my advising on the CDEI report 'Online Targeting', published in February 2020 (in which I am named and thanked). It also led to two interviews with the Cabinet Office, in February 2020 and July 2020, on promoting humane technology and facial recognition technologies. We (the research team) also advised the UK All-Party Parliamentary Group on AI on issues of governance, ethics and emotion recognition, through an invited written briefing and a private call with its co-chair to discuss regulation of these emerging technologies.
First Year Of Impact 2020
Sector Government, Democracy and Justice
Impact Types Policy & public services

 
Description Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life
Amount £497,710 (GBP)
Funding ID ES/T00696X/1 
Organisation Economic and Social Research Council 
Sector Public
Country United Kingdom
Start 01/2020 
End 12/2022
 
Description Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund on 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life'
Organisation Bangor University
Country United Kingdom 
Sector Academic/University 
PI Contribution Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund, with investigators from Bangor University (McStay, Bakir), Edinburgh University (Urquhart), Northumbria University (Miranda), Ritsumeikan APU (Mantello, Ghotbi), Meiji University (Tanaka), Chuo University (Miyashita)
Collaborator Contribution Each is a Co-Investigator on the three-year grant
Impact multi-disciplinary: media, journalism, criminology, law, health, cultural studies, security studies
Start Year 2020
 
Description Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund on 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life'
Organisation Chuo University
Country Japan 
Sector Academic/University 
PI Contribution Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund, with investigators from Bangor University (McStay, Bakir), Edinburgh University (Urquhart), Northumbria University (Miranda), Ritsumeikan APU (Mantello, Ghotbi), Meiji University (Tanaka), Chuo University (Miyashita)
Collaborator Contribution Each is a Co-Investigator on the three-year grant
Impact multi-disciplinary: media, journalism, criminology, law, health, cultural studies, security studies
Start Year 2020
 
Description Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund on 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life'
Organisation Meiji University
Country Japan 
Sector Academic/University 
PI Contribution Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund, with investigators from Bangor University (McStay, Bakir), Edinburgh University (Urquhart), Northumbria University (Miranda), Ritsumeikan APU (Mantello, Ghotbi), Meiji University (Tanaka), Chuo University (Miyashita)
Collaborator Contribution Each is a Co-Investigator on the three-year grant
Impact multi-disciplinary: media, journalism, criminology, law, health, cultural studies, security studies
Start Year 2020
 
Description Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund on 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life'
Organisation Northumbria University
Country United Kingdom 
Sector Academic/University 
PI Contribution Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund, with investigators from Bangor University (McStay, Bakir), Edinburgh University (Urquhart), Northumbria University (Miranda), Ritsumeikan APU (Mantello, Ghotbi), Meiji University (Tanaka), Chuo University (Miyashita)
Collaborator Contribution Each is a Co-Investigator on the three-year grant
Impact multi-disciplinary: media, journalism, criminology, law, health, cultural studies, security studies
Start Year 2020
 
Description Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund on 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life'
Organisation Ritsumeikan Asia Pacific University
Country Japan 
Sector Academic/University 
PI Contribution Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund, with investigators from Bangor University (McStay, Bakir), Edinburgh University (Urquhart), Northumbria University (Miranda), Ritsumeikan APU (Mantello, Ghotbi), Meiji University (Tanaka), Chuo University (Miyashita)
Collaborator Contribution Each is a Co-Investigator on the three-year grant
Impact multi-disciplinary: media, journalism, criminology, law, health, cultural studies, security studies
Start Year 2020
 
Description Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund on 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life'
Organisation University of Edinburgh
Country United Kingdom 
Sector Academic/University 
PI Contribution Collaborative large grant application (awarded) between ESRC and Japan Science & Technology Fund, with investigators from Bangor University (McStay, Bakir), Edinburgh University (Urquhart), Northumbria University (Miranda), Ritsumeikan APU (Mantello, Ghotbi), Meiji University (Tanaka), Chuo University (Miyashita)
Collaborator Contribution Each is a Co-Investigator on the three-year grant
Impact multi-disciplinary: media, journalism, criminology, law, health, cultural studies, security studies
Start Year 2020
 
Description Workshop - London 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Policymakers/politicians
Results and Impact Emotional AI: Cross-Cultural Considerations (Workshop 3)

The final workshop reported project insights to UK stakeholders and generated UK-based insights for Japanese delegates, with a view to building capacity for further knowledge exchange. Academics, policy makers, technologists, industry and the UK start-up community:

Discussed what emotional AI is (affective computing + machine learning/AI).
Considered use-case examples (led by companies working in this area).
Interactively explored ethical implications and opportunities for social good and harm.
Considered cross-cultural dimensions (having reflected on a pre-circulated report).
Utilised micro-workshops, each with a balance of technologists, ethicists, legal, policy, industry and NGO delegates.
Date: Monday 9th September 2019 (10am - 3pm)
Location: Digital Catapult, London.
Reading for Delegates: Interim Report
Year(s) Of Engagement Activity 2019
URL http://emotionalai.bangor.ac.uk/workshops.php.en
 
Description Workshops - Tokyo 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact Emotional AI: Exploring the Impact on Commercial and Civic Life (Workshop 1)

By 'emotional AI', we refer to technologies that use behavioural, affective computing and AI techniques to sense, learn about and interact with human emotional life. The scope for application is extraordinarily diverse, ranging from advertising billboards in public spaces that react to facial expressions to societal adoption of home assistants and affect-sensitive robots in less public spaces. What conjoins these is the use of technology to understand psycho-physiological emotional reactions and to enable technologies to interact with people in novel ways. This workshop explored the potential of these technologies by focusing on social benefits and harms. Collectively, participants learned about the technologies, commercial applications in Japan and the UK, how citizens feel about them and why, the laws and governance that guide these technologies, and how we can have the best of these technologies with fewer harms.

Date: Tuesday July 9th, 2019 (10am - 4pm, lunch provided)
Location: Ritsumeikan Tokyo Campus, 8th floor, Sapia Tower.

Security and Policing: Predicting and Feeling with Emotional AI (Workshop 2)

Security technologies augmented by artificial intelligence that can assess and respond to human emotions are increasingly central to national security, border control and local law enforcement. Companies such as Thales and Motorola sell biometric technologies to police forces around the world that track the emotional and psychological states of officers themselves. In Japan, companies such as Hitachi are developing artificial intelligence systems to better understand the emotions of civic groups and to heighten security efforts in large-scale public spaces. This workshop explored key issues and challenges surrounding the deployment of emotional AI in local law enforcement, cyber-security, conflict zones and border control. Topics included: predictive policing; voice and facial recognition technologies at borders; autonomous military systems; and the role of smart bots in manipulating/triggering user emotions on social media and their use in computational propaganda. The goal of this workshop was to better understand factors unique to Japan and the UK, discuss citizen reactions, and address questions of ethics and privacy in both the Japanese and British contexts.

Date: Friday July 12th, 2019 (10am - 4pm, lunch provided)
Location: Ritsumeikan Tokyo Campus, 8th floor, Sapia Tower.
Year(s) Of Engagement Activity 2019
URL http://emotionalai.bangor.ac.uk/workshops.php.en