Emotional Artificial Intelligence and the Right to Privacy
Lead Research Organisation: King's College London
Department Name: The Dickson Poon School of Law
Abstract
Emotional Artificial Intelligence (EAI) is a rapidly growing, multi-billion-dollar industry that combines AI and big-data technology to detect emotions from biometric signals such as facial expressions, voice, eye movements, skin conductance, and body temperature. Although it remains controversial whether emotions can be objectively understood or interpreted, EAI is already used in many sectors and for different purposes. The surveillance connected with EAI creates distinctive risks for emotional privacy that other technologies do not, since we cannot 'turn off' our faces to hide our emotional states, an essential part of our inner lives. Furthermore, not only does the monitoring of our emotions threaten those inner lives, but, to the extent that EAI might "know subjects better than subjects know themselves," surveillance, which often involves an external assigning of identity, might soon result in EAI effectively determining our emotions. In my thesis, I will explore the case for embracing human emotions as potentially private subject matter and address the implications of emotional privacy for the regulation of EAI. I will evaluate various legal theories of privacy to answer why emotional privacy is important and which values, such as dignity, autonomy, and selfhood, can plausibly underpin it, and I will analyse how effectively the current regulatory regime protects the rights to emotional privacy and data.
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name
---|---|---|---|---|---
ES/P000703/1 | | | 01/10/2017 | 30/09/2027 |
2745244 | Studentship | ES/P000703/1 | 01/10/2022 | 30/12/2024 | Emine Akar |