A Feminist Examination of the Application of the Law on Non-Consensual Pornography to Nudifying Tools: A Comparative Socio-Legal Analysis

Lead Research Organisation: University of Exeter
Department Name: Law School

Abstract

The object of this research is to take a comparative, feminist approach to analysing the extent to which laws criminalising non-consensual pornography are sufficient to prevent the victimisation of women through AI-driven nudifying tools.

Using a feminist lens and comparative legal analysis, the research will consider how well equipped the law is to prevent women from becoming victims of such technology-driven pornographic abuse, and will reflect on whether and how legal reform could address this new risk to women.

In Canada, non-consensual pornography is presently criminalised under section 162.1 of the Criminal Code. However, it is not yet known whether the courts would extend the definition of "intimate image" to images artificially created through deepfake pornography or nudifying tools: at the time of writing, no case concerning deepfake pornography has come before the Canadian courts. The United Kingdom, by contrast, is on track to criminalise the sharing of deepfake pornography (including images created with nudifying tools) through the Online Safety Bill (2022), currently making its way through the British Parliament. Australia was also relatively slow to implement a law against non-consensual pornography.

A number of European countries, as noted by Sepec & Lango (2020), have prohibited the dissemination of non-consensual pornography, but do so through the concept of privacy. These authors argue that such laws in many civil law jurisdictions are insufficient for dealing with deepfake pornography because of their requirements about how the image must be obtained (e.g., while the victim is in a private location, such as an apartment, away from public view). Indeed, many laws against non-consensual pornography are too narrowly drawn to apply to deepfake pornography, given the lingering question of whether using another person's body to create deepfake pornography of the victim violates the victim's privacy. The choice to examine common law jurisdictions therefore reflects both the flexibility of case law to respond to such gaps and the novelty of examining how the laws of common law jurisdictions apply to nudifying tools. Moreover, the capacity of common law courts to interpret and apply existing laws on revenge pornography suggests that common law jurisdictions may be able to adapt more rapidly to the use of nudifying tools. It will be particularly interesting to follow developments in this area after the enactment of the UK's Online Safety Bill.

By examining the legislation, its shortcomings, and how the law is applied in practice in the context of the powerful interests of software developers, this project will consider how a unified approach to law reform and enforcement could better prevent this new and insidious form of pornographic abuse.

Participation in the project will be voluntary; all participants will give informed consent and be free to withdraw. It is anticipated that all data will be anonymised and stored in encrypted format on the University of Exeter OneDrive for the duration of the project.

By bringing these elements together, this research will examine the adequacy of non-consensual pornography laws in common law jurisdictions in the context of AI developments, from the perspective of female victims, in order to propose the reforms necessary to protect women from further online abuse.

Publications


Studentship Projects

Project Reference   Relationship   Related To     Start        End          Student Name
ES/P000630/1                                      01/10/2017   30/09/2027
2879870             Studentship    ES/P000630/1   01/10/2023   30/09/2027   Courtney Jones