Trust in User-generated Evidence: Analysing the Impact of Deepfakes on Accountability Processes for Human Rights Violations (TRUE)

Lead Research Organisation: Swansea University
Department Name: College of Law


User-generated evidence - defined as information recorded by an ordinary citizen and used in legal adjudication - plays an
increasingly important role in accountability processes. Across the world, advances in mobile phone technology and increasing
internet access mean that millions of important photographs and videos depicting mass human rights violations have been, and will
continue to be, created and shared online. Mass atrocity trials in Sweden, Germany, the Netherlands, and at the International Criminal
Court, amongst others, have already utilised this kind of evidence, as have UN Human Rights Council-mandated commissions of
inquiry, fact-finding missions, and investigations. Yet, at the same time, the public is increasingly confronted with examples of
deepfakes - hyper-realistic images, videos, or audio recordings created using machine learning technology - which are only likely to
become more advanced and difficult to detect as the technology progresses. Together, these two developments pose an important
conundrum: have perceptions of deepfakes led to mistrust in user-generated evidence? And if so, what does that mean for the role
of such evidence in future human rights accountability processes? Much of the literature to date has expressed concern that the rise
in deepfakes will lead to mass mistrust in user-generated evidence, and that this in turn will decrease its epistemic value in legal
proceedings. This may well be the case, but no study has yet tested that assumption. This is a major evidence gap that urgently needs
to be addressed. Through an innovative interdisciplinary methodology at the intersection of law, psychology, and linguistics, this
pioneering project will develop the first systematic account of trust in user-generated evidence, in the specific context of its use in
human rights accountability processes.

