Judged by machines? The Impact of Artificial Intelligence Algorithms on the Legitimacy of the Court System

Lead Research Organisation: Queen Mary, University of London
Department Name: Politics


The research project addresses the rapidly increasing use of
technology in justice systems transnationally, with special
emphasis on anticipated changes in the criminal justice
system in England and Wales. This is a matter of global
significance, with notable developments having occurred in
the United States (1), Canada (2), and China (3). The UK
government view is that "[t]he justice system must embrace
new technologies and seize the opportunities of the digital
revolution" (4), including artificial intelligence (5). An
ambitious set of reforms, considered broader than in any
other comparable jurisdiction (6), has already been embarked
upon, and there are concerns that "the reform programme is
moving ahead [...] in the absence of primary legislation" (7).
In the context of this project, current technology refers to a
variety of measures: policing tools such as Durham police's
HART (8) (which assists custody officers in deciding whether a
suspect should be released, kept in the cell, or made eligible
for a local rehabilitation programme), body worn video, online
courts (9), artificial intelligence and legal decision-making
tools. In each instance, computer algorithms are used to
substitute for or guide human decision-making. Where
actively employed in the United States, computer-aided
judicial decision-making has raised concerns of bias (10) and
a lack of transparency where the algorithm itself was ruled a
trade secret (11). The UK Independent Police Complaints
Commission has recognised a risk of over-reliance on body
worn video to the exclusion of other sources, and that
viewing footage might modify officers' recollections of
events (12).
The term 'automatic justice' has been used by Bowling et al.
to describe the increasing automation and reduction of
human agency in the criminal justice system (13). A
compaction of processes has exacerbated the move toward
'actuarial' (14) or 'administrative' justice, with inherent risks.
In 2013, Zuboff recognised 'everything that can be
automated, will be automated' (15). Now we are moving to a
stage where the investigative application of technology also
extends into the realm of evidence sorting (16), blurring dividing
lines which have existed for centuries, while the risk of
entrenching "machine bias" is a real one (17).
Nissenbaum notes the parallels with the use of machines to
perform manual tasks, and common effects can be
identified: a shift of power, a loss of accountability, and a
potential for harm (18). There is often a loss of functional
integrity (19), which may well be accepted by policy makers
in order to achieve the first of those elements. In automating
certain processes within the justice system, decision-making
power shifts from regulated police and legal professionals
trained in professional ethics to inscrutable algorithms,
commissioned by management, which are not equipped to
raise objections as problems emerge and objectives change.
'Automation bias' (20) can lead to unquestioning reliance on
answers derived from an often imperfect but functioning
technology, algorithm or system, which
'favours efficiency over due process safeguards' (21). As due
process changes to become almost unrecognisable, and
responsibility for decision-making becomes diffused, the
ability to effectively challenge decisions seems likely to be
obstructed (22).



Studentship Projects

Project Reference  Relationship  Related To    Start       End         Student Name
ES/P000703/1                                   30/09/2017  29/09/2027
2322718            Studentship   ES/P000703/1  30/09/2019  30/01/2024  Cari Jeraldine Hyde-Vaamonde