The ad hoc use of complex algorithms in the justice system
needs urgent oversight, the Law Society of England and Wales said
as it released the results of a year-long investigation.
The Law Society Technology and Law Policy Commission has published
its report on algorithms in criminal justice alongside an
interactive map that gives the public, for the first time, the
beginnings of an overview of where algorithms are being used to
assist decision-making in the justice system across England and
Wales.
“Police, prisons and border forces are innovating in silos
to help them manage and use the vast
quantities of data they hold about people, places and events,”
said Law Society president Christina Blacklaws.
“Complex algorithms are
crunching data to help officials make judgement calls about all
sorts of things – from where to send a bobby on the beat to who
is at risk of being a victim or perpetrator of domestic violence;
who to pick out of a crowd, let out on parole or which visa
application to scrutinise.
“While there are obvious efficiency wins, there is a worrying lack
of oversight or framework to mitigate some hefty risks – of unlawful
deployment, or of discrimination or bias that may be unwittingly
built in by an operator.
“These dangers are exacerbated by the absence of
transparency, centralised coordination or systematic
knowledge-sharing between public bodies. Although some forces are
open about their use of algorithms, this is by no means
uniform.”
The Law Society’s key recommendations:
- Oversight: a legal framework for the use of complex algorithms in the justice system. The lawful basis for the use of any algorithmic system must be clear and explicitly declared
- Transparency: a national register of algorithmic systems used by public bodies
- Equality: the public sector equality duty applied to the use of algorithms in the justice system
- Human rights: public bodies must be able to explain which human rights are affected by any complex algorithm they use
- Human judgement: there must always be human management of complex algorithmic systems
- Accountability: public bodies must be able to explain how specific algorithms reach specific decisions
- Ownership: public bodies should own software rather than renting it from tech companies, and should manage all political design decisions
Christina Blacklaws added: “Within
the right framework algorithmic systems – whether facial
recognition technology, predictive policing or individual risk
assessment tools – can deliver a range of benefits in the justice
system, from efficiency and efficacy to accountability and
consistency.
“We need to build a consensus rooted in the rule of law,
which preserves human rights and equality, to deliver a trusted
and reliable justice system now and for the future.
“The Law Society is grateful to professors Sofia Olhede and
Sylvie Delacroix, whose expertise helped the Technology and Law Policy
Commission formulate
practical, specific recommendations for government and public
bodies to minimise the risks and maximise the benefits of complex
algorithms in the justice system.”
Ends
Notes to editors
INTERACTIVE MAP
See which police forces are using algorithms and for what:
https://www.lawsociety.org.uk/support-services/research-trends/algorithms-in-the-justice-system/
REPORT
Read the Law Society Technology and Law Policy Commission report:
https://www.lawsociety.org.uk/support-services/research-trends/algorithm-use-in-the-criminal-justice-system-report/