Sir John Whittingdale (Maldon) (Con)
I beg to move,
That this House has considered police use of live facial
recognition technology.
It is a pleasure to serve under your chairmanship, Dame Siobhain.
I am grateful for this opportunity to debate the police's use of
live facial recognition technology. I have to say that this
debate is somewhat overdue.
Any fan of Hollywood movies would think that the use of facial
recognition technology is widespread, as in “The Bourne
Ultimatum” and “Spooks”, and that it is commonplace for MI5 and
the CIA to tap into CCTV cameras across London. I do not believe
that is correct—I hope it is not—but police forces are using
facial recognition technology more and more. It was first used in
2017, and it is now commonly used by the Metropolitan police,
South Wales police and now my own police force in Essex, which
purchased two vans in August and uses them regularly.
On 4 October, I accompanied police officers on a deployment in
Chelmsford High Street; they were hugely helpful in explaining to
me exactly how they use the technology and, importantly, what
controls are in place. They told me that they had a watch list of
639 individuals who had been approved by the superintendent and
were wanted for questioning in relation to offences such as
violence against the person. They included people with
outstanding warrants, suspects linked to county lines, suspected
shoplifters in that particular part of the county, and those with
a sexual harm prevention order.
In the course of the 30 minutes or so that I spent with those
officers, they recorded 1,500 faces of people who passed by. The
officers assured me that those images were matched against the
watch list to see whether they registered a positive, and if they
did not they were deleted in less than half a second. During the
time I was there, there were approximately 10 positives, which
led to a conversation: a police officer would go and have a
polite exchange to find out why the person had registered
positive, and they were checked against the Police National
Computer or Athena. That morning, that led to two arrests.
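For readers less familiar with the mechanics, the loop the officers described—scan each passer-by, compare against the watchlist, alert or delete—can be sketched roughly as follows. This is a minimal illustration only: the function names, the embedding representation and the 0.6 threshold are assumptions drawn from this debate, not the actual software Essex police run.

```python
# Illustrative sketch only: real police systems are proprietary.
# Names, data structures and the 0.6 threshold are assumptions
# based on the deployment described above, not vendor code.
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # face template approved for the watchlist

def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def process_passerby(face: list[float],
                     watchlist: list[WatchlistEntry],
                     threshold: float = 0.6) -> WatchlistEntry | None:
    """Compare one passer-by's face against the watchlist.

    A score below the threshold means the biometric data is discarded
    at once (the 'deleted in less than half a second' step described
    above); a score above it only prompts an officer to stop the person
    and check their identity against the PNC or Athena. It is never
    treated as proof of identity.
    """
    if not watchlist:
        return None
    best = max(watchlist, key=lambda e: similarity(face, e.embedding))
    if similarity(face, best.embedding) >= threshold:
        return best  # alert: officer conducts a manual check
    return None      # no alert: image deleted, nothing retained
```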
The chief constable of Essex has written to me and colleagues to
emphasise the effectiveness of the technology and its importance
to that force. He told me that they had so far had 25 deployments
across Essex, resulting in 26 arrests and 26 other positive
disposals. He said:
“This cutting-edge technology has enabled us to keep the public
safe, and can save time and effort of our front-line, allowing
them to do other work to protect and support the community.”
Kim Johnson (Liverpool Riverside) (Lab)
I thank the right hon. Gentleman for securing this important
debate. There are suggestions that this technology
disproportionately misidentifies black people and people from
other communities. Does he agree that the Government must give us
more assurances and ensure that more black people are not
criminalised? We know that black communities are over-policed and
underserved.
Sir John Whittingdale
I certainly agree that more assurances need to be given. That is
actually one of the purposes behind requesting this debate. The
hon. Lady is right that concerns have been expressed—
David Davis (Goole and Pocklington) (Con)
I agree with the hon. Member for Liverpool Riverside (Kim
Johnson), but it goes deeper than
that. There are at least three conditions that ought to apply,
and I would be interested to hear from my right hon. Friend the
Member for Maldon (Sir John Whittingdale) whether Essex met them.
First, these things always ought to be under judicial oversight;
it should not simply be a police decision. Secondly, as he said,
only the records of presumed guilty or actively sought people
should be kept and, thirdly, innocent people's records should be
destroyed straightaway. That should not be left to a
guideline; it should be under legislative control and properly
treated in that way.
Sir John Whittingdale
I agree with my right hon. Friend. The problem at the moment is
that we do not even have national guidelines. There is a complete
absence, which I will come to later. I will give way to the
shadow Home Secretary.
Chris Philp (Croydon South) (Con)
I am extremely grateful to my right hon. Friend for giving way. I
would like to add some context to the question of racial bias.
There were allegations of racial bias a few years ago. The system
was tested by the National Physical Laboratory about two years
ago and, at the settings used by the police, no racial bias was
found. That was one of the conditions set in the Bridges
litigation about four years ago, and I hope that gives my right
hon. Friend and other hon. Members some reassurance on the
question of racial bias. It has been tested by the National
Physical Laboratory.
Sir John Whittingdale
As I understand it, the number of false positives recorded
depends to some extent on the threshold at which the technology
is set.
Dawn Butler (Brent East) (Lab)
The report by the National Physical Laboratory said that the
threshold had to be set at 0.6 for there to be fewer
misidentifications, but there is no setting at which nobody is
misidentified or wrongly identified. It is also easy for a police
service to lower that number. Because we have no judicial
oversight, that is very problematic.
Sir John Whittingdale
The hon. Lady is completely right. I think the police are
generally being responsible in their use of it, setting the threshold
as recommended, but that is another example where there is no
requirement on them to do so, and they could lower it. Regarding
deployment in Essex, the chief constable told me there was just
one false positive.
I attended a meeting, along with my right hon. Friend the Member
for Goole and Pocklington, at which Shaun Thompson, an anti-knife
community worker, spoke to us. He had been held by the police for
30 minutes and forced to provide all sorts of identity documents,
as a result of a false positive. As to the extent to which
misidentification is occurring and whether racial bias is
involved, there is some evidence that it is. That makes it all
the more important that we provide assurances.
We have heard from several campaign organisations that are
concerned about its use. They vary in the extent to which they
believe it is a legitimate technology. Big Brother Watch has
described live facial recognition technology as
“constant generalised surveillance”
and has said that it is
“indiscriminately subjecting members of the public to mass
identity checks”, which undermines the presumption of innocence.
Liberty has gone further, saying:
“Creating law to govern police and private company use…will not
solve the human rights concerns or the tech's inbuilt
discrimination…The only solution is to ban it.”
I do not agree with that, because I think there is clear evidence
that it has a real benefit in helping the police apprehend people
who are wanted for serious offences, but one of my major concerns
is the lack of any clarity in law about how it should be
used.
I am grateful to the Library, which has provided advice on that
point. It says:
“There is no dedicated legislation in the UK on the use of facial
recognition technologies.”
Instead, its use is governed by common law and by an
interpretation of the Police and Criminal Evidence Act 1984,
although that Act does not mention live facial recognition
technology, and some case law, such as the Bridges case. Even in
the Bridges case, the Court of Appeal found that
“The current policies do not sufficiently set out the terms on
which discretionary powers can be exercised by the police and for
that reason do not have the necessary quality of law.”
On precisely that point, some police forces in the UK take the
view that GDPR has reach in this area. Does my right hon. Friend
have a view on that?
Sir John Whittingdale
My right hon. Friend has anticipated my next point extremely
effectively. I was Minister at the time of the passage of the
Data Protection and Digital Information Bill, which did not cover
live facial recognition technology. At the same time, my right
hon. Friend the Member for Croydon South (Chris Philp), who is
the shadow Home Secretary, was the Minister for Policing, and he
made a speech
about how valuable live facial recognition technology was. I
therefore sought advice about how that fitted in with GDPR.
The advice that came back following consultation with the
Information Commissioner's Office was that there is no blanket
approval by the ICO for the use of LFR technology. Essentially,
it should be judged on a case-by-case basis, but the ICO had
expectations that data protection and privacy should be
respected. It went on to say that the use of LFR can be highly
intrusive and future uses of the technology may require updates,
but that the ICO is monitoring it closely. That is only partially
reassuring. Essentially, the ICO recognises that breaches of data
protection could be possible, and is monitoring it, but there is
no clear guidelines to assist the police or anybody else on
precisely how it should be used.
I am grateful to legal consultants Handley Gill, who wrote to me
yesterday and who are involved in advising a number of people
about the legality of the technology. They said that
“it is undesirable for individual Chief Officers and PCCs to have
to engage in the wide ranging review and preparation of the
necessary documentation, and that a move toward a common national
approach (and choice of technology provider) would secure
efficiencies and also enable closer monitoring…to ensure their
efficacy and lawfulness.”
Although we are no longer bound by European Union law, the EU has
brought in much more stringent controls than exist here.
Douglas McAllister (West Dunbartonshire) (Lab)
Scotland's chief constable said in September that it would be “an
abdication” of her duty not to assess whether this AI tool could
be used and that the force was “very much alive” to it,
describing it as a crucial tool to “take violent perpetrators”
off the streets. In my view, it is an exercise in balancing the
need to tackle crime and keep people safe with the impact the
tool may have on human rights and civil liberties. I believe the
right hon. Gentleman wishes to introduce stringent restrictions
on the use of such surveillance. If so, what are they, and is he
seeking to follow similar European states' legislation, akin to
the EU Artificial Intelligence Act?
Sir John Whittingdale
The EU's AIA lays down very strong controls—it almost goes too
far—in that it restricts the categories of individual who can be
sought under the watch list to quite a small number. The House of
Commons Library points out that
“the AIA 2024 prohibits the use of ‘real-time remote biometric
identification systems' (such as LFR) in publicly accessible
spaces for the purposes of law enforcement, unless such use is
‘strictly necessary' for one of the following objectives”.
The list it provides includes the search for specific victims of
abduction or trafficking; missing persons; the prevention of a
substantial and imminent threat to life; the prevention of a
genuine threat of a terrorist attack; or the localisation of a
person suspected of having committed a criminal offence.
In Europe, the controls are strong, but in this country it is
left largely to police officers to interpret the law and be
reasonably confident that they are acting lawfully. However,
legal challenges are under way.
Shaun Thompson, whom I met, is seeking judicial review of the
police's actions and the campaign organisations are also looking
at legal challenges. There is a real need for clarity. Certainly,
the sergeant of Essex police who is in charge of deployment told
me that, in his view, it would be really helpful for the police
to have clear guidelines. They would then not have to make those
difficult decisions and could potentially satisfy a court that
the use was proportionate and justified.
As far as I am aware, this matter has not been debated by
Parliament before, and it should have been because there is a
real need to seek clarity in the law. This may sound like science
fiction, but ultimately there is a risk that it becomes possible
for every CCTV camera in the country to be linked up, and there
could be a watchlist of not 600 but millions of people. Concerns
have been expressed by organisations such as Big Brother Watch—in
this particular instance, that organisation could be well
named—and I do not think any Member would wish to go down that
route. I think most people recognise that there is some value in
the technology, but there is a need for clarity. I am grateful to
the shadow Home Secretary and particularly the Minister for
Policing for coming to contribute, and I look forward to what
they have to say.
Dame Siobhain McDonagh (in the Chair)
I remind Members that they should bob if they wish to be called
in the debate, and it seems that everybody does.
9.46 am
Dawn Butler (Brent East) (Lab)
It is a pleasure to serve under your chairmanship, Dame Siobhain.
I thank the right hon. Member for Maldon (Sir John
Whittingdale)—or I could say my right
hon. Friend, if he does not mind—for securing this debate. I have
spoken to the Secretary of State and Ministers in the Department
for Science, Innovation and Technology, and there is an awareness
that we need a lot of careful and considerate thinking on this
issue. Obviously, a new Government have just come in and this is
not a new issue, as the right hon. Member for Maldon said—LFR was
first used in 2017, so there is a lot of clearing up that has to
be done.
Live facial recognition changes one of the cornerstones of our
democracy: an individual is innocent until proven guilty. With
this technology, if the machine says an individual is guilty
because they have been identified using live facial recognition,
they then have to prove their innocence. That is a huge change in
our democracy that nobody has consented to. We have not consented
to it in this place, and as we police by consent as a society,
that should really worry us all.
I thank the hon. Lady for giving way; I am looking forward to
this debate and to concluding it for the Opposition later.
On the question of changing the burden of proof or undermining
the concept that someone is innocent until proven guilty, the
technology absolutely does not change that. What it does is give
the police a reason to stop somebody and check their identity to
see whether they are the person wanted for a criminal offence. It
certainly does not provide evidence on which a conviction might
be secured. In fact, it is no different from the police stopping
someone because they are suspicious of them, and it is a lot more
accurate than stop and search, about which I am sure the hon.
Lady has views. It is simply a tool to enable the police to stop
somebody and check their identity to see whether they are the
person who is wanted. It certainly does not undermine the very
important principle that a person is innocent until proven
guilty.
The shadow Minister has hit on an important point regarding
reasonable suspicion. What is reasonable suspicion? How have the
police got to that point? If he is then going to make reference
to watchlists, who is put on a watchlist? We know, for instance,
that the Met police has hundreds of thousands of people on its
system who should not be there. We know that the watchlist can
consist of people it considers to be vulnerable, such as those
with mental health issues. Anybody in this room could be put on a
watchlist, so I am afraid the shadow Minister has not quite
nailed the point he was trying to make.
I am very much on the hon. Lady's side of that argument, partly
because we are a country where it is not normal to stop people
and ask for their identity cards, which is why we have had a few
battles over that in the past. Also, the technology is prone to
slippage. Way back when—probably when the hon. Lady was still at
school—we introduced automatic number plate recognition to
monitor IRA terrorists coming from Liverpool to London. That was
its exact purpose, but thereafter it got used for a dozen other
things, without any legislative change or any approval by
Parliament.
Dame Siobhain McDonagh (in the Chair)
Order. Could I ask Members to keep interventions as
interventions?
Thank you, Dame Siobhain. Yes, it is really important that we
talk about this openly. That is what we are supposed to do in
this place, right? Anybody can be put on the watchlist. Seven
police forces are currently using LFR. One that I know of—I am
not sure about the others—the Metropolitan Police Service, is in
special measures. I do not think it should be given any
additional powers while it is in special measures.
The thing is that we know very little about the software or what
is in the black box at the heart of these systems. What we can
look at is the outcome, and we know that the outcome does not
identify black women's faces well, especially, or those of black
and Asian people generally. Identification accuracy is lower for
those people, so that is a concern.
It is also really interesting that even when LFR is set at 0.6, a
police super-spotter is more accurate. We have specialist police
officers who spot people very quickly, and they are more accurate
than this system, so it becomes the case that a police service
will try to prove that the system it has bought is value for
money. We can imagine a police officer not getting many hits with
LFR at 0.6 and lowering that to 0.5 so that they can get more
hits, which in turn means that more people are misidentified, so
there should be regulation around this issue.
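The arithmetic behind that concern is simple: every downward move of the threshold flags more faces, without any new evidence that the extra alerts are correct. A toy illustration, with invented similarity scores:

```python
# Toy illustration of threshold lowering; the scores are invented.
scores = [0.42, 0.48, 0.51, 0.55, 0.58, 0.61, 0.64, 0.72]

def alerts(scores: list[float], threshold: float) -> int:
    """Count how many passers-by would trigger an alert."""
    return sum(1 for s in scores if s >= threshold)

print(alerts(scores, 0.6))  # 3 alerts at the tested 0.6 setting
print(alerts(scores, 0.5))  # 6 alerts at 0.5: twice as many stops,
                            # and the extra alerts are the least certain
```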
Taking away somebody's liberty is one of the most serious things
we can do in society, so we need to think very carefully if we
are going to introduce something that accelerates that. It is
good that for the first time we are having the debate on this
issue. As the right hon. Member for Maldon said, the EU permits
LFR only where there is prior judicial authorisation and in cases
in which the police need to locate a missing person, for
instance. That is something we need to consider.
I want to say this: I like technology. I am very much into our
civil liberties. We need to protect our digital rights as human
beings and individuals. I love technology—I used to be a
coder—but we should not rush to do things because people get
excited. There are really four people in the debate on this
issue. It reminds me of four of my mates when we go out clubbing.
Bear with me. We have the person who will stay at home because
they are not bothered—they do not care—and we have the people who
do not care about this issue: “It is going to happen; let it
happen.” We have the person who will come, but they are a bit
moany. They do not really like the music, but they will come
anyway because they do not want to miss out.
We then have the person who is completely drunk on it all: “Give
it to me. I'll take everything.” There are people who just love
anything to do with technology and will say, “Look, let's just
throw it all in the mix and it'll all be fine.” And there is me.
I am the person who likes the music and the food, but I need to
keep sober to make sure everyone gets home safely. In this debate
about AI, we need to be sober to make sure that everybody gets
home safely and that when we roll out AI, we do so in a way that
is fair and compassionate and in line with our values as British
citizens.
9.53 am
Jim Shannon (Strangford) (DUP)
It is a real pleasure to speak in this debate on live facial
recognition technology, and I thank the right hon. Member for
Maldon (Sir John Whittingdale) for leading it.
I have to make a confession to the House: I am not technically
minded. I can just about use my phone for text messages; I cannot
do much else with it. When it comes to TikTok, Facebook, X and
all those other things, I am not even sure what they all are. The
fact is that my staff do all that, so anything that people see on
there from me is because of them. I okay it, but they put it
out.
But even if I am not technically minded, I understand the
necessity to have technological advances in place and that they
can also be used to benefit our police and criminal justice
system. I am fully in support of advancements where there is
necessity and reason for them, but the hon. Member for Brent East
(Dawn Butler) was right to identify some
problems with the system. So my contribution will be in favour of
facial recognition technology, but also focused on the need to
have a system that does not infringe on human rights.
Mr Gregory Campbell (East Londonderry) (DUP)
Does my hon. Friend agree that our concern for the wider
population and individual safety has to be paramount? Allied with
that are the necessary safeguards that have to be built in so
that the pursuit of safety does not override or infringe the personal
liberties of people who have not done anything wrong and are
unlikely to do so.
I agree with my hon. Friend and that point is the thrust of my
contribution.
It was incredibly helpful to hear the comments of the right hon.
Member for Maldon, and about how he was able to join police
forces to see how live facial recognition works. I understand
that was the 13th use of the technology by Essex police, with it
having been deployed previously in Harlow, Southend and Clacton.
Essentially, the equipment works by scanning the faces of all
individuals seen by a camera and comparing them to a
predetermined watchlist.
As my hon. Friend the Member for East Londonderry (Mr Campbell)
said, safety is paramount—that is the critical reason for using
the technology. I speak on human rights issues all the time, as
many present will know. I want to make sure that when we have
technology in place, human rights are not abused or
disenfranchised, and that people do not feel threatened. Innocent
people should never feel threatened, of course, but there are
those who have concerns. The technology has already proven itself
and led to a number of arrests of people wanted for serious
offences such as sexual abuse, domestic violence, aggravated
burglary and shoplifting.
I will make a quick comment about the Police Service of Northern
Ireland and what we are doing back home. A freedom of information
request was submitted to the PSNI in late 2022, and the response
confirmed that live facial recognition is not currently used in
Northern Ireland. I was aware of what the hon. Member for West
Dunbartonshire (Douglas McAllister) said when he intervened
earlier, because Northern Ireland is in the same place on this.
The FOI response stated that it is the intention of the PSNI to explore
fully the potential of facial recognition technology, and that a
working group was to be established in late 2022, in conjunction
with PSNI stakeholders. The principle of why the technology is
necessary is already in place, but we need to have the safeguards
as well.
Last week I was in a Westminster Hall debate secured by the hon.
Member for North Down (Alex Easton) on the importance of funding
for local policing. There are clear examples across the United
Kingdom that show that live facial recognition works and is
extremely beneficial to the prevention of crime and for
convictions. Perhaps, then, it is something that could be funded
through the Barnett consequential. The Government will tell us
that they have set funds aside, and we thank them for the extra
money for Northern Ireland, but if it can help the police forces,
that needs to be looked at.
Numerous concerns have been raised about the use of LFR by our
police forces. Surveys have revealed that the British public are
mainly concerned with privacy infringements, surveillance, consent
and the unethical use of facial recognition by the police. The
hon. Member for Brent East said that clearly in her contribution,
as have others. Police officers shared concerns that there could
potentially be impacts on the legal and human rights of
citizens.
I will always speak out on human rights abuses where they are
highlighted and where infringements take place. It is good to see
the Minister in her place; we all have an incredible respect for
her and I look forward to her contribution. I seek to hear from
her how human rights can be assured and carefully covered. The
invasion of liberty and privacy are of major concern. If the
technology is to be widely used across police forces, there must
be assurances on public safety.
Concerns about false positives have been raised. I do not pretend
to understand the technology, but others have explained that if
it is turned down from 0.6 to 0.5, it offers a wider spectrum of
people. That can cause such damage to people and their
reputations, and reputation is everything for many people. Should
this be trialled in the likes of Northern Ireland or Scotland, we
must have assurances that the algorithms are correct and that
they identify people correctly. I support the technology with
that proviso.
10.00 am
Bell Ribeiro-Addy (Clapham and Brixton Hill) (Lab)
Thank you, Dame Siobhain, for your merciful chairpersonship. I
thank the right hon. Member for Maldon (Sir John Whittingdale)
for introducing this
crucial debate.
Like many others, I have many concerns about live facial
recognition technology, some of which have already been raised,
but I will focus my remarks on the room for error and the
potential impact that this technology will have on already
dwindling public trust in police, particularly among black, Asian
and ethnic minority citizens. I will raise points similar to
those of my hon. Friends the Members for Liverpool Riverside
(Kim Johnson) and Brent East (Dawn Butler).
Live facial recognition technology compares live CCTV images with
those already on the police database and other images taken from
open source, publicly available image sites. This is a deeply
flawed plan that could result in serious mix-ups. A simple
mislabelling on an image database could lead to the wrong person
being stopped and a potentially traumatic experience with the
police.
I can illustrate my point with a short anecdote; this happened to
me a mere few months after I was elected to this House. My hon.
Friend the Member for Battersea (Marsha de Cordova) was speaking in the
Chamber. BBC Parliament miscaptioned her as my hon. Friend the
Member for Brent East and, when they spotted this, both Members
took to Twitter to point out the mistake. In their haste to cover
the story, the Evening Standard incorrectly used a picture of me
instead of my hon. Friend the Member for Battersea—I hope
everybody is following this—and in its apology to all three of
us, it suggested that Getty Images, where they had taken the
image from, had labelled most of the pictures of me, since I had
been elected, with the name of my hon. Friend the Member for
Battersea. Since then, to avoid embarrassment, it seems that most
publications now use pictures of me looking like a constipated
walrus, but they have said that their reason for this is that
they can be sure it is me and they want to avoid any further
embarrassment.
Although problematic, that is a far more trivial example of what
can happen when images are mislabelled, but if humans can make
these errors, the technologies they create obviously can too. If
online sources are going to be used as part of the image
database, it is almost inevitable that images will be mislabelled
and that innocent people will be subject to needless run-ins with
the police.
Questions around the numerical similarity score used to determine
matches also ought to be raised. We already know that facial
recognition data has racial bias: it is deeply flawed when
attempting to identify people with darker skin tones, just as
Getty Images is, and the Metropolitan police's own testing of its
facial recognition algorithm identified disproportionately higher
inaccuracy rates when attempting to identify people of colour and
women.
People of colour are already disproportionately stopped and
searched at higher rates, and the use of potentially flawed
technology will serve only to increase the rate at which ethnic
minorities are stopped, searched and possibly even incorrectly
detained, further dampening trust in the police among these
communities. We know that that needs to be resolved. To any
Member who thinks that I am exaggerating the potential for
misidentification, I say this: in 2023, Big Brother Watch found
that over 89% of UK police facial recognition alerts wrongly
identified members of the public as people of interest. In that
case, what benefits does this technology bring? It has been used
in the borough of Lambeth, including in my own constituency, on a
number of occasions, but as far as I am aware it has not produced
a substantial number of results. Our constituents are effectively
being placed under constant surveillance. The notion of their
presumed innocence, which sits at the heart of our justice
system, has been undermined, and this “cutting-edge” technology
has not produced substantial results.
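To put the 89% figure in concrete terms, the arithmetic looks like this; the total of 100 alerts is a hypothetical round number, since the raw totals were not given here:

```python
# Worked arithmetic on the 89% figure cited above.
# The total of 100 alerts is a hypothetical round number.
total_alerts = 100
false_alerts = 89                      # wrongly identified, per the figure
true_alerts = total_alerts - false_alerts
precision = true_alerts / total_alerts
print(f"precision = {precision:.0%}")  # 11%: roughly 1 correct alert in 9
```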
With some 6 million CCTV cameras in the UK, which all have the
potential to be converted into facial recognition cameras, we are
veering dangerously close to becoming a police state with levels
of surveillance that would be deemed acceptable only in the most
authoritarian of dictatorships. I believe that our liberty and
our security can co-exist. It is not a matter of “those who have
nothing to hide have nothing to fear”; it is a matter of the
basic principles of freedom and privacy. Those basic principles
begin to draw into question what such surveillance is really here
for. Is it here to keep us safe or to monitor us 24/7?
Most Members would undoubtedly, I hope, protest at the idea of
police randomly stopping members of the public to check their
fingerprints or other DNA against databases just for a possible
match. Why should we look at this intrusive automated biometric
software any differently?
10.05 am
Shockat Adam (Leicester South) (Ind)
It is a real pleasure to speak under your chairmanship, Dame
Siobhain. I congratulate the right hon. Member for Maldon (Sir
John Whittingdale) on securing this debate
and agree with him wholeheartedly that this issue should be
considered further in the main Chamber.
It is said that technology is a very useful servant but can be a
very dangerous master. Many colleagues have already made a robust
case for the use of this technology and undeniably it can be very
useful. However, I am extremely concerned and believe that we
must proceed with caution. In Leicester, some people already want
to use the technology, but we must ensure that there is
watertight legislation before we proceed any further.
Among my main concerns is the accuracy of the technology. We must
ask whether it is fit for purpose. A spokesperson from StopWatch,
a UK coalition of academics, lawyers and activists, has said
that
“there is very little evidence on the efficacy of LFR
deployments”.
In fact, in the first six months of this year, when this
technology was deployed, StopWatch found that on average it
stopped one person nearly every hour, or every 55 minutes, and
that a person was arrested every two hours because of it. The
data showed that, as the hon. Member for Clapham and Brixton Hill
(Bell Ribeiro-Addy) said, over 80% of those
arrests were unnecessary. The right hon. Member for Maldon said
that the police have polite conversations with people, but polite
conversations have a different meaning for different people.
Secondly, there is equality and non-discrimination. We already
know that a black person is four times more likely to be stopped
by this technology, as we are now. The technology has been shown
to exacerbate any racial profiling. In fact, it has been
demonstrated that it disproportionately misidentifies women,
people of colour and even disabled people. That is a real
concern.
Thirdly, as the majority of colleagues have already mentioned,
the technology is an attack on our civil liberties. Earlier this
year, the European Court of Human Rights ruled against Russia
after claims that it had used LFR technology to locate and arrest
a protester on the Moscow metro system. That is extremely
frightening. Similarly, China has been accused of perfecting a
version of facial recognition technology that can single out and track
Uyghurs—members of the repressed Muslim community in China.
We must acknowledge these concerns and ensure that, like the EU
with its Artificial Intelligence Act 2024, we have stringent
legislation in place before this technology becomes widely used
and turns into our master.
10.08 am
Siân Berry (Brighton Pavilion) (Green)
I thank every Member here for coming to this debate and I thank
the right hon. Member for Maldon (Sir John Whittingdale) for
securing it in the
first place.
I have worked on this issue for many years. In my previous job, I
attended and observed the first deployments of live facial
recognition by the Metropolitan police, which is many years ago
now. Since then, the gap between its increasing use and the lack
of a legislative basis has grown wider and wider. In that time,
many thousands of people have had their personal data captured
and used by the police when there was absolutely no reason for
that. Many people have been misidentified, but the accuracy issue
is not my main concern.
The unlegislated use of the technology is incredibly worrying. In
my previous job on the London Assembly, I asked the Met and the
Mayor of London many questions about that. I asked for watchlist
transparency, but I did not get it. I heard the initial
promises—“Oh, it will be very transparently used, we will
communicate it, and no one will have to walk past it without
knowing.” All those reassurances just faded away, because there
is no real scrutiny or legislation. We need to debate the subject
from first principles. As other Members have pointed out, we have
had proper debates about identity cards and fingerprint and DNA
data, but not about this extremely intrusive technology. It is
more concerning than other technologies because it can be used on
us without our knowledge. It really does engage our human rights
in profound ways.
For all those reasons, the use of facial recognition by the
police has been challenged by the Information Commissioner, the
Surveillance Camera Commissioner, the Biometrics Commissioner,
London Assembly members, of whom I was one, Senedd Members and
Members of Parliament here. The only detailed scrutiny of the
technology has resulted in calls for a halt to its use; I am
thinking of the Science, Innovation and Technology Committee. The
Justice and Home Affairs Committee has also called for primary
legislation. That is the absolutely key question. The EU has had
the debate and looked at the issue in detail, with the result
that, over there, the kind of deployment used so freely by UK
police is restricted to only the most serious cases of genuine public
safety. That absolutely needs to happen here.
The legislation needs to look not just at police use of the
technology, but private use. I have seen its use by private
companies in the privately owned public space in King's Cross.
Data from there has been shared with the police; the police
initially denied knowing anything about it and then later
apologised for that denial. If private companies are collecting
data and sharing it with the police, that needs to be
scrutinised. If private companies are using the technology, that
needs to be legislated for as well.
The hon. Lady is making an incredibly powerful speech. Is she
aware of the Big Brother Watch campaign to try to stop large
shops from capturing people's faces and flagging them as
shoplifters? Those people then get stopped in other places, but
they are not aware of that process.
Yes, I am aware of Big Brother Watch's excellent campaigning on
this issue. It has identified a serious breach of human rights.
There is the potential for a serious injustice if people are
denied access to their local shops based on a suspicion that has
put them on a watchlist that may or may not be accurate. There is
no oversight. We need to debate these things and legislate for
them.
I tabled a written question to the Minister about putting
regulation and legislation behind the police use of live facial
recognition. The answer stated that the technology is governed by
data protection and equality and human rights legislation, and
supplemented by specific police guidance. I do not believe that
police guidance is sufficient, given the enormous risks to human
rights. We need a debate on primary legislation. I hope that the
Minister will announce that that process will start soon and that
this unlawful grey area will not be invading our privacy for much
longer. This issue is urgent.
10.13 am
James McMurdock (South Basildon and East Thurrock) (Reform)
I appreciate that we are having this debate, because it is
surprising that we have got to where we are without legislation
and firm frameworks in place. I really like the phrase “first
principles”, and one of the first principles of the police is
“without fear or favour”. That is an exceptional phrase that, if
perfectly implemented, we would all benefit from, although of
course we recognise that in the real world there is no such thing
as perfect.
I am grateful that concerns have been raised about how the
technology we are discussing impacts the assumption of
innocence—we should all be very careful about that—although I
also appreciate the point that it does not impact innocence but
provides the opportunity for a human to check. If done properly,
that is no bad thing, but we are right to discuss the issue in
serious terms in our legislature because there is a danger of an
unofficial assumption of guilt. Let us take the example of local
shopping centres, which we heard about earlier. If an issue has
not been escalated to the police or courts, but some local
security officers have seen the same images on cameras and that
information has gone round by radio, a gentleman or a lady out
with their children doing the weekly shop may suddenly not be
able to get in and do what they need to do. That is the kind of
pervasive and damaging thing that could easily slip under the
radar; we should all be mindful of that.
I want to touch briefly on transparency. This is clearly a
developing technology and we would be wrong not to look at its
benefits, but we must be mindful of the harm it could do along
the way. If people find that they are getting an unfair crack of
the whip—that is probably an inappropriate term—and are suffering
as a result of this technology, we need to nip that in the bud,
and be very direct and open about the failures so that we can
make adjustments.
Is the hon. Gentleman aware that black men are eight times more
likely to be stopped and searched by the police than their white
counterparts, and 35 times more likely under section 60? This
technology accelerates the discrimination that is already in the
system.
Absolutely. Let me put it like this: if any of us were to turn up
at a social event and unexpectedly find a large swarm of police,
that would give us a moment's pause for thought. We need to be
careful to ensure that this technology is not a more pervasive
version of that example. It must not be constantly in existence,
attached to every CCTV camera, without us even being aware of
it.
To go back to transparency, we have to be open and frank about
any issues with how the technology is being implemented, so that
we can fix them. I agree that there absolutely could be issues,
and we definitely want to be on the right path.
Does the hon. Gentleman agree that this technology could further
alienate minority communities —as happened with the Muslim
community, which felt unfairly targeted by the Prevent
strategy—and could cause further division and mistrust of the
police?
This is all about the first principle of “without fear or
favour”. If there are any examples of where that is failing,
regardless of whether it relates to local behaviour or the
broader introduction of a new technology, we need to be open,
transparent and mindful. We live in a world in which not
everything is done perfectly, but there are some communities with
problems that are perhaps not being tackled in the most
beneficial way. I do not want to get too deeply into these
issues, because I am not an expert and I recognise that they are
extremely sensitive, but I think we can tackle them
transparently.
The hon. Member for Brent East (Dawn Butler) used the excellent analogy of
a night out. I completely agree; I was thinking, “Yeah, I'm up
for it, but let's just make sure we can all get home safe”, but
the more we discuss the issue, the more I think the appropriate
camp to be in is, “I could be tempted out, but let's make sure we
like the destination.” I will leave it there. I thank hon.
Members for their time.
10.18 am
Iqbal Mohamed (Dewsbury and Batley) (Ind)
It is a pleasure to serve under your chairship, Dame Siobhain. I
thank the right hon. Member for Maldon (Sir John Whittingdale)
for securing this
important debate.
I have researched this subject and listened to hon. Members'
contributions, and it has been frankly shocking to learn that LFR
has been in use since 2017 without any specific legislation in
place to control its use and protect our civil liberties. That is
seven years too many without legislation. Although I agree that
the use of real-time facial recognition in the United Kingdom
promises enhanced security and efficiency, it also raises
significant legal and moral concerns, and there are severe
adverse consequences for our society.
As a former software test manager, I am extremely concerned that
private companies that profit from their technology are allowed
to self-regulate and to confirm the efficacy of the products that
they sell, and that the police are guided by those companies in
how to use the tools and rely on the companies' reports of their
efficacy to take legal action against innocent civilians.
The technology operates by capturing and analysing highly
sensitive and personal biometric data. As has been mentioned, the
legal framework for its use is complex and at times insufficient.
The Data Protection Act 2018 and the General Data Protection
Regulation provide some safeguards, requiring data processing to
be fair, necessary and proportionate. However, the lack of
specific legislation for facial recognition technology leaves
huge room for misuse and overreach.
The deployment of this technology without explicit consent
undermines several of our fundamental rights, some of which have
been mentioned. The first is the right to privacy: constant
surveillance and the collection of biometric data without
explicit consent infringe an individual's privacy rights. This is
particularly concerning when the technology is used in public
spaces without people's knowledge. The second right is the right
to freedom of peaceful assembly and expression. The use of facial
recognition can deter individuals from participating in protests
or public gatherings due to the fear of being monitored or
identified. This undermines the fundamental right to assemble and
express opinions freely.
The third right is the right to non-discrimination. As has been
mentioned, facial recognition systems have been shown to have
higher error rates for people of colour, women and younger
individuals. This bias can lead to disproportionate targeting and
wrongful arrests, exacerbating existing inequalities and
discrimination. The final right is the right to data protection.
The collection, storage and processing of biometric data must
comply with data protection laws. Inadequate safeguards can lead
to unauthorised access and misuse of personal data.
My hon. Friend the Member for Leicester South (Shockat Adam)
cited examples of how this
technology is used in Russia and China, and we know that it is
used extensively in Israel as part of its apartheid regime and
occupation of the Palestinian people. Such violations highlight the
need for strict regulation and oversight to ensure that the
deployment of facial recognition technology does not infringe
fundamental human rights. The technology subjects individuals to
constant surveillance, often without their knowledge, eroding
trust in public institutions. The ethical principle of autonomy
is compromised when people are unaware that their biometric data
is being collected and analysed.
Let me cite some examples of the technology's inefficacy and
unreliability. In 2020, the Court of Appeal found that South
Wales police's use of facial recognition technology was unlawful,
and that the force had breached privacy rights and failed to
adequately assess the risks to individual freedoms. The
technology's accuracy is not infallible: misidentifications can
lead to miscarriages of justice, where innocent individuals are
wrongly accused or detained.
The disproportionate impact of FR technology on black people and
people of colour is particularly concerning. Research has
consistently shown that these systems are more likely to
misidentify individuals from those groups. For example, a
National Institute of Standards and Technology study—I do not
know how old it is—found that FR algorithms were up to 100 times
more likely to misidentify black and Asian faces than white
faces. This disparity not only undermines the technology's
reliability, but perpetuates systemic racism. In practice, this
means that black people and people of colour are more likely to
be subjected to unwanted surveillance and scrutiny, which can
lead to a range of negative outcomes.
There are other examples of miscarriages of justice and misuse.
In one instance, the Metropolitan police used FR technology at
the Notting Hill carnival, leading to the wrongful identification
and harassment of innocent individuals. These and the other
examples cited by hon. Members underscore the potential for
significant harm when this technology is deployed without
adequate safeguards.
In conclusion, although facial recognition technology offers
potential benefits, its deployment must be carefully regulated to
prevent misuse and protect individual rights. The legal framework
needs to be strengthened to ensure that the use of technology is
transparent, accountable and subject to rigorous oversight. We
must also address the inherent bias in these systems to prevent
further entrenchment of racial inequalities. As we navigate the
complexities of integrating new technologies into our society,
let us prioritise the protection of our fundamental rights and
ensure that advancements serve to enhance rather than undermine
our collective wellbeing.
10.26 am
Ayoub Khan (Birmingham Perry Barr) (Ind)
I thank the right hon. Member for Maldon (Sir John Whittingdale)
for securing this
important debate. It is difficult to follow the comprehensive
presentation of the hon. Member for Dewsbury and Batley (Iqbal
Mohamed), but I would like to come
at this debate from a different perspective—as a criminal
barrister by profession. In trials that I have conducted, we have
had difficulty with experts when identifying suspects charged in
very serious cases. Two experts in relevant IT facial recognition
software often find it difficult to come to the same conclusion.
One expert in a trial will say, “This is highly likely a match
for this particular defendant—confidence level maybe 50% or 60%.”
Another expert in the same trial will counterargue and say,
“Well, there are dissimilarities between the face and the image
that we have been able to capture.” Ultimately, it is a matter
for the jury as to whether they accept one expert's opinion over
another. As a result, at present, we have counterarguments
between experts over facial recognition technology.
What concerns me is the idea of allowing the state, in essence,
to deploy this kind of technology in high streets, for example.
The hon. Member for Brent East (Dawn Butler) has already raised the issue
of the disproportionate rate of stop and search—by multiple
times; I think the rate was nine times higher for black males.
What impact will facial recognition live transmission data have
in the city of Birmingham? It is going to have an enormous
impact. Members have raised the difficulties with the percentage
error of recognitions, and the distrust that we have in
Birmingham is a challenge already, particularly with young men
and the police. What will this technology achieve? Will young men
start wearing more face coverings in city centres? How will this
technology be used, even if it is legislated for properly? For
example, will the police have to notify the public, “We are using
this facial recognition technology in the Bullring today between
the hours of 10 am and 10 pm?” It does not seem to serve any real
purpose.
We have a very effective police force in the west midlands, and
it uses CCTV, which we have all over the city. If hon. Members go to any
street in Birmingham, they will find tens or hundreds of houses
with CCTV, and the police have used that to great effect; after a
crime is committed, they track back and they prosecute. We have
had so many successful prosecutions in very serious crimes, such
as murder and violent crime, but the deployment of this
technology will create enormous problems and divisions. As I
said, there are already problems with how minority communities
feel when they are stopped and searched. I think the right hon.
Member for Maldon said that in the trial about 10 people were
stopped, with one to two—as little as 10%—being identified. As
the technology develops further, that percentage may increase,
but at present I do not see how it will assist at all. Criminals
know very well how to avoid detection, and face coverings will
become the norm. Other than surveillance, this technology
achieves very little. I do not see how it will assist in
detection.
The hon. Member for Brent East drew some simple parallels. What
would the public think about being stopped on a busy high street
and asked to come to a police van to give their fingerprints and
DNA? They would be outraged, and rightly so. It would almost
legitimise police officers approaching people, in particular
young men. We know that not just black people, but people of
colour, women and children will be subject to the technology, and
we know that there are errors. The right to privacy and the
freedoms that we have are far greater than this technology, and I
do not see how it will assist in deterrence, because people will
simply use face coverings and all sorts of other things.
Dame Siobhain McDonagh (in the Chair)
I see no other Back Benchers who wish to contribute, so I call
the Liberal Democrat spokesperson.
10.32 am
Bobby Dean (Carshalton and Wallington) (LD)
Thank you, Dame Siobhain. I thank the right hon. Member for
Maldon (Sir John Whittingdale) for securing this
debate. It is shocking that this might be the first proper debate
we have had in this place on this topic.
We have discussed whether live facial recognition technology is a
legitimate tool and, if so, under what circumstances and controls
it should be used. It is clear from the debate that there are
many doubts, and we should probably be thinking about halting the
use of the technology until we have cleared them up.
I will start with the concern about discrimination, which was
articulated well by the hon. Member for Brent East (Dawn Butler).
It is clear that black people
and people from other communities are likely to be
disproportionately misidentified by this technology.
I want us to be careful that we are not making assumptions that
may not be right. I am not taking a firm position, but there have
been a number of comments, from several parts of the Chamber,
about racial disparity. It would be remiss of me to let those
things be said without making the point that I am not 100% sure
that they are all accurate. For example—
Dame Siobhain McDonagh (in the Chair)
Order. I apologise, but could the hon. Member please explain
briefly what his intervention is?
Of course. The topic of racial disparity is one we should all
treat extremely seriously—possibly one of the most serious things
we can do to benefit our society is to discuss this and get it
right—but can we please not make any leading assumptions? We live
in a fair and good society. If someone listened to this debate in
isolation, they might get an impression that I do not believe
would be strictly fair.
I thank the hon. Gentleman for his intervention, but the evidence
is quite clear in this area. Somebody might watch this debate and
have doubts, but the research is quite clear.
Further to the point made by the hon. Member for South Basildon
and East Thurrock (James McMurdock), just about every time
that somebody has stated that there are issues of racial
discrimination with this technology, they have cited sources that
people can look at. For the benefit of both the public and the
hon. Member, it is important to note that these are not just
assumptions; they are based on data and evidence. There is
further evidence we could give, such as my personal experience
and the experiences of others, but those specific points were
made with evidence.
I agree with the hon. Member that some evidence has been cited in
the Chamber today, but there is other evidence that we can look
at. Let us not forget that the technology exacerbates the known
problem—particularly with the Met police in London, where I
live—that black communities feel over-policed and underserved.
That has built up over time, and the use of this technology could
exacerbate that problem further.
The hon. Member for Leicester South (Shockat Adam) made a comment about how
polite conversations do not always register as polite
conversations. That is because of the persistence of those
conversations over time. A repeated polite conversation starts to
become an aggressive conversation to the person on the receiving
end, if it is that persistent. There was also discussion about
the findings of the National Physical Laboratory, but it is clear
that those findings are disputed—[Interruption.] Well, it is
clear that they are disputed; they have been disputed in the
Chamber today. Until we get to the bottom of that, we need to
think carefully about the controls that we have in relation to
discrimination.
I want to talk about the general principle of privacy. As a
liberal, I feel a general depression about how we have come to
devalue privacy in society, and how we trade it away far too
readily for other societal aims. We often hear the claim, “If
you're not doing anything wrong then there's nothing to worry
about,” as if the only value of privacy were to hide things that
someone might be doing wrong. That is not the case. Privacy
delivers so much more than that. It delivers personal wellbeing
and gives people control over their own data. It allows us to
have freedom of association and dignity. We need to think very
carefully before we so readily trade away the principle of
privacy in pursuit of other goals in society.
The opportunity for slippage has been discussed at length. One
would think that such technology would come with strict controls,
but it is clear that at the moment we have the opposite; in fact,
Big Brother Watch has described it as a “legal vacuum”. The hon.
Member for Brighton Pavilion (Siân Berry) talked about the creeping
expansion of its use in London. I have seen that myself; what
started off being limited to large-scale events, such as football
matches, has turned into routine trials on high streets, such as
mine in Sutton.
We have also seen expansion in the photos that are used. The
technology started off using only photographs of people known to
the police, for good reason, but it has been expanded to
potentially include everyone who has a passport or driving
licence photo. What started as being strictly about warrant breakers
and sex offenders could expand to be about pretty much anything
the Government of the day want. If we think about the clampdown
on protest under the previous Government, that potentially has a
chilling impact on the right to freedom of association.
With all of those doubts, it is clear that we need proper
parliamentary consideration of the issue. The Lib Dems ask the
Minister to immediately halt the roll-out of live facial
recognition technology until we get it right. It should be down
to this place to determine the correct controls and whether there
is a legitimate use of the technology at all, given all the
concerns about discrimination and privacy. Privacy is a
fundamental civil liberty. We have undervalued it far too much in
recent times. This is an opportunity to protect it, and we should
take it.
10.39 am
Chris Philp (Croydon South) (Con)
It is a pleasure, as always, to serve under your chairmanship,
Dame Siobhain. I congratulate my right hon. Friend the Member for
Maldon (Sir John Whittingdale) on securing the debate
and on the characteristically thoughtful manner in which he
approached his speech.
I think this is the first time that I have appeared opposite the
new Minister for Policing, Fire and Crime Prevention—the job that
I was doing until a few months ago—so let me congratulate her on
her appointment. Although I will of course hold the Government to
account, I will do everything I can to constructively support her
in making a great success of the job, and I really do wish her
well in the role.
I want to start by reminding colleagues of the way that live
facial recognition works. It is different from retrospective
facial recognition, which we have not debated today and, in the
interests of time, I do not propose to go into. As some Members
have already said, live facial recognition starts with a
watchlist of people who are wanted by the police. It is not the
case that anyone can get on that watchlist, which generally
comprises people who are wanted for criminal offences—often very
serious offences—people who have failed to attend court, and
people who are registered sex offenders, where the police want to
check that they are complying with their conditions. As people
walk down a high street, they are scanned, typically by a CCTV
camera on a mobile van, and then compared to the watchlist. The
vast majority of people are not on the watchlist, as we would
expect, and their image is immediately and automatically deleted.
Where a person is on the watchlist, the police will stop them and
ask if they have any form of identification.
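To make that matching step concrete, here is a minimal sketch of how such a pipeline might work, assuming an embedding-and-threshold design; it is illustrative only, not the system the police actually deploy. The point it captures is that a face that fails the threshold produces no stored record at all: the embedding is simply dropped.

```python
# Illustrative watchlist-matching sketch (hypothetical design, not the
# deployed system). Each face seen by the camera is converted to an
# embedding and compared against watchlist embeddings; anything below
# the match threshold is discarded immediately and never stored.
from dataclasses import dataclass

import numpy as np

MATCH_THRESHOLD = 0.6  # the setting referred to later in the debate


@dataclass
class WatchlistEntry:
    name: str
    embedding: np.ndarray  # produced offline from, e.g., a custody image


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_face(face_embedding: np.ndarray,
                 watchlist: list[WatchlistEntry]) -> WatchlistEntry | None:
    """Return the best watchlist match at or above threshold, else None.

    None means the passer-by is not on the watchlist: the embedding is
    simply dropped, so nothing about them is retained."""
    best, best_score = None, MATCH_THRESHOLD
    for entry in watchlist:
        score = cosine_similarity(face_embedding, entry.embedding)
        if score >= best_score:
            best, best_score = entry, score
    return best  # a match is only an alert; a human officer decides what happens next
```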
To be very clear, no one gets convicted on the basis of that
facial recognition match, so it is not overturning the
presumption of innocence, and if it turns out that the person
stopped is not the person on the watchlist, obviously they can
continue on their way. However, if they are the person on the
watchlist, a normal criminal investigation will follow, with the
normal standards of evidence.
On the point about the automatic deletion of data, there are many
examples, but the one I can remember is Google's incognito
browsing mode. That was meant to be very private—only the user saw
where they went—but Google was found to be storing that data, and
it has faced legal challenges for breaching the GDPR and other
privacy laws. Companies may say that things are immediately
deleted, but that is not always true.
That is a good point; we must ensure that the operating
procedures are adhered to, and I will come on to that a little
later. However, to be absolutely clear, if someone is identified
as a match, a normal criminal investigation is conducted to
normal criminal standards. Nobody is convicted on the basis of
this evidence alone—or, indeed, on the basis of this evidence at
all.
Let me come to the question about racial disparity. When this
technology was first introduced, about seven years ago, there
were reports—accurate reports—of racial bias in the way that the
algorithm operated. The algorithm has been developed a great deal
since those days, and it has been tested definitively by the
National Physical Laboratory, the nation's premier testing
laboratory. NPL testing is the gold standard, and this technology
was tested relatively recently.
For the benefit of Members, I will read out what the results of
that testing were:
“The NPL study found that, when used at the settings maintained
by the Met”—
that is the 0.6 setting that the hon. Member for Brent East
referred to earlier—
“there was no statistically significant difference in the facial
recognition technology's accuracy across”
different demographic groups. In other words, the technology as
it is being used today—not five years ago, when there were
issues—has been certified by the NPL and it has been found that
there is not any racial bias at the settings used.
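For those wondering what "no statistically significant difference" means in practice, the sketch below shows one standard way such a claim can be checked: comparing false-match counts across demographic groups with a chi-squared test. The counts here are invented for illustration, and this is the general statistical idea only, not the NPL's actual methodology.

```python
# Hedged sketch of a demographic-parity check on false-match rates
# (illustrative numbers; not the NPL's data or exact method).
from scipy.stats import chi2_contingency

# Hypothetical counts per demographic group: [false matches, correct non-matches]
counts = [
    [3, 49_997],  # group A
    [4, 49_996],  # group B
    [2, 49_998],  # group C
]

chi2, p_value, dof, _expected = chi2_contingency(counts)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.3f}")
# A large p-value (say, above 0.05) means no statistically significant
# difference in false-match rates is detected at these settings.
```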
But when we look at the numbers of people, something like 0.5% of
scans—I cannot remember the statistic—still result in somebody
being misidentified.
On the misidentification rate, I think the Bridges court case set
a standard of a false positive rate of one in 1,000: out of every
1,000 people stopped, 999 are the people the police think they
are, while one is misidentified. The Minister may have more
up-to-date figures, but from my recollection the system in
practice is running at about one in 6,000. That is an
extraordinarily high accuracy rate—much more accurate than a
regular stop and search.
About 25% to 30% of regular physical stops and searches, where a
police officer stops someone and searches them for drugs or a
knife or something, are successful. About 70% are unsuccessful,
while the equivalent figure for live facial recognition is 0.02%.
That means that this technology is 4,500 times less likely to
result in someone being inappropriately stopped than a regular
stop and search. It therefore hugely—by three orders of
magnitude—reduces the likelihood of someone being improperly
stopped and searched.
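Those multiples can be sanity-checked with simple arithmetic. The sketch below reproduces the comparison from the figures quoted above, noting that both inputs are recollections given in the debate rather than official statistics.

```python
# Back-of-envelope check of the stop-and-search comparison, using the
# figures quoted in the speech (recollections, not official statistics).
stop_search_unsuccessful = 0.75      # "about 25% to 30%" of physical stops succeed
lfr_false_positive_rate = 1 / 6_000  # the "one in 6,000" rate quoted for practice

ratio = stop_search_unsuccessful / lfr_false_positive_rate
print(f"roughly {ratio:,.0f}x less likely to produce a wrongful stop")
# -> 4,500x. Using 70% unsuccessful and the rounded 0.02% gives ~3,500x;
# either way, the gap is about three orders of magnitude.
```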
I turn to the use of the technology on the ground. I asked for it
to be trialled in the centre of Croydon, which is the borough I
represent in Parliament. Over the past nine months or so, it has
been deployed on a relatively regular basis: about once a week. I
believe that the Minister was supposed to go down this morning to
have a look; I certainly encourage her to go again as soon as she
can. By the way, the hon. Member for Birmingham Perry Barr asked
whether people know when
the technology is being used. The answer is yes: one of the
guidelines is that public signage must be displayed telling the
public that the technology is in use.
Over that period in Croydon, there have been approximately 200
arrests of people who would not otherwise have been arrested,
including for all kinds of offences such as class A drugs supply,
grievous bodily harm, fraud and domestic burglary. It has also
included a man who had been wanted for two rapes dating back to
2017. That wanted rapist would be free to this day if not for
this technology. Just a couple of weeks ago, a man was stopped
and subsequently arrested in relation to a rape allegation from
June this year. There are people who are alleged to have
committed rape who would not have been stopped—who would still be
walking free—if not for this technology. It is only the fact that
they walked past a camera outside East Croydon station or
somewhere that has meant they were stopped by the police. They
will now have a normal trial with the normal standards of
evidence, but they would not have been caught in the first place
if not for this technology.
I have held quite a lot of public meetings on this. I explain,
"These are the people who get caught, and the price the public
pay is that you might get scanned when you walk down Croydon High
Street, but if you are innocent your picture is immediately
deleted." By and large, the overwhelming majority of people in
Croydon think that is a reasonable trade-off.
There should be protections, of course. Several hon. Members,
including my right hon. Friend the Member for Maldon, have
rightly said that there should be guidelines, rules and
procedures. However, it is not true that there is a complete
vacuum as far as rules and regulations are concerned. The Bridges
case at the Court of Appeal in 2020 looked at how South Wales Police were
using the technology between 2017 and 2020. It found that some of
the ways they were using the technology were not appropriate
because they broke rules on things like data protection and privacy.
It set out in case law the guidelines that have to be adhered to
for the technology to be lawful—things like public signage, the
rate of accuracy and having no racial bias.
Secondly—I do hope I am not taking the Minister's entire
speech—there are guidelines for police. The College of Policing
has national authorised professional practice guidelines that the
police are supposed to stick to. There is a debate to be had
about whether, for the sake of clarity and democratic
accountability, we in Parliament should set something out more
formal; my right hon. Friend the Member for Maldon made that
point. I think there would be some merit in clarifying at a
national level where the guidelines sit, but I would not go as
far as Europe. If we had done so, those rapists would not have
been arrested. I would also be careful to ensure that any
legislation is flexible enough to accommodate changing
technology. Primary legislation may not be the right vehicle: a
regulation-making power might be a more sensible approach, so
that things can be kept up to date from time to time.
While we consider that, I strongly urge the Minister not to halt
the use of the technology. As we speak, it is arresting criminals
in Croydon and elsewhere who would not otherwise be caught. I
urge her to continue supporting the police to roll it out. I
think some money was allocated in the Budget for the current
financial year, to continue developing the technology. I would
welcome an update from the Minister on whether that money is
still being spent in the current financial year. I do hope it has
not somehow been snaffled by the Treasury in a misguided
cost-saving effort—
Dame Siobhain (in the Chair)
Order. I apologise for interrupting the shadow Secretary of
State, but I am looking at the time. I am sure hon. Members would
like to hear from the Minister.
None more so than me. I will conclude by saying that this is an
important technology: it takes people off the streets who would
otherwise not be caught. The Minister has my support in
continuing its roll-out and deployment.
10.49 am
The Minister for Policing, Fire and Crime Prevention (Dame )
It is a pleasure to serve under your chairmanship, Dame Siobhain.
I congratulate the right hon. Member for Maldon (Sir John Whittingdale) on securing this
important debate. I am grateful to him and all other right hon.
and hon. Members who have made thoughtful and insightful
contributions this morning.
I congratulate the right hon. Member for Croydon South on his new role. When he held
the role that I now hold, he was very passionate about this
subject. That passion is demonstrated today by the number of
interventions he has made and by his contribution in defence of
the previous Government's approach to this particular policy. Now
that we have seen the shadow Home Secretary in a Westminster Hall
debate on this issue, I very much hope that we might see him here
again when we debate the many other policing issues that we have
to deal with, including police reform and police
accountability—the list goes on.
This has been a very good debate. We have ranged from discussing
the Jason Bourne films to a night out with my hon. Friend the
Member for Brent East. We have also had excellent contributions
from the hon. Member for Strangford, my hon. Friend the Member
for Clapham and Brixton Hill, the hon. Members for Leicester
South, for Brighton Pavilion, for South Basildon and East
Thurrock, for Dewsbury and Batley, and for Birmingham Perry Barr,
and the Liberal Democrat Front-Bench spokesperson, the hon.
Member for Carshalton and Wallington.
I will deal with the complex issues that Members have set out so
eloquently. At the heart of the issue is the fact that we are
dealing with a powerful technology that has the potential to be
transformational for policing. However, some have very legitimate
concerns about it, including misidentification, misuse and the
effect on human rights and individual privacy. I agree
wholeheartedly that we need a proper, informed debate on the
subject, both in this House and with the public, and I am pleased
that we have had the opportunity to start that today.
Let me quickly run through the current use and benefits of live
facial recognition, which, as we have discussed, allows the
police to spot people in crowds. It uses live video footage of
crowds passing a camera and compares their images to a specific
watchlist of people wanted by the police. As well as Essex
police, who we have heard about, the Metropolitan police
and South Wales Police have
been using this technology for a number of years. In fact, as the
shadow Home Secretary said, I was due to go and see one of the
deployments this morning, but then this debate was scheduled, so
I am going to see it for myself this afternoon instead.
I am told by the Metropolitan police that between January and
November this year they made over 460 arrests as a result of live
facial recognition deployments, including for offences such as
rape, domestic abuse, knife crime and violent robbery. In
addition, over 45 registered sex offenders have been arrested for
breaching their conditions. South Wales
Police tell me that between January and November, they
deployed live facial recognition locally on 20 occasions,
resulting in 12 arrests. They also located a high-risk missing
young girl, who they were able to safeguard from child sexual
exploitation and criminal exploitation. Essex police, as the
right hon. Member for Maldon has attested, have also had
considerable success in their use of this technology.
The potential of live facial recognition to contribute to our
safer streets mission is clear. It could make our streets safer
for us all, particularly for women and girls, by helping the
police to identify wanted people quickly and accurately. It could
also save precious police time. Rapid advances in the technology
and improvements in the accuracy of algorithms increase that
potential.
Let me consider the concerns that have been raised. I was pleased
to hear that the right hon. Member for Maldon was impressed by
the strict limits that Essex police have put on their use of live
facial recognition, including the narrowly drawn watchlist and
the immediate deletion of images. However, I note
his worries about the lack of a specific legal framework for the
technology's use. It is therefore important to be clear that
facial recognition is covered by data protection, equality and
human rights law as well as common law powers and detailed
guidance from the College of Policing. However, the right hon.
Member is right that no one specific law gives the police the
power to use live facial recognition.
The Ada Lovelace Institute, an independent research institution
with a mission to ensure that data and AI work for people and
society, has written to the Home Secretary to express similar
concerns to those of the right hon. Member. It believes that the
only way to scale up those technologies safely and successfully
is through the introduction of a statutory regulatory framework.
I have spoken to senior police leaders about the matter, and some
believe that the lack of a specific legal framework inhibits
their use of the technology and dampens willingness to
innovate.
With legal challenges highly likely, it is not surprising that
some police forces are reluctant to use the technology. However,
others in policing are keen to emphasise the safeguards that are
already in place. For example, they assure me that the police do
not keep the biometric data of people filmed during live facial
recognition deployments, that watchlists are bespoke and that the
police deploy the technology only when there is an intelligence
case for doing so. I have also been assured that there will
always be a human being in the loop to decide whether to
apprehend someone. That would never be done solely on the basis
of a match made by a computer.
Privacy campaign groups have a long-standing interest in the
subject. I am aware of their concerns, as well as previous and
ongoing legal actions relating to police use of live facial
recognition technology. Potential bias in the algorithms used for
live facial recognition systems is another frequently raised
concern. Questions have been asked today about that very point
and whether live facial recognition discriminates against people
on the grounds of gender or race. I am also aware that 65 Members
of Parliament and peers signed an open letter last year that
called for a ban on live facial recognition, and that in January
the House of Lords Justice and Home Affairs Committee sent the
then Home Secretary a report raising concerns and making
recommendations about live facial recognition.
I remind Members that the Government have been in post for five
months. Let us put that in the context of the previous 14 years
of Conservative Administrations. The Government want to take time
to listen and to think carefully about the concerns that have
been raised and about how we can best enable the police to use
live facial recognition in a way that secures and maintains
public confidence.
As we have heard today, facial recognition technology is a
powerful tool. In considering its current and future use, we must
balance privacy concerns with the expectation that we place on
the police to keep our streets safe. We particularly need to
consider how much support the police may require from Government
and Parliament to set and manage the rules for using technologies
such as facial recognition. We must think about how we protect
the public from potential misuse of those technologies, and we
need to consider how the application of the rules and regulations
is scrutinised.
I am therefore committed to a programme of engagement in the
coming months to inform that thinking. Building on initial
conversations with police, I will hold a series of roundtables
with, for example, regulators and civil society groups before the
end of the year. I look forward to hearing at first hand from a
broad range of parties on the subject.
I am running out of time. I want to say much more on this issue,
and I want to confirm that money is being spent this year on the
roll-out of the live facial recognition vans that are being
equipped to carry out this work. There is a full evaluation of
that work going on. I very much look forward to the House having
further opportunities to debate the issue in the coming weeks and
months.