Mr Speaker
We now come to the Select Committee statement. Julian Knight will
speak for up to 10 minutes, during which no interventions may be
taken. At the conclusion of his statement, I will call Members to
put questions on the subject of the statement and call Julian
Knight to respond to those questions
in turn. Members can expect to be called only once. Interventions
should be questions and should be brief. Front Benchers may take
part, but I remind everybody that we have a very important debate
to follow and I hope that people will try to ensure that we get
on to that debate as early as possible. I now call the Chair of
the Digital, Culture, Media and Sport Committee—Julian
Knight.
11.50 am
Julian Knight (Solihull) (Con)
Thank you, Mr Speaker. With your words of endorsement ringing in
my ears, I will ensure that I am as brief as the subject can
allow.
I am grateful to have been granted this statement to discuss the
DCMS Committee’s report on the draft Online Safety Bill. This is
an important piece of legislation that, if done right, will
prevent a tremendous amount of harm to so many in our society.
The ultimate aim for all of us involved in the production of the
Bill is to make user-to-user and search service providers more
accountable for decisions they make when designing their
platforms and the systems and processes that govern them. The
Committee I chair has a crucial role in ensuring that that is the
ultimate outcome of this work. While I welcome large parts of the
Bill’s content in draft form, there are some elements that do
need work so that we do not miss the opportunity to make the
internet a safer space for all, while protecting freedom of
expression.
One such area of particular concern to the Committee is that the
Bill in its current form lacks clarity both on what falls within
the parameters of illegal content and on its treatment of legal but
nevertheless terribly harmful content. For example, the Committee
was alarmed to hear in evidence so many examples of online abuse
towards women and girls that would not be adequately covered by
the Bill in its current form. We are all aware of frankly
appalling images being shared online without the consent of those
pictured, some of whom are underage. Many of these would be
covered by the Bill, but not all.
Furthermore, the internet is awash with images that are often
edited to cause harm and are clearly not within the scope of the
Bill. My Committee’s report seeks to tackle this. We also have
concerns about the less immediately obvious examples of abuse
such as breadcrumbing—leading someone on virtually with a series
of digital breadcrumbs on the way to illegal and harmful
material. In such instances, the context of these communications
is key. Some examples of online abuse that we have heard in our
investigations are insidious—inch by inch, step by step, allowing
people, often children and teenagers, to be lured in. In such
instances, no one message, picture or like is technically
illegal, but they none the less form part of a sequence of online
child sexual exploitation and abuse. The Bill can and must stop
this. For this reason, we propose reframing the definition of
“illegal content” to include context.
The Committee was truly shocked by the repeated examples of
cyber-flashing and deliberate manipulation of images such as
tech-enabled nudifying of women and deepfake pornography, which
currently go unchecked. The deliberate manipulation of images to
circumnavigate content moderators is egregious in its own right.
It is also a key hallmark of potential child exploitation. This
Bill, if crafted correctly, can and must protect children from
such acts and such tactics. In its current form, it does not
adequately cover these examples of truly harmful content. As
such, we propose that they should be included in the Bill and
covered by the duties of care in it.
Another area that many Members are rightly deeply concerned by is
the many examples of inherently harmful activity that are not
illegal. We support the Joint Committee's view on harmful
actions such as cyber-flashing, and the targeting of people with
photosensitive epilepsy by trolls who send malicious flashing
images with the deliberate intent of triggering a seizure: these
offences, in every sense that we would understand the term, must
be included in the Bill.
Finally, I come to the issue of scrutiny. The current provisions
in the Bill to provide Ofcom with a suite of powers to address
such actions are unclear and impractical. We urge the Government
to bake in best practice by providing greater clarity in the Bill
on when and how these powers should be used to ensure that they
are both practical and proportionate. We recommend that there
should be compliance officers in the social media companies, paid
for by those companies, baking in that best practice. That will,
hopefully, also lead to the ending, or at least reduction, of
unwarranted take-downs.
The present situation is deeply unsatisfactory. Effectively,
social media companies are editors-in-chief of the content on
their sites. There is no outside say, and no transparency. They act
according to their terms and conditions, which they decide. That
can lead—and has led in the past—to unwarranted take-downs, and
the people who suffer those take-downs then have to appeal to the
social media companies. This is not right. It is against freedom
of speech. We need proper systems so that transparency and
know-how on the ground can ensure that any such issues of
take-down are set against clear parameters. That can, I believe,
be regulated in the same way as financial services are
effectively regulated—through a strong compliance regime.
We specifically recommend that the Government reframe the
language relating to freedom-of-expression considerations to
incorporate a “must balance” test, to enable Ofcom, and the
compliance officers whose introduction we propose, to assess
whether providers have duly balanced their freedom-of-expression
obligations with their decision making, thereby preventing
unjustified take-downs of material.
Our Committee has made clear that it strongly disagrees with the
recommendation of the now defunct Joint Committee—which did
amazing work in this area—that a permanent Joint Committee be
established as
“a solution to the lack of transparency and…oversight”.
We disagree with that proposal for a range of reasons, but not
least because it would set a precedent which could be written
into any other Bill and could then effectively circumnavigate the
Select Committee system. I think the Select Committee system is
the jewel in the crown of this House, and I say that not just
because I have a personal interest in it. This, I think, is
something we can do ourselves. If there is a need for
pre-legislative scrutiny, Select Committees should be able to
deal with it, but in any event the Government are free to set up
a framework of pre-legislative scrutiny which may be on a one-off
or ad hoc basis. That has happened before after a period of time
in the case of other Acts that have passed through this
place.
I welcome wholeheartedly the aims of this Bill and much of its
content. I hope and expect the Department to be in listening
mode—I know that the Minister personally is absolutely committed
to that—so that we can all work together to ensure that the aim
and the reality of the Bill are aligned, and we can make the
internet a safer and a better place that is more in tune with
what I would describe as the health of our society.
(Pontypridd) (Lab)
I thank the hon. Member for Solihull (Julian Knight) and the other members of the
Committee for their hard work in delivering this important
report. Having previously been a member of the Committee, I am
all too well aware of the challenges to online safety,
particularly in the context of defining or contextualising what
constitutes a “harm”. The Labour party has long called for
tougher penalties for tech companies which fail to comply with
their responsibilities to users; a change in culture is clearly
urgently required for those companies, which have been left
unaccountable for far too long.
The report has also highlighted a number of issues, or omissions,
in the Government’s current draft Bill, and I am keen to hear the
hon. Gentleman’s thoughts on those. First, the report recommends
that providers should have designated compliance officers to
ensure good governance. This is not the first time that that
recommendation has been made, but the proposal has been
discounted until now. Does the hon. Gentleman agree that the
Government have been too slow in pushing social media companies
to act?
Secondly, the Secretary of State, in her evidence to the Joint
Committee assessing the draft Online Safety Bill, referred to
legal advice that she had received, including advice on a
foundational duty of care. Does the hon. Gentleman agree that it
is vital for the Government to publish that legal advice ahead of
the response to the DCMS Committee’s report, so that their reply
can be understood in the context of the advice that they have
received? I am sure the hon. Gentleman will agree—especially
given events that have unfolded in relation to other matters this
week—that it is simply not acceptable for the Government to
conceal important advice from the public domain.
Julian Knight
I thank the hon. Member for her kind question, but also for her
acknowledgment of the ongoing work of the Select Committee, on
which she played a fantastic role during her time with us.
The hon. Member references compliance officers, and the key, of
course, is to make the regime pre-emptive rather than reactive. I
think that actually helps freedom of expression, basically
because if we in effect have this baked into the system, there is
less chance of take-downs as a result.
When it comes to social media companies and the Government’s
interaction with them, there is an idea that the Government have
in effect run scared of social media and the huge lobby. These
are the new masters of the universe—the new oil companies, the
new banking institutions—and they have huge and enormous powers.
I think it is therefore incumbent on the Government to draw from
every part of this House in order to come up with a framework
that can best bring them in to be good citizens in our society. I
am hopeful of a time when the leaders of those companies are not
given a special welcome in putting their views, but are in that
regard treated in much the same way as Members in this place. I do
concur to some degree with the hon.
Member, but every Government in the world is also facing this
huge issue.
On publishing legal advice, I do believe wholeheartedly in
complete transparency. I think that part of the process of being
cross-party and getting this Bill right actually should be
absolute transparency when it comes to such matters.
(Kenilworth and Southam)
(Con)
I congratulate my hon. Friend and his entire Committee on this
report into what he correctly describes in the report as a very
“complex” Bill. Given its complexity, does he agree with me that
it is very important that the Government response both to his
Committee’s report and, indeed, to the report of the Joint
Committee on the Draft Online Safety Bill is not just
substantive, but timely and reaches all of us well in advance of
Second Reading of the Bill, so that we can all consider properly
the Government’s responses?
Julian Knight
I thank my right hon. and learned Friend, and I do concur in that
respect. We have waited a very long time for this Bill, and we
have to get it right. I think we have waited too long for the
Bill, but that is the past—that is done. What we cannot do now is
rush things to such an extent that we cannot take everyone’s
views on board, and therefore I would concur. Basically, this has
to be a structure that survives, potentially for decades to come,
and is built on as we see challenges going forward, so I concur
with my right hon. and learned Friend.
(Edinburgh South West)
(SNP)
I thank the hon. Gentleman for his statement. I am very glad to
hear him acknowledge the importance of protecting freedom of
expression, but there is also the issue of anti-discrimination
law. On a number of occasions in this Chamber, I have raised the
problem that Twitter’s hateful conduct policy and its moderation
policy often discriminate against women by taking down women’s
tweets when they state biological facts and failing to take down
abusive and violent tweets directed at women. The reason for that
is that Twitter does not have sex as a protected
characteristic in its hateful conduct policy. This was raised by
the Joint Committee on Human Rights in a report a couple of years
ago, in which we recommended that Twitter should include sex as a
protected characteristic in its hateful conduct policy.
From my inquiries, it seems that Twitter thinks it is above the
domestic law of the United Kingdom when it comes to
anti-discrimination law, and it seems to be praying in aid a
loophole in the Equality Act 2010. I am not sure it is right
about that legally, but does the hon. Gentleman agree with me
that, if there is a loophole in the Equality Act that is letting
Twitter off the hook when it comes to our anti-discrimination
law, the Online Safety Bill would be a good opportunity to close
that loophole, so that Twitter and other service providers are
all subject to the anti-discrimination law of the United
Kingdom?
Julian Knight
I thank the hon. and learned Lady for her comments, and I have a
great deal of sympathy for what she says. I am well aware that
she receives unwarranted and vile abuse at times for expressing
her views, and I think that is abhorrent in many respects. It
highlights in many regards the point I made earlier about the
social media companies being their own editors-in-chief and
effectively having their own content policies. That will be the
case going forward, but there needs to be oversight of those policies so
that they are compliant with the new law as it stands. One of our
recommendations is:
“We have proposed several amendments to the definition and scope
of harms covered by the regime that would bring the Bill into
line with the UK’s obligations to freedom of expression under
international human rights law.”
I hope that that recommendation would cover many of the aspects
to which the hon. and learned Lady is referring.
(Wellingborough) (Con)
A few years ago, on social media, there was a picture of my young
son being beheaded in an ISIS-type scenario. It was not really my
son in the picture, but the image represented my son. The
excellent Chairman of the Select Committee is right to say how
powerful Select Committees are. Would anything that the
Government are doing in the Bill have prevented that picture from
being put online, or have helped us find out who did that?
Julian Knight
I have heard of this before from my hon. Friend. I am grateful
for the opportunity to express my deepest sympathy, shock and
anger at the vile, disgusting behaviour that he and his family
faced. The short answer to his question is: yes, if the Bill is
got right. That picture is a type of deepfake. The harassment
aspect is illegal; a case would have to be built around the
harassment aspect, so he would almost have to take this offline,
rather than deal with it as an online matter. The way to deal
with it online would be by baking in resources such as compliance
officers, and by writing it into the Bill that posting and
manipulating an image that is meant to do harm should be
considered an online harm, and therefore something for which
social media companies could be called to account. If the Bill is
crafted correctly, the egregious and disgusting use of vile
images of that kind would, I hope, be curtailed.
(Cardiff West) (Lab)
I thank my hon. Friend, the Chair of the Select Committee. I
confirm that all Members across the Committee are in firm
agreement with the recommendations in the report. Does he believe
that the Government should take particular note of recommendation
19 on designated compliance officers, and recommendation 28,
which says that the Government should scrap plans to introduce a
permanent joint committee to oversee online safety and digital
regulation? The latter idea seems to have come out of nowhere;
perhaps it was written on the back of a fag packet or came from a
weekend tweet—I do not know. Should the Government not abandon
that daft idea, and recognise that it is the proper duty of the
Select Committee to undertake that scrutiny?
Julian Knight
For me, one of the attractions of compliance officers is that the
idea is based on the regime we have for financial services, which
has been one of the most successfully regulated industries,
certainly over the past 15 years since the financial crash. The
role of the compliance officer has been key to that. One good
thing about the proposal is that it is the social media companies
that would pay. Whenever social media companies see any form of
potential illegality, they push it to arm’s length; they push it
to the police, and expect the police to pick up the pieces. The
police do not have the resources to chase these things down, so
only exemplars get pulled up by the police. The companies should
be responsible, and should pay for their own policing.
Of course I agree with the point about recommendation 28. I would
like to think that the debate on that has shifted over time. The
Secretary of State was obviously expressing a genuine view. I
completely understand that view, and why it was expressed at that
juncture. However, the Joint Committee on the Draft Online Safety
Bill has perhaps run away with the suggestion a little bit, and
in so doing, has perhaps encroached on the good governance of
this place.
(Strangford) (DUP)
I thank the hon. Gentleman for his answers. Recent reports have
stated that the draft Online Safety Bill is neither clear enough
nor robust enough to tackle some forms of illegal and harmful
content. Responsibility for some of the most serious forms of
child sexual exploitation may be evaded. Will the Chair of the
Committee provide reassurances that tackling all forms of
illegal, harmful and exploitative content will be prioritised in
the Bill, so that we can protect young children, and many others
who are vulnerable?
Julian Knight
The hon. Gentleman is absolutely correct to highlight that point.
There is an issue about content that is deliberately manipulated
in order to avoid moderation. Effectively, it is content that
just manages to evade the algorithms, but is there as a signpost
to abuse, or is a means of taking people off one platform and on
to another that is not a tier-1 platform and that may be less
regulated. It is crucial that that is clamped down on as soon as
possible, so that we can protect children in the way that he and
I—and, I am sure, all Members of the House—wish to do.
The Parliamentary Under-Secretary of State for Digital, Culture,
Media and Sport
Let me start by putting on record my thanks and the Government’s
thanks for the work that the Select Committee has done. We are
grateful for the time and attention that its members have given
to this important issue.
There is no question but that large social media firms have not
been prioritising safety and preventing harm, even in relation to
children. They have been prioritising profit instead of people,
and the time has come for Parliament to act. The legislation we
have tabled is groundbreaking; we will be one of the first
countries, if not the first country, in the world to take such a
step. The measures in the Bill, even as drafted, are very strong,
with fines of up to 10% of global revenue capable of being
levied, and personal liability for some senior executives in
certain circumstances.
I thank the Select Committee Chairman for his comments about
freedom of expression, which are of course important. There are
duties in the Bill as drafted requiring social media firms to
have regard to freedom of expression and, particularly, to
protect journalistic and democratic content. We are interested in
exploring with the Select Committee how we can go further in
those areas, and I look forward to appearing before it in a week
or two.
Let me finish by saying that we are very much in listening mode;
we have been digesting the reports of the Select Committee and
the Joint Committee very carefully. It is our intention to bring
forward an updated Bill in this Session so that it can have its
Second Reading. In preparing that updated Bill, we will continue
to work closely with the Committees and to listen carefully to
the views of Members of this House, including those expressed in
the session today and in the debate we had a week or two ago.
There is a great deal of wisdom on both sides of the House that
we can learn from, and it is our intention to do that as we bring
forward this groundbreaking piece of legislation designed to
protect our fellow citizens but particularly children.
Julian Knight
I thank the Minister for his comments. He is very engaged in the
process and shows due respect to the Joint Committee and the
Select Committee both in terms of our work and through his
engagement. That is very welcome and is a reminder of times
past.
I welcome the Government’s listening mode. The message from both
sides of the House must be that we can all contribute as much as
possible and that this should not be about party lines. This
legislation is too important to get bogged down in issues such as
that, because it is about the protection of our society, our
democracy, our children and our mental health.