- Expert groups, children’s charities and free speech advocates
called on to help define who can make “super-complaints” about
online safety issues to Ofcom
- consultation will help determine the criteria for eligibility
and the procedure for super-complaints
- comes as part of the Online Safety Act, which will introduce new powers for the regulator to keep the internet safe and protect free speech
Children’s charities, free speech advocates and other groups
could, for the first time, raise online safety and freedom of
expression concerns directly to Ofcom through a
“super-complaint” under a proposal unveiled by the government
today (Thursday 16 November).
Under the Online Safety Act, social media companies have been
given new duties to protect children, enforce the promises they
make to users and remove illegal content, or they will face huge
fines from Ofcom.
Individuals will report harmful and illegal content to social media platforms, and it is the platforms’ responsibility to tackle this within the law.
The super-complaints process is designed to help Ofcom stay on top of systemic
harmful trends and emerging threats by letting organisations,
such as charities and consumer groups, raise new concerns as soon
as they emerge. These organisations will play an essential role in keeping the internet safe by ensuring that Ofcom is made aware of issues quickly and reliably, so it can take action in its new role as regulator of online platforms.
In a consultation published today, the government is seeking
views from expert groups to help define who can make
super-complaints, the conditions and format of a super-complaint,
and expectations on how Ofcom should respond to each
complaint.
For example, a super-complaint could notify Ofcom of a new social media
feature used on multiple services that subjects children to
harmful content, such as violent or pornographic images, or flag
that platforms are consistently failing to take down illegal
content they have a duty to remove.
Similarly, a campaign group could flag that a social media platform’s content moderation systems are consistently removing legal content that its terms of service do not prohibit, undermining freedom of expression on the platform.
Michelle Donelan, Secretary of State for Science, Innovation and Technology, said:
The Online Safety Act makes the UK the safest place in the world
to be online, but we need to be sure Ofcom is ready to respond to
any emerging online safety issues as soon as they arise.
The super-complaints process will allow organisations to make Ofcom aware of new challenges quickly and efficiently, making sure the ambition and promise of the Online Safety Act can keep pace with evolving trends, protecting people online for decades to come.
The detail of how the super-complaints process will work will be informed by the consultation launched today, which asks for views on how complaints can be managed swiftly and easily.
Only organisations that meet the criteria established through this consultation will be able to raise super-complaints. This will ensure the regulator is made aware only of legitimate, considered complaints and can deal with them in a timely and effective manner. The consultation will also establish how Ofcom deals with those complaints.
Super-complaints may lead the regulator to act on the issues raised using its new powers under the Online Safety Act. This could include updating Codes of Practice or investigating whether a particular service is complying with the new law, with non-compliant services facing fines that could run to billions of pounds.
Super-complaints processes already exist in other sectors. For example, the Competition and Markets Authority has a similar process that allows designated consumer bodies to request that the regulator investigate markets or market practices they believe are significantly harming the interests of consumers.
Gill Whitehead, Ofcom’s Online Safety Group
Director, said:
Protecting children and protecting free speech are key pillars of
the UK’s groundbreaking new online safety laws. Campaigners’
voices have helped lay the foundations, and we want to continue
hearing from them as we build a safer life online.
We’ve assembled a world-class team so we can keep a close eye
on issues as they emerge, and we’ve already set out our first
blueprint for what tech firms need to do to tackle illegal harms.
But we won’t be doing this alone, and we’re looking forward to
working with a broad coalition of experts.
The Online Safety Act, which received Royal Assent on Thursday 26
October, makes Britain the safest place in the world to live and
work online. It protects children from online harm, empowers
adults to exercise greater control over what they see on social
media, and places legal responsibility on tech companies to
prevent and swiftly remove illegal content.
Today’s super-complaints consultation launch follows the publication of Ofcom’s own major consultation last week, which seeks to establish how tech firms will protect their users from illegal harms online.