Covid ‘infodemic’ – a clear message to Government that it must tackle online harms as a matter of urgency, say MPs
Online misinformation about Covid-19 was allowed to spread virulently across social media without the protections offered by legislation, promised by the Government 15 months ago. The Misinformation in the COVID-19 Infodemic report details evidence of a range of harms, from dangerous hoax treatments to conspiracy theories that led to attacks on 5G engineers.

The Online Harms White Paper, published in April 2019, proposed a duty of care on tech companies and an independent Online Harms Regulator, both key recommendations from the predecessor DCMS Committee. MPs voice new concerns that the delayed legislation will not address the harms caused by misinformation and disinformation – a serious omission that would ignore the lessons of the Covid crisis.

The report finds that tech companies use business models that disincentivise action against misinformation while affording bad actors opportunities to monetise misleading content. As a result, the public is reliant on the goodwill of tech companies, or the ‘bad press’ they attract, to compel them to act. The DCMS Committee calls for the Government to make a final decision on the appointment of the regulator now.

Julian Knight MP, Chair of the DCMS Committee, said:

“We are calling on the Government to name the Regulator now and get on with the ‘world-leading’ legislation on social media that we’ve long been promised.

“The proliferation of dangerous claims about Covid-19 has been unstoppable. The leaders of social media companies have failed to tackle the infodemic of misinformation. Evidence that tech companies were able to benefit from the monetisation of false information, and allowed others to do so, is shocking. We need robust regulation to hold these companies to account.
“The coronavirus crisis has demonstrated that without due weight of the law, social media companies have no incentive to consider a duty of care to those who use their services.”

ENDS

Evidence consistently emphasised a loss of trust in institutions, with the pandemic giving state actors including Russia, China and Iran the opportunity to spread false news, financial scams and malicious content. Others shared misleading or false information with well-meaning intentions. The Misinformation in the COVID-19 Infodemic report finds:
Key recommendations:

·The Government should publish draft legislation - in part or in full - alongside the full consultation response to the White Paper this autumn if a finalised Bill is not ready

·The Government should finalise the appointment of the Regulator now. Ofcom’s expedited work on misinformation in other areas of its remit in this time of crisis is noted as an argument in its favour
·Ministers should set out a comprehensive list of harms in scope for online harms legislation, rather than allowing companies to define the scope themselves or to set what they deem acceptable through their terms and conditions. The Regulator should instead have the power to judge where these policies are inadequate and to make recommendations against these harms accordingly

·The Government must empower the Regulator not only to ensure that tech companies enforce their own policies, community standards and terms of service, but also to ensure that those policies are themselves adequate in addressing the harms faced by society

·The Regulator should be empowered to hand out significant fines for non-compliance. It should also have the ability to disrupt the activities of businesses that do not comply and, ultimately, custodial sentences should be available as a sanction where required

·The Government, not the Regulator, should bring forward an evidence-led process to decide which harms are covered by legislation. Clearly differentiated expectations of tech companies for illegal content and ‘legal but harmful’ content should also be established

·The Government should urgently develop a voluntary code of practice to protect citizens from the harmful impacts of misinformation and disinformation prior to legislation

A full list of recommendations can be found in the report.

Impact on public health: MPs heard that the infodemic of misinformation about Covid led to people using harmful remedies such as ingesting disinfectant. A consultant anaesthetist at a leading London hospital described the impact of misinformation as huge, primarily as a result of late presentation by patients who did not want to come to hospital.
Conspiracy theories: Written evidence from BT reported thirty separate attempts to sabotage the UK's digital infrastructure in one month, with an estimated eighty attacks across sites operated by all four mobile networks, nineteen of them near critical infrastructure such as fire, police and ambulance stations. EE personnel and subcontractors faced 70 separate incidents, including "threats to kill and vehicles driven directly at staff".

Name the Online Harms Regulator: MPs are calling for the Government to name the Regulator against evidence of a slipping timeline for the introduction of online harms legislation. Assurances given in evidence by the Minister for Digital that legislation would appear alongside the final consultation response due this autumn were apparently contradicted by her answer to a parliamentary question, in which she indicated instead that legislation would follow in this session. In its initial consultation response, published in February 2020, the Government announced that it was "minded" to name Ofcom as the proposed new 'Online Harms Regulator'. Evidence presented suggests that the public's turn to public service broadcasting during the pandemic has been underpinned by Ofcom's robust regulation. MPs note Ofcom’s expedited work on misinformation in other areas of its remit in this time of crisis as an argument in favour of its appointment as Online Harms Regulator, and warn that any continued delay in naming an online harms regulator would bring into question how seriously the Government is taking this crucial policy area.

Social media companies: Representatives of Facebook, Google and Twitter were recalled to give evidence to the Online Harms and Disinformation Sub-committee because MPs were ‘dissatisfied’ with their answers. The report is critical of the Government’s intention for the "essence" of future online harms legislation to be constructed around holding social media companies to their own terms and conditions.
The content of companies’ policies, and how they were applied, are cited as areas where social media companies must improve to effectively tackle misinformation and disinformation, as well as other harms such as hate speech and threats of violence. In several instances these policies were found to be not fit for purpose, a fact seemingly acknowledged by the companies themselves. Tech companies are criticised for inconsistency and slowness in their responses. In one incident, for example, Google acted to demonetise a YouTube livestream of a London Real interview with David Icke and donated its share of the revenue from the broadcast to charity. However, it later confirmed that following a London Live broadcast of the same interview, found by Ofcom to be in breach of regulations, the hosts were allowed to keep revenue generated from Super Chats - a monetisation function allowing viewers to donate in exchange for a message appearing on the livestream.

Monetisation: Engagement with conspiracy theories and false news online was found to incentivise platforms to continue surfacing similar content from spreaders of misinformation, in theory encouraging users to continue using the platform, allowing more data to be collected and more adverts to be displayed. Companies also allowed spreaders of misinformation to monetise their content, to the benefit of both platform and publisher.

Tech company tools: The report acknowledges positive technological innovations introduced by social media companies to counteract misinformation, such as Facebook’s 'correct the record' tool and warning labels. However, the corrective tool overlooked the majority of people who might have been exposed to misinformation, while Twitter's labelling was found to be inconsistent. MPs expressed concern that social media companies were unable to explain why these shortcomings could not be addressed.
Government role: The Government is urged to publish its media literacy strategy alongside its response to the report, with evidence that measures such as digital citizenship and literacy education have a key role in mitigating the impact of misinformation. MPs question the Government's focus on fact-checking, which duplicates the work of other organisations with professional expertise in the area. The report notes that the Committee raised concerns with the Secretary of State in March about Government delays in establishing the Counter Disinformation Unit, despite the fact that false narratives had been spreading uncontrollably since January.
