TikTok, X, Google and Meta answer questions on online safety
TikTok, X, Google and Meta have each written to the Science, Innovation and Technology Committee following their appearance in front of the committee in February. The four technology companies appeared as part of the committee's inquiry into social media, misinformation and harmful algorithms. Following this, the committee's chair, Chi Onwurah, wrote to each company to ask a series of follow-up questions. Today, the committee publishes the four responses from the technology companies and provides a breakdown of their replies.

Meta

Regarding reports about leaked internal Facebook guidance that gave examples of offensive content – including racist statements – that could be allowed on the platform under its new policies, Meta's letter conceded that its recent Hateful Conduct Policy changes “could be interpreted” to allow such claims.

Meta confirmed that there was “no immediate plan” to end the Third-Party Fact-Checking Program or roll out Community Notes in the UK – changes to its approach to misleading content that were announced in the US earlier this year. It said it would consider its legal and regulatory obligations, including under the Online Safety Act, before making any such changes in the UK. Despite the committee's question, Meta did not give significant detail on research or risk assessments it has carried out on the impact of any such policy in the UK, stating: “There is no single process that we adopt for making appropriate changes to the Community Standards.”

X

X's reply offered some clarity on how the algorithm for the main feed and its Community Notes work, and how they differ. It explained that Community Notes do not work like many engagement-based ranking systems, instead using a “bridging algorithm”: for a note to be shown on a post, it needs to be found helpful by people who have tended to disagree with each other in their past ratings (see the sketch after this section).

X declined to give information on whether advertisers paused activity on X during the summer riots, describing this as “sensitive business information” – despite X's representative citing a “pause on advertising” in that period when speaking to the committee in February. The reply also said that, as of September 2024, only 1,275 people at X worked in content moderation globally. This follows highly publicised job cuts at the company.

Google

In its response, Google stated that it had demonetised the site Channel3Now – a website that posted false information about the Southport killings, claiming that the attacker was an asylum seeker who came to the UK on a boat – on 31 July, two days after the information appeared on the site. It did not answer the committee's questions about how much Google earned from ads on Channel3Now, or how much Channel3Now earned through Google, in this period.

The letter did not respond to the committee's request for Google's reflections on how its advertising and monetisation systems may have contributed to the 2024 unrest – reflections Google had told the committee it had undertaken. Similarly, the letter did not answer the committee's question of how much money Google-owned YouTube made from ads placed on eating disorder content. Google did provide some detail on the measures it takes to prevent ads being displayed next to harmful content.

TikTok

TikTok declined to share some details of its recommender algorithm and trust and safety measures requested by the committee, citing commercial confidentiality and the risk that malicious actors could use this information.
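X's letter describes the bridging idea only at a high level. The publicly documented Community Notes scoring code approaches it as a matrix factorisation: each rater and each note is given a latent “viewpoint” factor, and a note's intercept term – the agreement left over once viewpoint alignment between raters and notes is accounted for – acts as its helpfulness score. The sketch below is a minimal, illustrative reconstruction of that idea under those assumptions; the toy data, hyperparameters and function name are invented for the example and are not X's production code.

```python
import numpy as np

rng = np.random.default_rng(0)

def bridging_scores(R, n_factors=1, epochs=300, lr=0.05, reg=0.02):
    """Fit rating ~ mu + b_u + b_n + f_u . f_n by gradient descent.

    The note intercept b_n is the 'bridging' helpfulness signal: it
    captures agreement NOT explained by rater/note viewpoint alignment,
    i.e. a note found helpful by raters who usually disagree.
    """
    n_users, n_notes = R.shape
    mu = np.nanmean(R)                    # global mean rating
    b_u = np.zeros(n_users)               # rater intercepts
    b_n = np.zeros(n_notes)               # note intercepts (the scores)
    f_u = 0.1 * rng.standard_normal((n_users, n_factors))
    f_n = 0.1 * rng.standard_normal((n_notes, n_factors))
    obs = np.argwhere(~np.isnan(R))       # observed (rater, note) pairs
    for _ in range(epochs):
        for u, n in obs:
            err = R[u, n] - (mu + b_u[u] + b_n[n] + f_u[u] @ f_n[n])
            b_u[u] += lr * (err - reg * b_u[u])
            b_n[n] += lr * (err - reg * b_n[n])
            # update both factor vectors from their pre-update values
            f_u[u], f_n[n] = (f_u[u] + lr * (err * f_n[n] - reg * f_u[u]),
                              f_n[n] + lr * (err * f_u[u] - reg * f_n[n]))
    return b_n  # higher intercept => helpful across differing raters

# Toy example: two camps of raters (rows) who disagree on notes 1 and 2,
# but both rate note 0 helpful (1 = helpful, 0 = not helpful).
R = np.array([[1, 1, 0],
              [1, 1, 0],
              [1, 0, 1],
              [1, 0, 1]], dtype=float)
print(bridging_scores(R))  # note 0 should receive the highest intercept
```

In this toy run, the factor term absorbs the camp split on notes 1 and 2, so only note 0 – the one endorsed across both camps – earns a high intercept. That is the property the committee's letter paraphrases: engagement alone is not enough; cross-viewpoint agreement is what gets a note shown.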
Chair of the Science, Innovation and Technology Committee, Chi Onwurah MP, said:

“Google, TikTok, X and Meta have provided us with follow-up information related to their February appearance before the committee, giving us some more insight into how they moderated harmful content on their sites during last summer's riots. This unrest, fuelled by online misinformation, showed the urgent need for tech companies to do their bit to keep UK citizens safe.

“With this in mind, I was pleased to see Meta point to the Online Safety Act as a factor it'll consider in deciding whether to roll out changes in the UK to weaken its approach to content moderation, which it's already done in the US – an example of how legislation can influence the behaviour of big tech companies.

“I was also glad to hear how X's bridging algorithm works for Community Notes and how it's intended to build trust by boosting notes supported by those with differing viewpoints. It raises the question of why X doesn't use a system like this, which the company states identifies content that is “healthier and higher quality”, more widely across the platform.

“However, these letters don't answer all of the committee's questions. The tech companies declined to give full clarity on the monetisation of false and harmful content, advertiser activity and the algorithms used to recommend content. In our February evidence session, these four companies all agreed that they have a responsibility to be transparent and accountable before Parliament. Have they forgotten this, only two months later?”

/ENDS