OFCOM: Online age checks must be in force from tomorrow

Tech firms must introduce age checks to prevent children from accessing porn, self-harm, suicide and eating disorder content

Bluesky, Discord, Grindr, Reddit and X among latest firms to commit to age-gating, while Ofcom lines up targets for enforcement

Sites and apps where children spend most time must make their feeds safer
Sites and apps which allow harmful content must prevent children from accessing it from the end of this week, Ofcom has warned, as the deadline approaches for tech firms to comply with new rules. The changes mean that risky sites and apps – large and small – must use highly effective ‘age gating' methods to identify which users are children, and then prevent them from accessing pornography, as well as self-harm, suicide and eating disorder content.

Services which allow porn

Ahead of the 25 July deadline, change is already happening. Over the last month, the UK's biggest and most popular adult service providers – including Pornhub – plus thousands of smaller sites have committed to deploying age checks across their services. This means it will be harder for children in the UK to access online porn than in any other OECD country. Other online platforms have now announced they will deploy age assurance – including Bluesky, Discord, Grindr, Reddit and X. [1]

Ofcom is ready to enforce against any company which allows pornographic content and does not comply with age-check requirements by the deadline. Today we are extending our existing age assurance enforcement programme – previously focused on studio porn services – to cover all platforms that allow users to share pornographic material, whether they are dedicated adult sites or other services that include pornography. We will be actively checking compliance from 25 July and, should it be necessary, we expect to launch any investigations into individual services next week. These would add to 11 Ofcom investigations already in progress.

Age checks to shield children from other harms

Under our rules, sites that allow other forms of harmful content must also have highly effective age checks from 25 July. We are launching a new age assurance enforcement programme, building on work undertaken by our ‘small but risky' taskforce, to monitor the response from industry.
This will specifically target sites dedicated to the dissemination of harmful content, including self-harm and suicide, eating disorder or extreme violence/gore material.

Protecting children on the most popular sites and apps

As well as preventing children from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography, Ofcom's Codes also demand that online services act to protect children from dangerous stunts or challenges; misogynistic, violent, hateful or abusive material; and online bullying.

Even where sites and apps do not technically allow these types of harmful material under their terms of service, Ofcom's research shows that such content can be all too prevalent. In particular, we know that content recommended in personalised ‘for you' feeds represents children's main pathway to encountering these harms. Our Codes are clear, among other things, that algorithms must be tamed and configured for children so that the most harmful material is blocked.

To hold sites and apps to account, we are today launching an extensive monitoring and impact programme, primarily focused on the biggest platforms where children spend most time – including Facebook, Instagram, Roblox, Snap, TikTok and YouTube. This will include:
This activity is in addition to our ongoing action to enforce our illegal harms Codes, including measures to protect children from sexual abuse and exploitation online.

Majority of UK parents supportive of children's safety measures

New research[3] suggests that a majority of parents believe that the measures set out in Ofcom's Protection of Children Codes will improve the safety of children in the UK. Over seven in 10 (71%) feel that the measures overall will make a positive difference to children's safety online, while over three-quarters (77%) are optimistic that age checks specifically will keep children safer. Nine in 10 parents (90%) agree that it is important for tech firms to follow Ofcom's rules, but a significant minority (41%) are sceptical about whether tech firms will comply in practice.

Dame Melanie Dawes, Ofcom's Chief Executive, said: "Prioritising clicks and engagement over children's online safety will no longer be tolerated in the UK. Our message to tech firms is clear – comply with age checks and other protection measures set out in our Codes, or face the consequences of enforcement action from Ofcom."

End

Notes to editors: