6.05.25
Ofcom publishes final guidance on protecting children online
According to Ofcom, three in five teenagers have seen harmful content online within a four-week period. Children describe seeing violent content online as “unavoidable” and describe content relating to suicide, self-harm and eating disorders as “prolific”.
It is apt timing, then, that Ofcom has published its final guidance for online service providers on meeting their duties regarding children’s online safety under the Online Safety Act 2023 (‘the Act’). As explained in our blog post here, the Act established a regulatory framework for providers of online services (such as social media platforms, search engines, dating apps, and pornography sites) to address illegal content and protect users from online harms. The new obligations are being implemented in phases and rely heavily on secondary legislation and codes of practice issued by Ofcom.
What has Ofcom published?
On 24 April 2025, Ofcom published its Protection of Children Codes of Practice (‘the Codes’) and guidance in a policy statement comprising six volumes.
The Codes focus on user-to-user and search service providers preventing minors from encountering harmful content relating to suicide, self-harm, eating disorders and pornography. Service providers must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
The Codes (which are subject to Parliamentary approval) build on the rules already in place to protect all users from illegal harms. They include 40 practical measures for protecting children online, including:-
- Effective age checks. The highest-risk services must use “highly effective” age checks to identify which users are children. Children may be prevented from accessing the entire site or app, or just certain content. Age-checking mechanisms are a focus for Ofcom, which found, for instance, that the average age at which children first see online pornography is 13.
- Safer feeds. Providers operating a personalised recommendation system which poses a medium or high risk of harmful content must configure their algorithms to filter out harmful content from children’s feeds.
- Fast action. All sites and apps must have processes in place to review, assess and quickly tackle harmful content upon becoming aware of it.
- More choice and control. Providers are required to give children more control over their use of the site or app (for example, allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute accounts, and to disable comments on their posts).
- Support. Platforms must provide supportive information for children who may have encountered or searched for harmful content.
- Easy reporting and complaints. Children must find it straightforward to report content or complain, and providers should respond appropriately. Terms of service must be clear so that children can understand them.
- Strong governance. All services must have a named person accountable for children’s safety. A senior body should annually review the management of risk to children.
What happens next?
In-scope user-to-user and search service providers were required to complete their children’s access assessments by 16 April 2025. Where children are likely to access the service, those providers now have until 24 July 2025 to complete children’s risk assessments. Subject to the Codes receiving Parliamentary approval, from 25 July 2025, service providers will need to take the measures set out in the Codes (or use other effective measures to protect child users from harmful content).
Ofcom has announced that it will shortly publish a consultation on additional measures, including proposals to:-
- ban the accounts of people found to have shared child sexual abuse material;
- introduce crisis response protocols for emergency events;
- use hash matching to prevent the sharing of non-consensual intimate imagery and terrorist content;
- tackle illegal harms, including child sexual abuse material, through the use of AI;
- use highly effective age assurance to protect children from grooming; and
- set out the evidence surrounding livestreaming and make proposals to reduce its particular risks to children.
Ofcom has said it is “ready to take enforcement action if providers do not act promptly to address the risks to children on their services.” If providers fail to comply with their duties, Ofcom has the power to impose fines and, in very serious cases, apply for a court order to prevent the site or app from being available in the UK.
We expect that Ofcom will be willing to use its enforcement powers, particularly given that children’s online safety is currently a high-profile issue (for instance, following the success of the Netflix drama, Adolescence, about a 13-year-old boy radicalised by online misogyny).
Whilst there are concerns that ongoing trade deal discussions between the UK and US could result in the watering down of the Act, parliamentary under-secretary for online safety Baroness Jones of Whitchurch told MPs on 29 April 2025: “The Prime Minister has made it absolutely clear that the Online Safety Act is not up for negotiation”.
Legal Disclaimer
Articles are intended as an introduction to the topic and do not constitute legal advice.