12.02.25

New duties herald a new age of online regulation in the UK and set the stage for a clash with US tech giants

2025 is set to be a busy year for providers of user-to-user (e.g. social media) and online search services in the UK.

The Online Safety Act 2023 (‘OSA’) – a new set of laws designed to protect children and adults online – received Royal Assent on 26 October 2023.  It is a significant piece of legislation that has enjoyed cross-party support, and it has been in the works since April 2019, when an ‘Online Harms White Paper’ first proposed a single regulatory framework to tackle online harms (leading to a draft Online Safety Bill in May 2021).

In 2024, various communication offences created by the OSA came into force (see our blog post here).  These largely relate to the conduct of individual online users.  In 2025, a number of new duties for service providers will come into force.  These require service providers to identify, mitigate and manage the risks of their services being used in illegal and harmful ways.  The new duties are aimed at increasing service providers’ responsibility for users’ safety on their platforms.

In this blog post, we outline which online services are regulated by the OSA and the new obligations to which they will be subject this year.

Which services are regulated by the OSA?

The UK Government estimates that around 25,000 businesses will be subject to the OSA, ranging from large social media platforms to smaller online marketplaces and dating apps.

The OSA regulates the following services:

  1. User-to-user services – these are internet services through which user-generated content may be encountered by other users. Many online platforms fall within this broad definition, including social media platforms (e.g. Facebook and X), messaging services (e.g. WhatsApp and Facebook Messenger), video-sharing services (e.g. TikTok and YouTube), marketplace and listing services (e.g. eBay and Amazon), and file-sharing services (e.g. Microsoft OneDrive and Google Drive).
  2. Search services – these are internet services that are, or include, a search engine (defined as a functionality that enables a person to search more than one website or database). This includes general search services such as Google and Microsoft Bing, as well as product-specific search services such as Skyscanner and Comparethemarket.
  3. Internet services that publish or display pornographic content.

A service will be regulated by the OSA if it has ‘links with the UK’ and is not exempt, even if the provider of the service is based overseas.  A service will have ‘links with the UK’ where it has a significant number of UK users, where the UK is a target market, or where it is capable of being accessed by UK users and there is a material risk of significant harm to such users.

What are the new duties of online service providers?

The new OSA obligations for in-scope service providers rely heavily on secondary legislation and codes of practice, and are being implemented in phases.  Ofcom (the regulator for online safety) has published a roadmap for the OSA’s phased implementation and its objectives for 2025 here.

Phase 1 – Illegal harms

On 16 December 2024, Ofcom published its first OSA code of practice and guidance, “firing the starting gun on the first set of duties for tech companies”.  All in-scope services are now required to conduct assessments of the risk of illegal content on their platforms by 16 March 2025.

Subject to Parliamentary approval of Ofcom’s codes of practice, from 17 March 2025, service providers will need to implement the recommended measures set out in Ofcom codes (or other effective measures to protect users from illegal content).

‘Illegal content’ is defined in the OSA as ‘content that amounts to a relevant offence’, which includes priority and non-priority offences.

Priority offences (the most serious offences) relate to (a) terrorism, (b) harassment, stalking, threats and abuse, (c) coercive and controlling behaviour, (d) hate offences (e.g. hatred based on race, religion, sexual orientation etc), (e) intimate image-based abuse (‘revenge porn’), (f) extreme pornography, (g) child sexual exploitation and abuse, (h) sexual exploitation of adults, (i) unlawful immigration, (j) human trafficking, (k) fraud and financial offences, (l) proceeds of crime, (m) assisting or encouraging suicide, (n) drugs and psychoactive substances, (o) weapons offences, (p) foreign interference, and (q) animal welfare.  User-to-user service providers will be required to take steps to prevent users from encountering content amounting to a priority offence, and search service providers will need to minimise the risk of users encountering such content.

Non-priority offences are existing offences under UK law that are not priority offences and that meet certain other criteria (for example, they do not include offences created by the OSA or offences concerning IP infringement).  The OSA requires all service providers to swiftly take down content which amounts to a non-priority offence.

Phase 2 – Child safety, pornography and the protection of women and girls

On 16 January 2025, Ofcom published its Age Assurance and Children's Access Statement under the OSA, alongside guidance for regulated services on implementing effective age verification measures.

As of 17 January 2025, publishers of pornographic material must implement effective age checks to prevent children from encountering pornography.

By 16 April 2025, service providers must assess whether their service is likely to be accessed by children. Ofcom is then due to publish its Protection of Children Codes of Practice and risk assessment guidance in April 2025.  Services likely to be accessed by children will be required to carry out a children’s risk assessment within three months (i.e. by July 2025).

In February 2025, Ofcom expects to publish draft guidance on online services’ role in protecting women and girls.  This will include recommended measures for service providers to tackle the abuse that women and girls disproportionately face online, in respect of illegal content such as harassment, stalking, controlling or coercive behaviour, extreme pornography, and revenge pornography.

Phase 3 – Categorisation and additional duties for categorised services

The final stage of implementation will focus on additional obligations for a small proportion of regulated services that will be designated Category 1, 2A or 2B services.

The highest-risk platforms (being the social media and pornography sites with the largest number of users) will be designated as Category 1 and will bear the highest duty of care.  Category 2A will contain the highest-risk search engines, such as Google and Bing, and Category 2B will contain the remaining high-risk and high-reach sites.

Based on thresholds approved by Parliament, Ofcom expects to publish a register setting out the categorised services in summer 2025 and will publish draft proposals for the additional duties of those services by early 2026.

Enforcement

Ofcom has made clear that it is ready to use its enforcement powers in the event of non-compliance with the OSA.  Ofcom has the power to fine companies up to the greater of £18m or 10% of qualifying worldwide revenue, and can apply for a court order to block an online platform in the UK.  Senior executives and managers could also face criminal prosecution for certain offences created by the OSA.

Comment

The OSA is a landmark piece of legislation for online service providers, particularly at a time when tech companies are pushing back on online censorship (and/or being placed under political pressure to do so).  X (formerly Twitter) led the way in this regard, following its acquisition by Elon Musk.  In January 2025, Meta announced a series of changes intended to “dramatically reduce the amount of censorship” and “restore free expression” on its platforms, such as removing independent fact-checkers.

Against this backdrop, one wonders how Messrs Musk and Zuckerberg (and President Trump for that matter) would respond to telephone number-sized fines being imposed on tech giants that had failed to comply with duties imposed on them in the UK.  If the letter of the law were enforced, it is not beyond the realms of possibility that certain online platforms would simply cease to operate (or be allowed to operate) in the UK – much like when Google threatened to pull out of the Australian market in 2021 over a new law making tech giants pay for news shared on their platforms.

It is hoped that the OSA will afford better protection to victims of online harms.  However, it remains to be seen whether the scheme will facilitate the UK Government’s objective of making the UK “the safest place in the world to be online”, at a time when steps taken to tame the wild west of the internet are actively being reversed elsewhere.

 


Legal Disclaimer

Articles are intended as an introduction to the topic and do not constitute legal advice.