After years of deliberation, the UK passed its long-awaited Online Safety Bill (the “OS Bill”). It imposes content moderation requirements on certain online platforms and service providers to address illegal and harmful content.

The OS Bill reflects a recent trend of scrutinising online platforms’ and service providers’ operations, particularly their interactions with children. For example, the UK ICO has made children’s privacy a top enforcement priority and, in April 2023, issued a £12.7m penalty against TikTok Inc for, inter alia, failing to use children’s personal data lawfully. Similar trends exist in the EU: the Digital Services Act, for example, requires businesses to monitor and regulate illicit materials on their platforms, and regulators, including the Irish DPC, have issued numerous high-profile fines for misuse of children’s personal data.

Who is covered?

Broadly, the OS Bill applies to any business that: (i) hosts user-generated content (“UGC”); (ii) facilitates private or public online interaction between users; (iii) provides search engines; or (iv) delivers any service which publishes pornographic content (collectively, “Covered Entities”).

There are some specific exceptions to the OS Bill’s application, most notably:

  1. services where the only UGC is email, SMS, or MMS messages;
  2. user-to-user services with limited functionality, such as those where users can only post comments or reviews on provider content;
  3. one-to-one live aural communications; and
  4. user-to-user services and search services used as internal business resources or tools.

Does it apply extraterritorially?

Yes, the OS Bill has broad extraterritorial scope and applies to any business, irrespective of where it is located, if its content is accessible to UK-based users. The Regulatory Policy Committee estimates that 25,100 platforms will fall within the OS Bill’s scope.

When does it come into force?

There is currently no set date. The OS Bill is expected to be implemented in phases over a two-year period, with the aim of becoming fully enforceable by mid-2025.

What are the key requirements?

The OS Bill is wide-ranging and highly granular, imposing different content moderation requirements, for child and adult users respectively, on different categories of Covered Entities.

Broadly, for child users, the OS Bill requires Covered Entities to:

  • remove illegal content quickly, or prevent it from being viewed in the first place. Illegal content is defined broadly and includes content that promotes or facilitates suicide or self-harm, controlling or coercive behaviour, hate crimes and terrorism;
  • prevent children from accessing harmful or age-inappropriate content, which (for some Covered Entities) includes fraudulent adverts;
  • enforce age limits on accounts through age-verification technology;
  • for large social media platforms, ensure that the risks and dangers posed to children are more transparent, including by publishing risk assessments; and
  • provide clear and accessible ways to report problems.

For adult users, the OS Bill requires Covered Entities to adopt measures to prevent their services being used for illegal activity, and to remove illegal content (which, for some Covered Entities, includes fraudulent adverts). Further, Category 1 Covered Entities (the largest and highest-risk) are also required to:

  • remove content that is banned by their own terms and conditions;
  • give adult users greater control over the content that they see, and who they engage with, on the platform or service; and
  • submit annual transparency reports to Ofcom.

While the OS Bill does not ban end-to-end encryption, or impose any specific encryption-related requirements, it does indirectly address its use. For example, the OS Bill creates an offence for senior managers who, in response to an information notice, produce only documents which are encrypted.

Who will enforce the OS Bill?

The OS Bill will be overseen and enforced by Ofcom, the UK communications regulator.

What are the penalties?

The OS Bill provides for a range of penalties, including:

  1. Fines: up to the higher of £18 million or 10% of global annual revenue;
  2. Criminal Action: Covered Entities, and their senior managers where demonstrably at fault, can be held criminally liable for failing to comply with information requests or certain enforcement notices; and
  3. Alienation: with the agreement of the courts, Ofcom can require Covered Entities’ ancillary service providers, such as advertisers and internet service providers, to stop working with the company at fault, thereby cutting it off from the UK market.

Ofcom has, in the past, demonstrated a willingness to impose significant penalties on businesses under its regulatory purview. For example, in 2017, the regulator fined British Telecom £42m over delays to its high-speed cable installations and, in 2018, it fined Royal Mail £50m for competition law violations. Businesses should anticipate the possibility of Ofcom taking similarly robust enforcement action in this area.

What can you do now?

Online platforms and service providers can begin by assessing whether they are a Covered Entity and, if so, take steps now to establish the internal processes and documentation needed to prepare for the OS Bill.

In particular, although the OS Bill will be implemented in phases, its illegal content requirements are expected to be enforced shortly after the law is implemented. Covered Entities may wish to start by considering what constitutes illegal content, how it will be identified, how it will be removed from the platform or service once identified, and how the platform or service will prevent such content from being generated in the first place. Covered Entities should also consider how they will document these processes. Ofcom expects to publish a range of guidance, including illegal content risk registers, shortly after its powers commence to assist with this process.

 


The cover art used in this blog post was generated by DALL-E.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Martha Hirst is an associate in Debevoise’s Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Allegra De Lorenzo is a trainee associate in the Debevoise London office.