UK Online Safety Act 2023: New regulatory framework to ensure online safety

The UK’s Online Safety Act 2023 (the Act) received Royal Assent on 26 October 2023, and the majority of the Act’s provisions came into force on 10 January 2024.1

The Act establishes a new regulatory framework to tackle illegal and harmful content online.2 It is intended to “make the UK the safest place in the world to be online”.3 The growth of social media platforms means that illegal and harmful content can reach, and harm, far more users than ever before. The Act affects not only the largest tech companies but also the many websites and apps that provide user-to-user services, such as discussion forums or chat rooms, or search services for other websites or databases.

For example, an organisation will provide a regulated user-to-user service if its website includes a chat function or discussion forum. Additionally, a website that allows users to comment substantively4 on another user’s comment will also be deemed a regulated user-to-user service.

The Act had a long journey to Royal Assent, slowed by repeated delays and several leadership changes within the governing Conservative Party. In addition, the controversial nature of the Act meant that it was subject to extensive scrutiny and debate in the House of Commons and the House of Lords.5 The Act seeks to strike a balance between protecting freedom of speech and ensuring online safety.

Key changes

The Act has created a new regulatory framework for online safety in the UK and has appointed the Office of Communications (Ofcom) as the regulator for online safety. Ofcom is required to provide guidance and set out codes of practice which describe recommended measures that providers of regulated services should take to comply with their duties under the Act.

Prior to the Act, most user-to-user and search services operating in the UK were not subject to any regulation concerning user safety.6 The previous regulatory framework for internet services was primarily set out in the EU’s Electronic Commerce Directive 2000, which was implemented in the UK by the Electronic Commerce (EC Directive) Regulations 2002. Liability for internet services under the Directive is limited. An information society service7 is not liable for any unlawful content it ‘hosts’ (i.e. stores) where the provider does not have knowledge of the content or is not aware of facts or circumstances from which it would have been apparent that the content was unlawful.8 If the provider does have such knowledge or awareness, it is not liable where it acts expeditiously to remove or disable access to the content.9 Additionally, the ‘mere conduit’ exemption provides that an information society service is not liable for information transmitted on its communication network if it (a) did not initiate the transmission, (b) did not select the receiver of the transmission, and (c) did not select or modify the information contained in the transmission.10

The Act greatly increases the responsibility of providers of regulated services for the content on their platforms.11 The Act imposes numerous duties on them to identify, mitigate and manage the risks of harm from illegal content and harmful content.

The Act also introduces various communication offences which apply to individuals and companies, such as false communications offences, threatening communications offences and offences of encouraging or assisting self-harm.

Which services are regulated by the Act?

Ofcom’s “initial analysis suggests more than 100,000 online [internet] services could be subject to the new rules” including “organisations ranging from very large and well-resourced companies to small and micro-businesses, in a wide range of sectors”.12 An ‘internet service’ is a “service that is made available by means of the internet”, including where the service is made available by means of a combination of the internet and an electronic communications service.13 The Act covers: (a) user-to-user services; (b) search services; and (c) internet services that publish or display pornographic content, which we have not considered further in this briefing.

User-to-user services

User-to-user services are internet services by means of which content generated directly on the service by a user, or uploaded to or shared on the service by a user, may be encountered by another user of the service.14 It does not matter whether content is in fact shared with other users; the service need only have a functionality that allows sharing.15 Additionally, it is irrelevant what proportion of the content on a service is user-generated content that may be encountered by another user.16

Content is defined widely as “anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.17

Schedule 1 sets out user-to-user services which are exempt from the Act, including services where the only user-generated content enabled by the service is emails, SMS/MMS messages or one-to-one live aural communications, as well as internal business services18 and services provided by public bodies. Additionally, the Act does not apply to services which are limited to enabling users to comment on and review provider content, or to services which enable users to communicate only by applying ‘like’ or ‘dislike’ buttons, emojis, voting and ratings to user content. If a user-to-user service includes a public search engine19, it is referred to as a ‘combined service’.20

Many large tech companies and online platforms fall under the definition of user-to-user services, including:

  • social media platforms, such as Facebook and X (formerly known as Twitter);
  • messaging services, such as WhatsApp and Facebook Messenger;
  • video-sharing services, such as TikTok and YouTube;
  • marketplaces and listing services, such as Amazon and eBay; and
  • file-sharing services, such as Microsoft OneDrive and Google Drive.

The wide definition of user-to-user services means that many other websites and apps will also be regulated by the Act. As noted above, an organisation will provide a regulated user-to-user service if its website includes a chat function or discussion forum, and a website that allows users to comment substantively21 on another user’s comment will also be deemed a regulated user-to-user service. Consequently, organisations should carefully consider any changes to their online services that would bring them within the scope of the Act.

Search services

A ‘search service’ is an internet service that is, or includes, a search engine.22 A ‘search engine’ includes a service or functionality that enables a person to search some or all websites or databases. It does not include a service which enables a person to search only one website or database.23

The Act’s definition of search service encompasses general search services which enable users to search the contents of the web, such as Google Search and Microsoft Bing. It also includes vertical search services which enable users to search for specific products or services offered by third party operators, such as Skyscanner and Comparethemarket.24

Links with the UK

User-to-user and search services are regulated by the Act where the services have links with the UK and are not exempt under Schedule 1 or 2.25 A service has links with the UK if:

  • the service has a significant number of UK users;
  • UK users form one of the target markets for the service, or are the sole target market; or
  • the service is capable of being used in the UK by individuals, and there are reasonable grounds to believe that there is a material risk of significant harm to such individuals presented by user-generated content on the service or by search content of the service.26

This means that the Act applies to international companies: it does not matter where a company is based as long as its services are accessible to UK users.27
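The three limbs are alternatives, so satisfying any one of them brings a service within scope. Purely by way of illustration, and on the assumption that judgement-laden concepts such as a ‘significant number’ of UK users and a ‘material risk of significant harm’ have already been assessed (the Act sets no numeric thresholds, and the field names below are ours, not the Act’s), the test can be sketched as follows:

```python
# Illustrative sketch of the section 4(5)-(6) "links with the UK" test.
# Field names are hypothetical; "significant number" and "material risk
# of significant harm" are legal judgements, modelled here as booleans
# assessed before the check is run.
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    significant_number_of_uk_users: bool      # limb (a)
    uk_is_a_target_market: bool               # limb (b)
    capable_of_use_in_uk: bool                # limb (c), first element
    material_risk_of_significant_harm: bool   # limb (c), second element

def has_links_with_uk(service: ServiceProfile) -> bool:
    """True if any one of the three alternative limbs is satisfied."""
    return (
        service.significant_number_of_uk_users
        or service.uk_is_a_target_market
        or (service.capable_of_use_in_uk
            and service.material_risk_of_significant_harm)
    )
```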

Who is a provider of a regulated service?

The ‘provider’ of a user-to-user or search service is the entity that has control over who can use the user-to-user part of the service or has control over the operations of the search engine.28 If no entity has such control, the provider is treated as being the individual or individuals who have such control.29

The provider of a combined service is the entity that has control over both (a) who can use the user-to-user part of the service, and (b) the operations of the search engine.30 Again, if no entity has such control, the provider is treated as being the individual or individuals who have such control.31

Duties of providers of regulated services

The Act imposes duties on providers of regulated services which differ based on the categorisation of the service. Providers of regulated services will be subject to additional duties if the service is categorised as a Category 1, 2A or 2B service (see below), and/or if the service is likely to be accessed by children (persons under the age of 18).32

Duties of care

Part 3 of the Act imposes the following duties of care on providers of regulated user-to-user and search services:

  1. Illegal content risk assessment duties. Providers have a duty to carry out a suitable and sufficient illegal content risk assessment.33 Providers also have a duty to take appropriate steps to keep the risk assessment up to date.
  2. Content reporting duties. Providers have a duty to operate the service using systems and processes that allow users and affected persons to report illegal content easily.34
  3. Duties regarding complaints procedures. Providers have a duty to operate a complaints procedure in relation to a service that: (a) allows for relevant kinds of complaint to be made; (b) provides for appropriate action to be taken by the provider of the service in response to complaints; and (c) is easy to access, easy to use and transparent.35
  4. Duties regarding freedom of expression and privacy. Providers must have regard to the importance of protecting users’ rights to freedom of expression and privacy when deciding on and implementing safety measures and policies.36
  5. Duties regarding record-keeping and review. Providers have various duties to make and keep written records.37

Providers of regulated user-to-user and search services are also subject to duties regarding illegal content. The duties are largely similar for both types of services. For example:

  • Providers must take proportionate measures to mitigate and manage the risk of harm to individuals effectively.
  • Providers must include provisions in their terms of service (for user-to-user services) or a publicly available statement (for search services) that specify how individuals are to be protected from illegal content. They must apply these provisions consistently, and the provisions must be clear and accessible.

However, there are some differences between the respective obligations of providers of user-to-user and search services:

  • Providers of regulated user-to-user services must take proportionate measures to prevent individuals from encountering priority illegal content38 by means of the service.39 However, providers of regulated search services must operate the service using proportionate systems and processes designed to minimise the risk of individuals encountering priority illegal content or other illegal content that the provider knows about.40 ‘Minimising the risk’ is less stringent than ‘preventing’ individuals from encountering illegal content.
  • Providers of regulated user-to-user services have duties to minimise the length of time for which any priority illegal content is present on the service, and swiftly to take down illegal content where the provider becomes aware of its presence.41

Given the wide scope of impacted services and the extent of the duties imposed by the Act, online platforms may be exposed to a variety of claims. For example, if a user of WhatsApp receives a message that contains priority illegal content, that user may bring a claim against WhatsApp for breach of contract (depending on the terms of service) or breach of statutory duty for failing to prevent the user from encountering such content.

Additional duties for Category 1, 2A and 2B services

Whether a service is categorised as a Category 1, 2A or 2B service depends on whether it meets threshold conditions which will be specified in secondary legislation. If Ofcom decides that a service meets the threshold conditions, it must add the service to the register of categorised services.42 Ofcom expects that “most of the 100,000 in-scope services will not be categorised”.43

Category 1 services will be subject to more onerous duties than Category 2A and 2B services. For example, Category 1 service providers will need to consider the importance of freedom of expression when moderating content of democratic importance and journalistic content.44 They will also need to offer adult users features that increase their control over the content they see.45

Additional duties for services likely to be accessed by children

Providers of regulated user-to-user and search services that are likely to be accessed by children have additional obligations.46 A service is “likely to be accessed by children” in three cases47:

  1. Where a children’s access assessment48 concludes that it is possible for children to access the service, or a part of it, and the child user condition is met in relation to the service or the part of the service that it is possible for children to access. The ‘child user condition’ is met if (a) there is a significant number (in proportion to the total number of UK users) of children who are users of the service or that part of it, or (b) the service, or that part of it, is of a kind likely to attract a significant number of users who are children.49
  2. Where the provider fails to carry out the first children’s access assessment required of providers of regulated user-to-user, search and combined services by section 36.50 In that case, the service will be treated as likely to be accessed by children from the date by which the assessment was required to have been completed.51
  3. Where Ofcom determines that a service should be treated as likely to be accessed by children.

Enforcement

Ofcom may issue fines of up to £18 million or 10% of worldwide annual revenue, whichever is higher, for breaches of the Act.52 Where two or more entities are jointly and severally liable for a penalty (for example, parent and subsidiary companies), the maximum fine is £18 million or 10% of the group’s worldwide annual revenue, whichever is higher.53 Ofcom also has the power to apply to the court for a service restriction order requiring an ancillary service to take steps that impede the provision of a regulated service that is in breach.54 This will allow Ofcom to enforce the Act against providers of regulated services who are not based in the UK.
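Purely as an illustration of how the penalty cap operates, the sketch below (ours, not Ofcom’s) treats worldwide annual revenue as a single known figure; in practice, Schedule 13 sets out the detailed rules for determining qualifying worldwide revenue.

```python
# Illustrative sketch only: the Act's penalty cap as a simple formula.
# Assumes qualifying worldwide revenue is a single known figure;
# Schedule 13 contains the detailed rules for calculating it.

FIXED_CAP_GBP = 18_000_000   # fixed element of the cap (£18 million)
REVENUE_SHARE = 0.10         # 10% of qualifying worldwide revenue

def maximum_penalty_gbp(worldwide_annual_revenue_gbp: float) -> float:
    """Return the greater of £18 million and 10% of worldwide revenue."""
    return max(FIXED_CAP_GBP, REVENUE_SHARE * worldwide_annual_revenue_gbp)

# A provider with £500m of revenue faces a cap of £50m, while one with
# £50m of revenue faces the £18m fixed cap.
assert maximum_penalty_gbp(500_000_000) == 50_000_000
assert maximum_penalty_gbp(50_000_000) == 18_000_000
```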

Ofcom also has powers to request information from (a) providers of regulated services, (b) a person who provides an ancillary service, (c) a person who provides an access facility, such as an app store, (d) a person who was a provider of a regulated service or provided an ancillary service or access facility at a time to which the required information relates, and (e) a person who appears to Ofcom to have, or be able to obtain, information required by Ofcom.55 It is an offence under section 109 to fail to comply with a requirement of an information notice and, under section 110, a senior manager may also commit such an offence.

Super-complaints

Eligible entities56 may make a complaint to Ofcom that any feature of a service or conduct of a provider of a regulated service presents a material risk of: (a) causing significant harm to users or members of the public; (b) significantly adversely affecting the right to freedom of expression of users or members of the public; or (c) having another significant adverse impact on users or members of the public.57

A complaint that relates to a single service or provider of a regulated service will only be admissible if Ofcom considers that the complaint is of particular importance or relates to a particularly large number of users or members of the public.58

Ofcom’s plans for implementing the Act

Ofcom published updated details of its approach to implementing the Act on the day that the Act received Royal Assent.59 Ofcom will provide this guidance in the following three phases:

  • Illegal harms duties – Ofcom published for consultation draft codes of practice and guidance on illegal harms duties on 9 November 2023.60 Ofcom plans to publish a statement on its final decisions in Autumn 2024.
  • Child safety, pornography and the protection of women and girls – Ofcom published for consultation its draft guidance on age assurance and other duties on 5 December 2023.61 Ofcom will publish for consultation draft codes of practice relating to protection of children in Spring 2024. Ofcom expects to publish draft guidance on protecting women and girls by Spring 2025.
  • Transparency, user empowerment and other duties on categorised services – This phase will concern the additional requirements that fall on regulated services that are designated as Category 1, 2A or 2B services. Ofcom plans to publish a consultation on draft transparency guidance in mid-2024. Ofcom plans to advise the government on the threshold conditions in early 2024.62 The Secretary of State will then specify the threshold conditions in secondary legislation. If the secondary legislation is published by Summer 2024, Ofcom intends to publish a register of categorised services by the end of 2024.63

Next steps

The Act impacts a wide range of online services, from the world’s largest tech companies, such as Amazon and Facebook, to other websites and apps which provide a discussion forum or chat room, or a facility to search other websites or databases.

Providers of regulated services should:

  1. familiarise themselves with the changes introduced by the Act;
  2. review their platform design for risks and harmful content;
  3. identify and protect vulnerable users, such as children;
  4. assess how users make reports or complaints;
  5. review and test the platform’s safety measures;
  6. keep up to date with Ofcom’s guidance and codes of practice;
  7. appoint a person responsible for online safety; and
  8. ensure that employees know how to keep users safe on the platform.64

Users should also familiarise themselves with the Act’s changes as they may be able to bring claims against platforms like WhatsApp for breach of contract or breach of statutory duty if they encounter illegal content.

Ofcom is currently consulting on its codes of practice and guidance on illegal harms duties. The consultation is open for responses by email to IHconsultation@ofcom.org.uk until 17:00 on Friday 23 February 2024. Further details can be found on the consultation webpage.

Ofcom is also consulting on its draft guidance on age assurance and other duties. The consultation is open for responses by email to Part5Guidance@ofcom.org.uk until 17:00 on Tuesday 5 March 2024. Further details can be found on the consultation webpage.

Footnotes

  1. The Online Safety Act 2023 (Commencement No. 2) Regulations 2023, regulation 2
  2. 210285en.pdf (parliament.uk)
  3. UK children and adults to be safer online as world-leading bill becomes law - GOV.UK (www.gov.uk)
  4. “Substantively” means that the user can write a comment on another user’s comment as opposed to merely expressing a view on a comment by applying a ‘like’ or ‘dislike’ button, emojis, yes/no voting or rating.
  5. UK children and adults to be safer online as world-leading bill becomes law - GOV.UK (www.gov.uk)
  6. 210285en.pdf (parliament.uk)
  7. An “information society service” is “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”
  8. Electronic Commerce Directive, Article 14(1)(a)
  9. Electronic Commerce Directive, Article 14(1)(b)
  10. Electronic Commerce Directive, Article 12(1)
  11. How the UK’s Online Safety Bill could transform the internet | World Economic Forum (weforum.org)
  12. Ofcom’s approach to implementing the Online Safety Act
  13. Online Safety Act 2023, section 228(1)-(2)
  14. Online Safety Act 2023, section 3(1)
  15. Online Safety Act 2023, section 3(2)(a)
  16. Online Safety Act 2023, section 3(2)(b)
  17. Online Safety Act 2023, section 236(1)
  18. The exemptions for internal business services are detailed in Schedule 1, paragraphs 7 and 8. Schedule 1, paragraph 7(2) sets out the conditions for a user-to-user or search service to be an internal business service.
  19. ‘Public search engine’ is defined by section 4(7) and Schedule 1, paragraph 7(2).
  20. Online Safety Act 2023, section 4(7)
  21. “Substantively” is defined in footnote 4.
  22. Online Safety Act 2023, section 3(4)
  23. Online Safety Act 2023, section 229
  24. Volume 1 (ofcom.org.uk)
  25. Online Safety Act 2023, section 4(2)
  26. Online Safety Act 2023, sections 4(5)-(6)
  27. A guide to the Online Safety Bill - GOV.UK (www.gov.uk)
  28. Online Safety Act 2023, sections 226(2) and (4)
  29. Online Safety Act 2023, sections 226(3) and (5)
  30. Online Safety Act 2023, section 226(6)
  31. Online Safety Act 2023, section 226(7)
  32. Online Safety Act 2023, section 236(1)
  33. Online Safety Act 2023, sections 9 and 26
  34. Online Safety Act 2023, sections 20 and 31
  35. Online Safety Act 2023, sections 21 and 32
  36. Online Safety Act 2023, sections 22 and 33
  37. Online Safety Act 2023, sections 23 and 34
  38. ‘Priority illegal content’ is defined in section 59(7) as: (a) terrorism content; (b) CSEA (child sexual exploitation and abuse offences) content; and (c) content that amounts to an offence specified in Schedule 7. Schedule 7 lists numerous priority offences, such as assisting suicide, human trafficking and proceeds of crime.
  39. Online Safety Act 2023, section 10(2)(a)
  40. Online Safety Act 2023, section 27(3)
  41. Online Safety Act 2023, section 10(3)
  42. Online Safety Act 2023, section 95
  43. Ofcom’s approach to implementing the Online Safety Act
  44. Online Safety Act 2023, sections 17 and 19
  45. Online Safety Act 2023, section 15(2)
  46. The additional duties for services likely to be accessed by children are set out in sections 11-13 and 28-30.
  47. Online Safety Act 2023, section 37
  48. According to section 35(1), a “children’s access assessment” is an assessment of a user-to-user service, search service or combined service - (a) to determine whether it is possible for children to access the service or a part of the service, and (b) if it is possible for children to access the service or a part of the service, to determine whether the child user condition is met in relation to the service or a part of the service.
  49. Online Safety Act 2023, section 35(3) and (4)(a)
  50. Schedule 3 sets out the time at which the provider must carry out the first children’s access assessment.
  51. Online Safety Act 2023, section 37(5)
  52. Online Safety Act 2023, section 143 and Schedule 13, paragraph 4(1)
  53. Online Safety Act 2023, Schedule 13, paragraphs 5(1) and (3)
  54. Online Safety Act 2023, section 144. A service is an ‘ancillary service’ in relation to a regulated service if the service facilitates the provision of the regulated service (or part of it) directly or indirectly, or displays or promotes content relating to the regulated service (or to part of it): section 144(11).
  55. Online Safety Act 2023, section 100(5). A facility is an ‘access facility’ in relation to a regulated service if the person who provides the facility is able to withdraw, adapt or manipulate it in such a way as to impede access (by means of that facility) to the regulated service (or to part of it) by United Kingdom users of that service: section 146(10).
  56. The criteria to be classified as an ‘eligible entity’ will be set out in regulations made by the Secretary of State: section 169(3).
  57. Online Safety Act 2023, section 169(1)
  58. Online Safety Act 2023, section 169(2)
  59. Ofcom’s approach to implementing the Online Safety Act - Ofcom
  60. Consultation: Protecting people from illegal harms online - Ofcom
  61. Consultation: Guidance for service providers publishing pornographic content - Ofcom
  62. Ofcom’s approach to implementing the Online Safety Act - Ofcom
  63. Ofcom’s approach to implementing the Online Safety Act
  64. Safer platform checklist: practical steps to protect your users from online harms - GOV.UK (www.gov.uk)

Ruth Stillabower, Trainee Solicitor, assisted in the preparation of this briefing.

