
Online Safety Act 2023

UK law to regulate online content

From Wikipedia, the free encyclopedia

The Online Safety Act 2023[1][2][3] (c. 50) is an Act of the Parliament of the United Kingdom to regulate online content. It was passed on 26 October 2023 and gives the relevant Secretary of State the power to designate, suppress, and record a wide range of online content that the United Kingdom deems illegal or harmful to children.[4][5]

The Act creates a new duty of care for online platforms, requiring them to take action against illegal content, or legal content that could be harmful to children where children are likely to access it. Platforms failing this duty would be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher. It also empowers Ofcom to block access to particular websites. However, it obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content such as user comments on political parties and issues.
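The maximum penalty described above is simply the greater of the two figures. As a minimal illustrative sketch (the function name and example turnover are hypothetical, not taken from the Act):

```python
# Maximum fine under the Act: the greater of £18 million
# or 10% of the platform's annual turnover.
def max_fine(annual_turnover_gbp: float) -> float:
    return max(18_000_000, 0.10 * annual_turnover_gbp)

# A platform with £1 billion annual turnover faces a cap of £100 million;
# a platform with £10 million turnover still faces the £18 million cap.
print(max_fine(1_000_000_000))  # 100000000.0
```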

The Act also requires platforms, including end-to-end encrypted messengers, to scan for child pornography, which experts say is not possible to implement without undermining users' privacy.[6] The government has said it does not intend to enforce this provision of the Act until it becomes "technically feasible" to do so.[7] The Act also obliges technology platforms to introduce systems that will allow users to better filter out the harmful content they do not want to see.[8][9]

The legislation has drawn criticism both within the UK and overseas from politicians, academics, journalists and human rights organisations, who say that it poses a threat to the right to privacy and freedom of speech and expression.[10][11][12] Supporters of the Act say it is necessary for child protection. The National Crime Agency says the legislation will "safeguard children" from online harms.[13] The National Society for the Prevention of Cruelty to Children (NSPCC) says the Act's passage was "a momentous day for children" that will help prevent abuse.[14] The Wikimedia Foundation and Wikimedia UK have said they will not implement age verification or identity checks, and in 2023 asked lawmakers to exempt public interest platforms from the Act's scope.[15][16] In August 2025, the Wikimedia Foundation lost a challenge to the Act in the High Court.[17]

Provisions

Scope

Within the scope of the Act is any "user-to-user service". This is defined as an Internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be read, viewed, heard or otherwise experienced ("encountered") by another user, or other users. Content includes written material or messages, oral communications, photographs, videos, visual images, music and data of any description.[18]

The duty of care applies globally to services with a significant number of United Kingdom users, or which target UK users, or those which are capable of being used in the United Kingdom where there are reasonable grounds to believe that there is a material risk of significant harm.[18] The idea of a duty of care for Internet intermediaries was first proposed in Thompson (2016)[19] and made popular in the UK by the work of Woods and Perrin (2019).[20]

Duties

The duty of care in the Act refers to a number of specific duties to all services within scope:[18]

  • The illegal content risk assessment duty  
  • The illegal content duties
  • The duty about rights to freedom of expression and privacy
  • The duties about reporting and redress
  • The record-keeping and review duties

For services "likely to be accessed by children", adopting the same scope as the Age Appropriate Design Code, two additional duties are imposed:[18]

  • The children's risk assessment duties
  • The duties to protect children's online safety

For category 1 services, which will be defined in secondary legislation but are limited to the largest global platforms, there are four further new duties:[18]

  • The adults' risk assessment duties
  • The duties to protect adults’ online safety
  • The duties to protect content of democratic importance
  • The duties to protect journalistic content

Enforcement

The Act empowers Ofcom, the national communications regulator, to block access to particular user-to-user services or search engines from the United Kingdom, including through interventions by internet access providers and app stores. The regulator can also impose, through "service restriction orders", requirements on ancillary services which facilitate the provision of the regulated services.[21][22][23]

Section 92 of the Act gives as examples (i) services which enable funds to be transferred, (ii) search engines which generate search results displaying or promoting content, and (iii) services which facilitate the display of advertising on a regulated service (for example, an ad server or an ad network). Ofcom must apply to a court for both Access Restriction and Service Restriction Orders.[18]

Section 44 of the Act also gives the Secretary of State the power to direct Ofcom to modify a draft code of practice for online safety if deemed necessary for reasons of public policy, national security, or public safety. Ofcom must comply with the direction and submit a revised draft to the Secretary of State. The Secretary of State may give Ofcom further directions to modify the draft, and once satisfied, must lay the modified draft before Parliament. Additionally, the Secretary of State can remove or obscure information before laying the review statement before Parliament.[24]

The Act contains provisions allowing eligible entities to bring super-complaints on behalf of consumers.[25] The process for doing so was set out in regulations in July 2025.[26]

Limitations

The Act has provisions to impose legal requirements ensuring that content removals do not arbitrarily remove or infringe access to what it defines as journalistic content.[21] Large social networks are required to protect "democratically important" content, such as user-submitted posts supporting or opposing particular political parties or policies.[27] The government stated that news publishers' own websites, as well as reader comments on such websites, are not within the intended scope of the law.[21][23]

Age verification

Section 12 of the Act states that service providers have a duty to prevent children from seeing "primary priority content that is harmful to children". This includes pornographic images, and content that encourages, promotes, or provides instructions for eating disorders, self-harm, or suicide. The Act says that service providers must use age verification or age estimation technology to prevent users from accessing primary priority content unless they are appropriately aged: the provision applies to all services that allow categories of primary priority content to be made available, including social networks and internet pornography websites.[1][28]

Other provisions

The Act adds two new offences to the Sexual Offences Act 2003: sending images of a person's genitals (cyberflashing),[29] or sharing or threatening to share intimate images.[30] The first conviction for cyberflashing under the new law occurred in March 2024 following a guilty plea.[31][32]

The Act also updates and extends a number of existing communication offences. The false communications offence contained in section 179 replaces the offence previously found in s127(2)(a) and (b) of the Communications Act 2003.[33][34] (The existing s127(1) offence remains in force.)[35] After the 2024 Southport stabbings and ensuing riots, several individuals were prosecuted for knowingly spreading "fake news". For example, Dimitrie Stoica was jailed for three months for falsely claiming in a TikTok livestream that he was "running for his life" from rioters in Derby.[36]

Section 181 creates an offence of sending a message (via electronic or non-electronic means) that "conveys a threat of death or serious harm".[37] This can be tried summarily or on indictment.[38]

Section 183 creates an offence of "sending or showing flashing images electronically" if it is "reasonably foreseeable that an individual with epilepsy would be among the individuals who would view it", the sender intends to cause that person harm, and they have "no reasonable excuse".[39][40] This is intended to prevent "epilepsy trolling".[41]

Section 184 makes "encouraging or assisting serious self-harm" a criminal offence. This is similar to the offence of encouraging or assisting suicide contained in the Suicide Act 1961.[42] The first conviction under this section occurred in July 2025. Tyler Webb used the messaging app Telegram to encourage a woman he had met on a mental health support forum to harm herself and send him pictures of the resulting injuries, and to attempt suicide while he watched on camera.[43][44]

Legislative process and timetable

In 2021, the draft bill was given pre-legislative scrutiny by a joint committee of Members of the House of Commons and peers from the House of Lords.[45] The 2021 draft bill included within scope any pornographic site which has functionality to allow for user-to-user services, but those which do not have this functionality, or choose to remove it, were not in scope in the draft published by the government.[18] Oliver Dowden, the Secretary of State for Digital, Culture, Media and Sport, addressed the House of Commons DCMS Select Committee to confirm he would consider a proposal during the pre-legislative scrutiny to extend the scope of the Act to all commercial pornographic websites.[46]

Section 212 of the final Act repeals part 3 of the Digital Economy Act 2017, which required mandatory age verification to access online pornography but was subsequently not enforced by the government.[47] According to the then government, the Act addresses the major concern raised by campaigners such as the Open Rights Group about the privacy risks of the Digital Economy Act's age verification requirement: it creates, for services within scope of the legislation, "A duty to have regard to the importance of... protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures."[48][49][18] In February 2022, the Digital Economy Minister, Chris Philp, announced that the bill would be amended to bring commercial pornographic websites within its scope.[50]

Civil liberties organisations such as Big Brother Watch and the Open Rights Group criticised the bill for its proposals to restrain the publication of lawful speech that was otherwise deemed harmful, which may amount to censorship or the "silencing of marginalised voices and unpopular views".[22][27][51] As a result, in November 2022, measures intended to require big technology platforms to take down "legal but harmful" materials were replaced with the requirement to provide systems to avoid viewing such content.[8]

In September 2023, during the third reading in the Lords, Lord Parkinson presented a ministerial statement from the government stating that the controversial powers allowing Ofcom to break end-to-end encryption would not be used immediately. This followed statements from several tech firms, including Signal, suggesting they would withdraw from the UK market rather than weaken their encryption. Nevertheless, the provisions pertaining to end-to-end encryption weakening were not removed from the Act and Ofcom can at any time issue notices requiring the breaking of end-to-end encryption technology.[6]

Reception

Supporters of the Act have said it is necessary for online child protection.[13] Critics have said the Act grants the Government of the United Kingdom extensive powers to regulate speech, set enforcement priorities, and pressure platforms into removing content without judicial oversight.[12][52][11]

Academic responses

Alan Woodward, a cybersecurity expert at the University of Surrey, described the Act's surveillance provisions as "technically dangerous and ethically questionable", stating that the government's approach could make the internet less safe, not more. He added that the Act makes mass surveillance "almost an inevitability" as security forces would be liable to mission creep, using the justification of "exceptional circumstances" to extend searches beyond their original remit.[7] Elena Abrusci, a scholar at Brunel Law School, suggests that the OSA provides adequate legal basis for service providers to remove illegal content, but does not adequately protect users from disinformation and online harassment, which is necessary for ensuring political participation.[53]

Charities and human rights organisations

The National Society for the Prevention of Cruelty to Children (NSPCC) said the Act's passage was "a momentous day for children" and that it would help prevent abuse.[14] The Samaritans, which had lobbied to expand the Bill's reach to protect vulnerable users, also lent the final Act its cautious support. The organisation said it was a step forward, but criticised the government for not fulfilling its ambition to make the United Kingdom the "safest place to be online".[54][55]

British human rights organisation Article 19 warned that the Act is "an extremely complex and incoherent piece of legislation" and "fails in effectively addressing the threat to human rights" such as freedom of expression and access to information.[52] Mark Johnson, Legal and Policy Officer at Big Brother Watch, said it was a "censor's charter" that undermines the right to freedom of expression and privacy.[12] In February 2024, the European Court of Human Rights ruled that requiring degraded end-to-end encryption "cannot be regarded as necessary in a democratic society" and was incompatible with Article 8 of the European Convention on Human Rights.[11]

Industry responses

A number of websites have stated that they would close as a result of the Act. London Fixed Gear and Single Speed, a forum for bicycle enthusiasts, announced its closure citing the high cost of legal compliance, as did Microcosm, a provider of forum hosting for non-commercial, non-profit communities.[56][57] Some sites have instead blocked UK users, including the controversial far-right network Gab and Civit.ai, where some users create deepfakes of real people, generating images that could be categorised as child pornography.[citation needed] Lobsters, a programming and technology focussed discussion site, announced that it would block UK users in order to comply, but following extensive discussion decided not to.[58]

Major technology firms have raised concerns over the Act's implications for user privacy and encryption. Apple Inc. called the legislation a "serious threat" to end-to-end encryption, warning that it could force the company to weaken security features designed to protect users from surveillance.[10] Meta Platforms similarly stated that it would rather have its WhatsApp and Facebook Messenger services blocked in the UK than weaken encryption standards.[59]

Some websites and apps stated they would introduce age verification systems for users in response to a 25 July 2025 deadline set by Ofcom.[60] These include pornographic websites, but also other websites and services such as the social networks Bluesky (verification via Kids Web Services (KWS)), Discord, Tinder, Bumble, Feeld, Grindr, Hinge, Reddit (verification via Persona), X, and Spotify.[61][62][63]

Political responses

In response to criticisms of the Act, Prime Minister Sir Keir Starmer has said it is necessary to protect children from online content such as "suicide sites".[64] The National Crime Agency, an agency under the Home Office, also insisted the legislation is necessary to safeguard children from online harms.[13]

Reform UK leader Nigel Farage, one of the bill's most vocal opponents, has called the Act "borderline dystopian" and said he would repeal it if elected to government.[65] Fellow Reform leader Zia Yusuf described the legislation as "an assault on freedom".[66]

The government of Jersey will not enforce the law in the territory, citing "inadequacies" in the legislation. The Jersey government had initially objected that the law did not include a permissive extent clause enabling it to join in enforcement, but later acknowledged that the omission "in hindsight, might be a good thing". The territory will instead develop its own online safety legislation.[67]

Public responses

Following the enactment of this law, there was a significant rise in downloads of VPN services by users in the UK, since these can circumvent age verification requirements by routing traffic through another country without such regulations.[68] Some users had also successfully bypassed photo-based age verification services, such as Persona, using images of characters from the video game Death Stranding.[69] A petition calling for the repeal of the law attracted over 500,000 signatures on the UK Parliament petitions website.[70]

Wikipedia responses

Public interest platforms such as Wikipedia have also raised strong objections, suggesting that the legislation risks undermining non-profit and community-governed websites. Rebecca MacKinnon of the Wikimedia Foundation said the Act was "harsh" and failed to distinguish between commercial tech giants and public knowledge projects.[71] Both the Foundation and Wikimedia UK have rejected calls to implement age verification or identity checks, citing concerns about data minimisation, privacy, and editorial independence.[15][72] In June 2023, they issued an open letter urging lawmakers to exempt public interest platforms from the Act's scope.[16][73]

In May 2025, the Wikimedia Foundation launched a legal challenge against potential designation as a "category one" service under the Act, which would subject Wikipedia to the most stringent requirements.[74] The Foundation warned that complying with the law would compromise Wikipedia's open editing model and invite state-driven censorship or manipulation. The Daily Telegraph reported in July 2025 that Wikipedia may restrict access for UK users if the government insists on full compliance.[75] The High Court of Justice dismissed the challenge in August 2025.[17][76]
