The Online Safety Act 2023 is an Act of the Parliament of the United Kingdom that regulates online content. It received royal assent on 26 October 2023 and gives the relevant secretary of state (the Secretary of State for Science, Innovation and Technology) the power to designate, suppress, and record a wide range of online content deemed illegal or harmful to children.
Blocked Content
Services must take robust action against illegal content and activity. They must assess risks, put in place systems and processes to reduce those risks, and remove illegal content when it appears. Such content includes:
- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- extreme pornography
- fraud
- racially or religiously aggravated public order offences
- inciting violence
- illegal immigration and people smuggling
- promoting or facilitating suicide
- intimate image abuse
- selling illegal drugs or weapons
- sexual exploitation
- terrorism
For services likely to be accessed by children, there is a duty to prevent them from encountering certain legal but harmful content (for example, content that encourages or provides instructions for self-harm, or content that is age-inappropriate). Such content includes:
- pornography
- content that encourages, promotes, or provides instructions for any of:
  - self-harm
  - eating disorders, or
  - suicide
- bullying
- abusive or hateful content
- content which depicts or encourages serious violence or injury
- content which encourages dangerous stunts and challenges; and
- content which encourages the ingestion, inhalation or exposure to harmful substances.
For services hosting adult/pornographic content (in scope), there is a requirement for "highly effective age assurance" so that under-18s cannot access such material.
Sites will be required to rapidly remove illegal suicide and self-harm content and proactively protect users from content that is illegal under the Suicide Act 1961. The Act has also introduced a new criminal offence for intentionally encouraging or assisting serious self-harm. Services that are likely to be accessed by children must prevent children of all ages from encountering legal content that encourages, promotes or provides instruction for suicide and self-harm.
Protection of Women and Girls
- Much of the most harmful illegal online content (e.g. harassment, stalking, controlling or coercive behaviour, extreme pornography, intimate image abuse) disproportionately affects women and girls, and the Act requires platforms to proactively tackle this.
- Ofcom is required to consult the Victims' Commissioner and the Domestic Abuse Commissioner to ensure that the voices and views of women, girls and victims are reflected.
- The Act also requires Ofcom to produce guidance that summarises, in one clear place, the measures that can be taken to tackle the abuse that women and girls disproportionately face online.
Misinformation and Disinformation
The Online Safety Act takes a proportionate approach to mis- and disinformation by focusing on addressing the greatest risks of harm to users, whilst protecting freedom of expression.
Mis- and disinformation are captured by the Online Safety Act where they are illegal or harmful to children. Services are required to remove illegal disinformation once they become aware of it on their services. This includes illegal, state-sponsored disinformation caught by the Foreign Interference Offence, which obliges companies to act against a range of state-sponsored disinformation and state-linked interference online. Companies must also assess whether their service is likely to be accessed by children and, if so, deliver additional protections for them, including protections against in-scope mis- and disinformation.
Regulation
- Ofcom is designated as the regulator for this Act. It can set codes of practice, monitor compliance, issue fines and (in some cases) block services or websites.
- Non-compliance can lead to significant fines: up to £18 million or 10% of annual global turnover, whichever is greater.
- The Act gives Ofcom the powers it needs to take appropriate action against all companies in scope, no matter where they are based, provided their services have relevant links with the UK.
- This means services with a significant number of UK users or where UK users are a target market, as well as other services which have in-scope content that presents a risk of significant harm to people in the UK.
- The Online Safety Act also requires Ofcom to establish an advisory committee on disinformation and misinformation to build cross-sector understanding of mis- and disinformation. The advisory committee has now appointed a Chair and plans to have its first meeting in April 2025.
Source: https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer





