Last Updated on August 18, 2025
By now, if you live in the UK you’ve probably come across something online that’s been affected by the Online Safety Act, although the likelihood is largely dependent on your gender and age. The act is meant to keep children and young people safe from adult and harmful content, but it’s also causing issues all across the internet. For example, for the last few months the discussion forum of my local football club has turned off new registrations because the owners were worried about the implications. Indeed, many sites have decided to close down altogether because it wasn’t worth the risk of huge fines.
So what’s a football club forum got to do with harmful and adult content? Well, pretty much nothing. However, the content is largely made up of fan discussion, so potentially anything could be posted there, including adult and harmful material. Which is why site owners are getting nervous: they would be held responsible for any breach, and the potential fines can be huge!
Here are some of the issues involved:
Online Safety Act – Sites in Scope
Under the UK Online Safety Act, the types of content that require age or identity verification fall mainly into the following categories:
1. Pornographic and Adult Content
- Explicit sexual material (porn sites, adult cam services, erotic video-sharing platforms).
- These sites must verify that users are 18+ using methods such as ID scans, credit card checks, or third-party age-verification services (a rough sketch of such a check follows this list).
- Ofcom (the regulator) has already started investigations into major porn platforms that serve millions of UK users.
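To show what the third-party route might look like in practice, here is a minimal sketch of a site gating an adult page behind an external age check. Everything here is an assumption for illustration – the provider URL, the token exchange and the session flag are invented, and real verification services each have their own APIs and flows.

```python
# Minimal sketch: gate an adult page behind a (hypothetical) third-party age check.
import requests

AGE_CHECK_URL = "https://age-verifier.example.com/api/v1/verify"  # placeholder, not a real provider

def is_verified_adult(session: dict) -> bool:
    """True only if this visitor has already passed an age check in this session."""
    return bool(session.get("age_verified"))

def verify_with_provider(token: str) -> bool:
    """Ask the (hypothetical) provider whether this token represents a completed 18+ check."""
    resp = requests.post(AGE_CHECK_URL, json={"token": token}, timeout=5)
    resp.raise_for_status()
    return bool(resp.json().get("over_18"))

def serve_adult_page(session: dict, token: str | None) -> str:
    """Return the content only to verified adults; otherwise ask for verification."""
    if is_verified_adult(session):
        return "ADULT CONTENT"
    if token and verify_with_provider(token):
        session["age_verified"] = True  # remember the result for this session
        return "ADULT CONTENT"
    return "Age verification required before this page can be shown."
```

The point is simply that the site never sees your ID or card details itself; it only learns a yes/no answer from the verification provider.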
2. Content “Harmful to Children”
This is broader and less clear-cut, which is why many sites are nervous. Platforms must ensure that minors cannot normally encounter such material unless their age is verified. Examples include:
- Violence & Gore – graphic depictions of death, serious injury, or extreme violence (including protest footage flagged as “violent”).
- Self-Harm & Suicide Content – sites that host or promote self-harm methods, eating disorders, or suicide discussions.
- Illegal Drugs & Substance Abuse – content promoting or instructing on drug use.
- Hate Speech or Extremist Content – material encouraging terrorism, racist violence, or radicalisation.
- Tobacco, Gambling & Alcohol (where promotion or consumption is the focus) – online casinos, betting platforms, vaping promotions, etc. already have restrictions but now face stricter verification layers.
3. Social Media & General Platforms
- Large platforms like Wikipedia, Reddit, YouTube, TikTok and X fall under “Category 1” if they have a substantial UK presence.
- They must ensure under-18s are shielded from harmful or pornographic content, which may mean:
  - Age-gating certain posts (you must verify to see violent or sexual images) – a rough sketch of this kind of gating follows this list.
  - Restricting entire sections (e.g., NSFW subreddits).
  - Geoblocking content if compliance is too difficult (as happened with Gab and Civit.ai).
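To make that concrete, here is a rough sketch of the kind of gating logic a platform might bolt on. The Post and Visitor fields, the country check and the “sensitive” flag are all illustrative assumptions, not anything the Act itself prescribes.

```python
# Rough sketch of per-post age-gating and whole-site geoblocking.
from dataclasses import dataclass

@dataclass
class Post:
    body: str
    sensitive: bool        # flagged as violent/sexual/self-harm by moderation

@dataclass
class Visitor:
    country: str           # e.g. derived from IP geolocation
    age_verified: bool     # has this visitor passed an 18+ check?

def render(post: Post, visitor: Visitor, geoblock_uk: bool = False) -> str:
    # Option 1: block UK visitors entirely (the route Gab and Civit.ai took).
    if geoblock_uk and visitor.country == "GB":
        return "This content is not available in your region."
    # Option 2: age-gate only the posts flagged as sensitive.
    if post.sensitive and not visitor.age_verified:
        return "Verify your age to view this post."
    return post.body

# Example: an unverified UK visitor hitting a flagged post is asked to verify.
print(render(Post("graphic footage...", sensitive=True),
             Visitor(country="GB", age_verified=False)))
```

Neither option is painless: geoblocking loses an entire audience, while per-post gating relies on moderation flagging every sensitive post correctly, which is exactly what small, volunteer-run sites struggle with.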
What Doesn’t Require Verification
- General news (unless containing violent/explicit imagery).
- Educational material (history, science, politics, unless deemed “harmful” to minors).
- Mainstream entertainment platforms (BBC iPlayer, ITVX, Netflix, etc. aren’t required to age-verify unless showing adult material).
So basically:
- Porn = always requires age verification.
- Potentially harmful to children = may require age gates (violence, self-harm, gambling, drugs, extremist content).
- General-use sites may need to segment and gate parts of their platform, depending on how Ofcom interprets risk.
So you can imagine why the owners of thousands of small forums, often run by a single individual, are extremely worried about policing the content on their sites. Remember, the fines can run into millions, and potentially all it would take is one unmoderated post that happened to contain something harmful.