The UK Online Safety Act Enforcement Countdown: What Platforms Must Have Ready Before Ofcom Acts

Enforcement is no longer theoretical. The clock is ticking.

The UK Online Safety Act (OSA) has officially moved from legislation to enforcement. Ofcom, the appointed regulator, is gearing up to act on non-compliance, and platforms that aren't ready could face fines of up to £18 million or 10% of global annual turnover (whichever is greater), service restrictions, and even criminal liability for senior managers.

If you run an online platform — whether it's a dating site, forum, social network, community app, or SaaS with user-generated content — this is the moment that matters.

Below is a practical, no-fluff breakdown of what platforms must have ready before Ofcom starts knocking.

1. A Completed (and Defensible) Risk Assessment

This is the foundation of the entire Act.

Ofcom doesn't expect a generic checklist — it expects a platform-specific risk assessment that demonstrates you understand:

  • What kinds of illegal content could realistically appear on your service
  • How children might access or be exposed to harm
  • How your features (search, chat, recommendations, uploads) increase or reduce risk
  • Where moderation or technical controls might fail

Crucially, this assessment must be:

  • Documented (not just "in your head")
  • Kept up to date as features or user behaviour changes
  • Used to drive real decisions, not filed away

If Ofcom asks "Show us how you assessed risk", you should be able to respond immediately.
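For illustration only, a documented risk assessment can be as lightweight as a structured record per risk. The sketch below (in Python, with entirely hypothetical field names and values, not an Ofcom-prescribed format) shows the kind of information worth capturing: the risk, the feature that drives it, the mitigations you actually apply, an accountable owner, and a review date.

    from dataclasses import dataclass
    from datetime import date

    # One entry in a hypothetical platform risk register.
    # Field names and example values are illustrative only.
    @dataclass
    class RiskEntry:
        risk: str               # e.g. "grooming initiated via private chat"
        feature: str            # the feature that creates or amplifies the risk
        likelihood: str         # "low" / "medium" / "high"
        impact: str             # "low" / "medium" / "high"
        mitigations: list[str]  # what you actually do about it
        owner: str              # who is accountable for this risk
        last_reviewed: date     # evidence the assessment is kept current

    register = [
        RiskEntry(
            risk="CSAM shared via image uploads",
            feature="user image uploads",
            likelihood="low",
            impact="high",
            mitigations=["hash matching on upload", "manual review queue"],
            owner="Head of Trust & Safety",
            last_reviewed=date(2025, 1, 15),
        ),
    ]

Even a short register like this, kept current, is far easier to defend than a generic policy document.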

2. A Children's Access & Safety Position (Even If You're 18+ Only)

Claiming "we're not for children" is not enough.

Ofcom expects platforms to demonstrate one of the following:

  • Robust age assurance that genuinely prevents children from accessing the service
  • Or, if children could access it, proportionate child safety protections based on risk

You must be able to explain:

  • Whether children can realistically access your platform
  • What steps you take to prevent or mitigate harm
  • Why those steps are appropriate for your size and risk profile

This applies even to niche, adult-focused platforms.

3. Clear, Enforced Content Policies

Policies are useless unless they're:

  • Clear to users
  • Aligned with the risks you identified
  • Actually enforced in real life

At minimum, platforms should have written policies covering:

  • Illegal content (terrorism, child sexual abuse material, fraud, threats)
  • Abuse, harassment, and hate content
  • Sexual exploitation and coercion
  • Self-harm and suicide-related content (where relevant)

Ofcom will look for consistency:

If your policy says content is banned, can you prove that it's removed promptly when reported?

4. A Functional Reporting & Complaints System

Users must be able to:

  • Report content easily
  • Report users easily
  • Understand what happens after they report

And platforms must be able to show:

  • Reports are reviewed within a reasonable timeframe
  • Decisions are explained where appropriate
  • Repeat offenders are dealt with proportionately

Screenshots, logs, and moderation records matter here.

If your reporting system exists but no one monitors it properly, Ofcom will not be impressed.
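As a rough sketch of the kind of evidence that helps, the snippet below appends each report and its outcome to a simple audit log. The file name, fields, and values are hypothetical; the point is that receipt time, review time, the decision, and whether the reporter was told are all recorded somewhere you can retrieve later.

    import csv
    from datetime import datetime, timezone

    # Hypothetical sketch: append each user report and its outcome to an audit log,
    # so you can later show when it arrived, when it was reviewed, and what was decided.
    def log_report_outcome(path, report_id, reported_item, reason,
                           received_at, reviewed_at, decision, explanation_sent):
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([
                report_id, reported_item, reason,
                received_at.isoformat(), reviewed_at.isoformat(),
                decision, explanation_sent,
            ])

    log_report_outcome(
        "report_audit_log.csv",          # placeholder path
        report_id="R-1042",
        reported_item="post/98321",
        reason="harassment",
        received_at=datetime(2025, 3, 1, 9, 12, tzinfo=timezone.utc),
        reviewed_at=datetime(2025, 3, 1, 14, 40, tzinfo=timezone.utc),
        decision="removed",
        explanation_sent=True,
    )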

5. Active, Proportionate Moderation

The Act does not require every platform to have a 24/7 trust & safety team.

It does require moderation that is:

  • Appropriate for your platform size
  • Appropriate for your risk level
  • Actually happening

That could be:

  • Manual moderation by trained staff or founders
  • Automated detection tools
  • Keyword filtering
  • User reporting combined with review processes

What matters is that you can justify your approach and show evidence it works.
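To make the "keyword filtering combined with review" option concrete, here is a minimal sketch that flags matching posts into a human review queue rather than removing them automatically. The term list, queue structure, and post IDs are placeholders, not a recommended blocklist; real deployments need far more care.

    import re

    # Minimal sketch of keyword filtering feeding a human review queue.
    # The terms below are placeholders, not a real blocklist.
    FLAGGED_TERMS = re.compile(r"\b(example_term_1|example_term_2)\b", re.IGNORECASE)

    review_queue = []

    def screen_post(post_id: str, text: str) -> None:
        # Flag for review instead of auto-removing, so a human makes the final call.
        if FLAGGED_TERMS.search(text):
            review_queue.append({"post_id": post_id, "reason": "keyword match"})

    screen_post("post-123", "some user-generated text containing example_term_1")
    print(review_queue)  # -> [{'post_id': 'post-123', 'reason': 'keyword match'}]

Routing flags to a review queue, rather than deleting on match, is one way a small team can show its approach is both active and proportionate.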

6. Governance: Someone Is Accountable

Ofcom expects safety to be a board-level or senior management responsibility.

You should be able to answer:

  • Who owns online safety internally?
  • Who signs off risk assessments?
  • Who reviews incidents or complaints?

Smaller companies don't need a full compliance department — but they do need clear accountability.

7. Records, Records, Records

If it isn't written down, it didn't happen.

Platforms should retain:

  • Risk assessments and updates
  • Moderation decisions
  • User reports and outcomes
  • Policy change logs
  • Internal reviews of safety incidents

Ofcom investigations are evidence-driven. Memory won't cut it.

8. A Plan for Regulatory Contact

Eventually, Ofcom will reach out — whether via an information request, audit, or investigation.

Be ready with:

  • A named point of contact
  • Up-to-date documentation
  • Clear explanations (not panic or silence)

Delays, incomplete answers, or disorganisation are red flags.

The Bottom Line

The Online Safety Act isn't about perfection.

It's about reasonableness, evidence, and accountability.

Platforms that struggle under enforcement won't be the smallest ones — they'll be the ones that:

  • Did nothing
  • Wrote generic documents they don't follow
  • Assumed Ofcom wouldn't notice

If you can demonstrate that you:

  • ✅ Think carefully about risk
  • ✅ Take proportionate steps
  • ✅ Review and improve over time

…you are already ahead of a large chunk of the internet.

The countdown is real. Preparation is your best defence.

This article provides guidance only and does not constitute legal advice.