UK Online Safety Act — now in force

The compliance layer
that protects children
and your platform.

More than 100,000 platforms with UK users are legally required to protect children from online predators. GuardLayer is the plug-in infrastructure that makes it possible — and keeps you compliant with Ofcom.

7,263
Recorded online grooming offences in the UK last year
£18M
Ofcom fine per breach, or 10% of global revenue, whichever is greater
100K+
Platforms legally required to comply with the OSA
Ofcom opens enforcement investigation — gaming platform · March 2026
291,273 reports of child sexual abuse material detected in 2024 · NSPCC
Roblox ordered to implement age checks by Ofcom deadline · 2026
30% of Milton Keynes children approached by strangers online · Police data
90% of CSAM is self-generated — children groomed into producing it · IWF
OSA Phase 3 additional duties arrive Summer 2026 · Ofcom

Gaming platforms are the new hunting ground. The law has changed. Most studios haven't.

Predators deliberately use gaming platforms because parents have no visibility there. The Online Safety Act makes this a legal emergency — not just a moral one.

Console chat is a blind spot
No parental control app can monitor in-game voice or text chat on Xbox, PlayStation, Roblox, or Fortnite. Predators know this — and exploit it deliberately.
0
Third-party apps with real in-game chat monitoring
Non-compliance is no longer an option
The Online Safety Act applies to every platform with UK users. Size is no defence. Fines reach £18M or 10% of global revenue — and executives face personal criminal liability.
100K+
Platforms in scope, most without compliance infrastructure
🧩
Existing tools don't reach games
Bark, Qustodio, and every major parental control app are built for phones and social media. None of them have native in-game chat detection. The gap has existed for years.
0
Dedicated UK B2B gaming compliance solutions — until now
📅
Ofcom enforcement is ramping up
Ofcom issued its first OSA fines in 2025 and has opened dozens of investigations. Phase 3 duties arrive Summer 2026. The window to get compliant is closing fast.
12+
Ofcom investigations opened under the OSA in 2025 alone

Three products. One integration. Complete coverage.

GuardLayer plugs into your platform via API. You stay focused on building games. We handle the compliance, the monitoring, and the evidence — automatically.
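For developer readers, an integration might look something like the sketch below. The base URL, endpoint path, payload fields, and `GuardLayerClient` class are illustrative assumptions, not GuardLayer's published API:

```python
import json
import urllib.request


class GuardLayerClient:
    """Hypothetical client sketch: forwards in-game chat events for scanning.

    The base URL, endpoint path, and payload fields are assumptions for
    illustration only, not GuardLayer's documented API.
    """

    def __init__(self, api_key: str, base_url: str = "https://api.guardlayer.example"):
        self.api_key = api_key
        self.base_url = base_url

    def build_event(self, child_account_id: str, sender_id: str, message: str) -> dict:
        # Package one chat message as a scan event.
        return {
            "type": "chat.message",
            "child_account_id": child_account_id,
            "sender_id": sender_id,
            "text": message,
        }

    def submit(self, event: dict) -> urllib.request.Request:
        # Build (but do not send) the authenticated HTTP request for the event.
        return urllib.request.Request(
            f"{self.base_url}/v1/events",
            data=json.dumps(event).encode("utf-8"),
            headers={
                "Authorization": f"Bearer {self.api_key}",
                "Content-Type": "application/json",
            },
            method="POST",
        )
```

In a real deployment the platform's chat pipeline would send the request (for example via `urllib.request.urlopen` or an async HTTP client); here the request is only constructed so the sketch stays self-contained.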

🛡
GuardLayer Detect
AI monitors in-game text and voice chat in real time. Flags grooming patterns, not just keywords. Sends instant parent alerts with an override option.
📋
GuardLayer Comply
Auto-generates your risk assessments, compliance documentation, and Ofcom transparency reports. Updates automatically as legislation changes.
🗄
GuardLayer Report
Secure evidence vault for every flagged incident. One-click NCA referral packages. Court-admissible documentation with full chain of custody.
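When Detect flags an incident, a platform would typically receive it as a signed webhook. The signature scheme below (hex-encoded HMAC-SHA256 over the raw body) and the event type names are assumptions for illustration; GuardLayer's actual delivery format may differ:

```python
import hashlib
import hmac
import json


def verify_webhook(secret: bytes, raw_body: bytes, signature_hex: str) -> bool:
    """Constant-time check that a webhook body matches its HMAC signature.

    Assumed scheme (illustrative): hex-encoded HMAC-SHA256 of the raw body,
    delivered in a header such as X-GuardLayer-Signature.
    """
    expected = hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)


def handle_incident(raw_body: bytes) -> str:
    # Route a verified event to the matching internal workflow.
    # Event type strings are hypothetical, mirroring the scenarios above.
    event = json.loads(raw_body)
    if event.get("type") == "grooming.pattern_detected":
        return "alert_parent_and_hold_conversation"
    if event.get("type") == "image.solicitation_detected":
        return "terminate_and_prepare_referral"
    return "log_only"
```

Verifying the signature before parsing keeps forged events out of the evidence vault, which matters when the stored record may later need to stand up in court.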
Live threat detection
Grooming pattern detected
Now
Unknown user (age unverified) requested personal information and attempted to move conversation to external platform. Child account: User_4821.
Age fishing · Platform exit attempt · Personal info request
Parent notification sent
0:03 ago
SMS and push notification delivered to registered parent. Conversation recorded and held pending parent review. Override window: 15 minutes.
Parent alerted · Recorded · Awaiting review
Compliance dashboard
Illegal content risk assessment
Complete
Children's risk assessment
Complete
Active content moderation
Live
In-platform reporting system
Active
Age assurance deployed
Verified
!
Phase 3 transparency report
Due Jul 2026
5 of 6 duties met · Next Ofcom deadline: 31 July 2026
Evidence vault
INC-2026-0441
02 May 2026 · 21:47
Grooming pattern flagged. Full conversation log captured (text + voice transcript). Platform: Roblox via API. Child account verified as under 13.
INC-2026-0398
28 Apr 2026 · 18:22
Image solicitation attempt detected. Conversation terminated automatically. Evidence package prepared for law enforcement referral.

Is your platform at risk of an Ofcom fine?

Answer four questions. Get your compliance risk score, your fine exposure, and a specific breakdown of what Ofcom could hold you accountable for — in under 60 seconds.

1
Instant risk score
Rated across all active OSA duties relevant to your platform type
2
Fine exposure calculation
Your maximum Ofcom fine exposure, based on your specific gaps
3
Specific findings
Exactly what Ofcom would cite and what GuardLayer fixes
GuardLayer — OSA Compliance Audit

Step 1 of 4 — Platform type

What best describes your platform?

Online Safety Act 2023
Core legislation — now in force
Ofcom enforcement
Active investigations, 2025–2026
GDPR / UK GDPR
Data protection compliance
NCA referral protocol
Evidence chain of custody
Phase 3 duties
Summer 2026 — arriving soon

Simple, transparent pricing.
Built for every platform size.

No setup fees. No long-term contracts. Cancel anytime. All plans include the full GuardLayer API integration and dedicated onboarding support.

Starter
£299
per month · up to 10,000 monthly active users
  • GuardLayer Detect — text chat monitoring
  • Basic compliance dashboard
  • Illegal content risk assessment auto-generated
  • In-platform reporting module
  • Email parent alerts
  • Ofcom guidance updates
Get started
Enterprise
Custom
bespoke pricing · unlimited users
  • Everything in Growth
  • White-label integration
  • Dedicated compliance manager
  • SLA with uptime guarantee
  • Direct Ofcom liaison support
  • Custom AI model training on your platform
  • Multi-platform deployment
Talk to us

The numbers behind the urgency

7,263
Recorded grooming offences in 2024
These are only the recorded offences; the vast majority occur on encrypted platforms, invisible to law enforcement.
90%
Of CSAM is self-generated
Children are groomed into producing images themselves. Predators use gaming platforms to make first contact before moving elsewhere.
30%
Of children approached online
In Milton Keynes alone, 30% of young people have been approached by strangers online, most frequently on Roblox. Nationally, the absolute numbers are far higher.

Ready to protect your platform and the children on it?

Book a free 30-minute demo. We'll walk through your specific platform, your compliance gaps, and exactly how GuardLayer closes them — before your next Ofcom deadline.

Or send us a message with any questions first.

No commitment. We'll respond within one business day.