The United Kingdom has moved decisively to close a gap that regulators had flagged since the Online Safety Act first passed: general-purpose AI chatbots were not covered. Prime Minister Keir Starmer's government announced on February 16 that ChatGPT, Google Gemini, and Microsoft Copilot would now fall within the Act's scope.
## What Changed
## Implications for Product Teams
Any product that incorporates a third-party AI chatbot now inherits potential compliance obligations under the Online Safety Act if it has UK users. Product teams building for under-18 audiences can look at how we approached this with our own [student-safe AI writing platform: PenLeap](https://penleap.com), which enforces age gates, filtered prompts, and learning-focused outputs by default.
- Audit every AI-powered conversational feature for potential child-facing exposure
- Review your age verification flow — self-reported date of birth is no longer sufficient
- Disable engagement-maximising features for users under 18
- Ensure vendor contracts with AI API providers include compliance obligations
- Monitor Ofcom guidance — implementation codes are still being published
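The second and third checklist items can be combined into a single fail-closed policy: treat any user whose age has not been robustly verified as a minor, and strip engagement-maximising features accordingly. The sketch below is a minimal illustration of that policy, not a compliance implementation; the `AgeCheck` type, the feature names, and the age-18 threshold are all assumptions standing in for whatever your verification provider and product actually use.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class AgeCheck:
    """Hypothetical result from an age-verification provider.

    `verified` is True only when age came from a robust method
    (document check, facial estimation, etc.), never from a
    self-reported date of birth.
    """
    verified: bool
    age: Optional[int]  # None when verification failed or was skipped

# Illustrative engagement-maximising features to withhold from under-18s.
RESTRICTED_FEATURES = {"streaks", "infinite_scroll", "push_reengagement"}

def allowed_features(check: AgeCheck, requested: set) -> set:
    """Return the subset of requested features this user may receive.

    Fails closed: an unverified user is treated as a minor, since
    self-reported date of birth alone is no longer sufficient.
    """
    is_adult = check.verified and check.age is not None and check.age >= 18
    if is_adult:
        return set(requested)
    return requested - RESTRICTED_FEATURES
```

The useful property of centralising the check like this is that new conversational features inherit the restriction by default instead of each team re-deciding the policy per release.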
Our web development and custom software teams are already advising clients building UK-facing products. Talk to us before your next release.
For a wider [founder take on AI regulation](https://viveksinra.com/blog) — and why the UK approach is likely to become a template for other jurisdictions — see viveksinra.com.