6 February 2026

UK Online Safety In The Spotlight: Enforcement, Expansion, And The Importance Of Proportionality

Herbert Smith Freehills Kramer LLP


The UK's online safety landscape is shifting fast. The regulator, Ofcom, is busy – alongside continuing to implement one of the most ambitious and complex online regulatory regimes in the world under the Online Safety Act 2023 (OSA), its early enforcement activity has ramped up significantly. This has coincided with fresh political attention on children's digital wellbeing: recent weeks have signalled a potential wholescale policy shift, including a consultation on restricting or prohibiting social media access for under 16s in the UK.

At the heart of this debate lies a familiar public law challenge – ensuring that any adoption of online safety measures remains proportionate, principled, and workable in practice. Against that backdrop, with the OSA only partway through its implementation and enforcement ongoing, this article considers the current proposals and highlights key considerations.

Key takeaways:

  • Recent policy momentum: The UK is witnessing renewed political focus on children's online wellbeing, with future debates likely to centre on the proportionality and feasibility of a proposed social media ban for under 16s.
  • Regulatory overlap: Recent developments come at a time when the OSA is still being implemented and Ofcom's enforcement activity is increasing, raising questions about clarity, consistency and regulatory capacity.
  • Proportionality and evidence: Any expansion of online safety regulation must be balanced, workable and grounded in robust evidence, particularly given concerns about unintended consequences and fundamental rights. Constructive input from industry and other stakeholders will be essential in getting the balance right and ensuring that any changes to the regime are effective and proportionate.

A potential social media ban for under-16s?

On 21 January 2026, the House of Lords voted in favour of what has been referred to as the "under 16 social media ban" via an amendment to the Children's Wellbeing and Schools Bill (the Bill). This amendment would require the Secretary of State to make regulations obliging all regulated user-to-user services to use "highly effective age assurance measures" to prevent children under the age of 16 from becoming or being users of those services. In substance, this would place a blanket obligation on user-to-user services to exclude under-16s altogether, echoing the approach recently adopted in Australia.

This is not the only notable feature of the Bill. Another Lords' amendment makes provision for regulations to prohibit the provision of a VPN service by relevant providers to a "child" (defined as a person under 18). While this amendment has received less publicity than the potential under-16 social media ban, it would also amount to a significant extension of the current OSA regime. It remains to be seen how the House of Commons will respond to these Lords' amendments.

Against a backdrop of increasing political pressure in the lead-up to the House of Lords' vote, on 19 January the Department for Science, Innovation and Technology (DSIT) announced a wide-ranging consultation on children's digital wellbeing. The consultation, to last for three months, canvasses views on options including a total ban on children's access to social media, raising the digital age of consent, mandating more robust age assurance systems and restricting "addictive design features" (giving examples such as 'streaks' and 'infinite scrolling' features). The Prime Minister has emphasised the importance of waiting for this consultation before taking further steps in relation to any social media ban. The Government also acknowledges that lessons should be learnt from other jurisdictions, including Australia.

It will be essential for those with views on the proposals – including industry – to contribute fully to the consultation and provide evidence-based insights to feed into the Government's decision-making (for tips on engaging with such consultations, see our blog).

Enforcement activity under the OSA

These developments have overlapped with a vigorous initial phase of enforcement activity by Ofcom, which has announced investigations into over 90 sites and apps since duties under the OSA came into force just last year.

Notably, the Lords' vote on the proposed social media ban and DSIT's consultation came shortly after Ofcom opened its most high-profile investigation to date. That investigation concerns X, and follows widely-publicised reports that X's "Grok" AI chatbot was being used to generate and share content potentially amounting to non-consensual intimate imagery, child sexual abuse material and pornography accessible to children. X has since implemented measures to prevent the Grok account from being used to create intimate images of people but the investigation into whether X has complied with its duties under the OSA remains ongoing (although Ofcom has explained the limits of the regulation of chatbots under the OSA and that it is not separately investigating xAI as the provider of the standalone Grok service at this time).

This example demonstrates the challenges of reliance on traditional enforcement activity in a fast-moving area where a regulator needs to be nimble. Constructive dialogue can be of greater importance in securing immediate outcomes when significant issues arise. Of course, Ofcom needs "teeth" to back up that dialogue, but relationships between the regulator and the industry remain vital.

What are the potential implications of a ban for the UK online safety regime?

As explained above, the Government is consulting on various proposals and the direction of travel is not yet clear. But, given the particular focus on a potential social media ban, we highlight below some key considerations relating to this proposal from the perspective of interaction with the current OSA regime.

First, the current discussion of a social media ban represents a possible expansion of policy with respect to online safety. Questions arise as to whether a blanket exclusion of under-16s from social media sits comfortably alongside the OSA, which focuses on the proportionate prevention or mitigation of illegal or harmful content in children's online experience. Even though the OSA covers "lawful but harmful" content, its focus has been on preventing the risk of exposure to inappropriate material defined as primary priority content or priority content (such as pornography, or material relating to suicide or self-harm). However, the policy language surrounding the consultation appears to go further. For example, DSIT's press release announcing the consultation references the "government's plan to boost children's wellbeing online, ensuring they have a healthy relationship with mobile phones and social media". Some may interpret the consultation and language from the Government as signalling a shift in the policy goals underpinning the OSA – moving from protecting children from illegal/harmful content to promoting a more "healthy" experience for children.

Secondly, any ban or related measures would add to the already uncertain and untested nature of the OSA. One key example is the as-yet-untested debate around the balance of fundamental rights in the online safety space, which we have previously discussed. Certainly, on its face, a blanket ban would be a further incursion on fundamental rights. Were such a ban to be introduced through primary legislation, challenges to it are likely to focus on arguments under the Human Rights Act 1998. The most obvious battleground is Article 10 ECHR (freedom of expression), but there are other potential issues. These include Article 8 privacy issues for adults who have to verify their age to access lawful material, and interference with the rights of affected platforms.

In any human rights challenge, a central issue would be proportionality: whether a blanket exclusion of under-16s from social media is a proportionate means of pursuing the stated policy aims. By contrast, if the ban were implemented through secondary legislation, the scope for challenge would be wider. In addition to human rights grounds, regulations could be challenged on classic domestic public law bases: illegality, irrationality, and procedural unfairness (potentially including whether the consultation has been run in a procedurally fair manner). As we have previously explained, the OSA has the concept of proportionality embedded within its statutory language. If more restrictive measures were implemented by amendments to the OSA, it will be important to see whether there is any change to that statutory focus on proportionality.

Other aspects of the OSA also remain unresolved, such as the precise meaning and scope of the statutory language around categorisation, as demonstrated by Wikimedia. With further OSA milestones — including more consultation and decisions on service categorisation — still to come, reaching a point where the regime is settled and certain may be some way off.

Thirdly, any ban or restrictions are likely to be enforceable by Ofcom. That would further expand Ofcom's remit and test its capabilities when it is already grappling with implementing one of the most complex regulatory regimes it has ever overseen. This also comes at a time when, notwithstanding Ofcom's increasing enforcement activity, there remain key questions over the effectiveness of the OSA which will only be fully answered as the regime continues to be implemented. For example, while Ofcom has, to date, demonstrated a willingness to open investigations quickly where it considers there is evidence of potential non-compliance, no major investigation has yet been concluded. It therefore remains to be seen how robustly – and how frequently – Ofcom is prepared to deploy its extensive enforcement powers and how far it is challenged by the industry. In addition, to date, some practical limits around enforcement under the OSA have been exposed – including circumvention of age assurance checks via VPNs and 4chan's challenge in the US to Ofcom's jurisdiction to enforce the OSA against it. In this context, adding further to the regulator's remit – and the enhanced pressure that brings – needs to be balanced against ensuring its effectiveness.

Finally, any ban or related measures would place a further regulatory burden on user-to-user services alongside the already complex regime set out by the OSA, and potentially also increase regulatory burdens for VPN provider services. While many user-to-user services may already have age assurance checks in place, it is unclear whether these will be deemed sufficient for future proposals and/or what further measures (such as restricting "infinite scrolling") might require from a compliance perspective. The response of industry will be key: recent news reports suggesting a major site has restricted its activities in the UK as a result of the requirements of the OSA highlight the potential consequences of ever-increasing regulation.

Closing thoughts

The momentum behind consideration of a proposed ban suggests that, regardless of whether the Lords' amendments to the Bill proceed, a new chapter in the regulatory story is unfolding in the online safety space. On one view, the Lords' voting in favour of a social media ban and the consultation on further measures reflect a perception that the OSA, as it currently stands, is not sufficient to protect children's interests online, even taking into account Ofcom's enforcement activity. Indeed, when announcing the DSIT consultation, the Technology Secretary, Liz Kendall, stated that the OSA "was never meant to be the end point", and that ministers would examine ways to go further to ensure children have "healthy" online experiences.

It is already clear that even a total ban would not end questions over the effectiveness of online safety regulation. Notably, children's and online safety organisations, as well as affected families, have opposed a total ban, describing it as a "blunt response" and warning of unintended consequences. That language will be familiar to lawyers who consider proportionality issues, demonstrating the importance of considering all the factors and complex nuances around a policy with such a significant social impact before making decisions.

The potential shift or expansion in legislative policy in this space will no doubt be one to monitor closely but, as we have discussed, there is a live question as to the extent to which such developments will undermine or complement the ongoing implementation of the OSA regime. Perhaps the only thing that can be said with any certainty is that the coming year is likely to see more regulatory burden and scrutiny, particularly on the largest user-to-user services.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
