Tech Firms Face Increasing Demands to Curb Illegal Content Going Viral
By Liv McMahon & Charlotte Edwards, BBC Technology Reporters
The UK’s communications regulator Ofcom is proposing tougher online safety measures aimed at preventing illegal content from spreading rapidly on digital platforms, particularly to protect children and vulnerable users. The consultation, launched on Monday, seeks public and industry feedback on new rules that could require tech companies to take more proactive steps in tackling harmful content.
Focus Areas: Viral Content, Harm Reduction, Child Protection
Ofcom’s proposals target three main areas:
- Stopping illegal content going viral: Platforms could be mandated to halt the viral spread of illegal material, including content depicting imminent physical harm on livestreams. To strengthen enforcement, platforms might also be required to provide a mechanism for users to flag such dangerous streams.
- Tackling harms at the source: Larger platforms may be required to use proactive technology, such as detection algorithms, to identify terrorist content or material harmful to children before it spreads widely.
- Enhanced protections for children: Additional safeguards could include restrictions on sending virtual gifts to children during livestreams and limits on recording or sharing children's livestreams, addressing concerns about exploitation and abuse.
Oliver Griffiths, Ofcom’s Online Safety Group Director, emphasized that while the UK’s existing online safety rules have helped, technology and associated risks evolve rapidly. "We’re holding platforms to account and launching swift enforcement action where we have concerns. But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online," he said.
Tailored Approach for Platforms by Size and Risk
Ofcom acknowledges that not all platforms carry the same level of risk, so some proposals would apply only to the largest tech firms with the highest exposure to harmful content, while others would affect all sites offering interactive livestreaming. For instance, platforms that let a single user broadcast live to many viewers could be required to implement systems to flag content depicting imminent danger.
Industry and Advocacy Group Reactions
The BBC has approached major platforms including TikTok, Twitch, and Meta (which owns Instagram, Facebook, and Threads) for comment on the proposals.
The moves come amid ongoing concern about livestreaming features that may expose children to grooming or harmful content. TikTok raised its minimum livestreaming age from 16 to 18 in 2022 after investigations revealed abuses, and YouTube recently announced it is raising its livestreaming age threshold to 16.

However, some child protection advocates argue the proposed measures do not go far enough. Ian Russell, chair of the Molly Rose Foundation, criticized the plans as temporary "sticking plasters" rather than comprehensive solutions. The foundation is named after his daughter, who died by suicide after exposure to harmful online content. Russell called for stronger government intervention to compel companies to identify and mitigate all risks on their platforms.
Similarly, Leanda Barrington-Leach, Executive Director of the children’s rights charity 5Rights, urged tech companies to embed child safety into product design holistically, instead of relying on incremental regulatory changes.
Conversely, Rani Govender from the NSPCC welcomed the consultation, highlighting that increased safeguards for livestreaming could significantly improve child safety in these high-risk online spaces.
Consultation Timeline and Next Steps
The public and stakeholders have until 20 October 2025 to submit responses to Ofcom’s consultation. The regulator aims to gather comprehensive input from service providers, civil society groups, law enforcement, and the public to inform future regulatory frameworks.
Ofcom’s efforts are part of the wider implementation of the UK’s Online Safety Act, which sets out statutory duties for platforms to protect users from harmful and illegal online content. Yet the Act’s enforcement and scope have faced scrutiny, prompting ongoing debate about the balance between regulation, innovation, and user safety.