Tech
UK Regulators Demand Social Media Platforms Strengthen Child Protection Measures

British regulators have intensified pressure on major social media companies to strengthen protections for children using their platforms, warning that existing safeguards are not strong enough to prevent minors from accessing harmful content. Authorities have urged companies including Meta, TikTok, Snapchat and YouTube to introduce stricter age verification systems and improve safety controls. The move comes as part of the next enforcement phase of the United Kingdom’s Online Safety Act. Regulators say the growing presence of children on social media platforms requires urgent action to reduce exposure to inappropriate material and potentially addictive digital environments.
The demand for stronger safeguards was issued by Ofcom and the Information Commissioner’s Office, the country’s communications regulator and data protection authority, respectively. Both agencies say they are increasingly concerned about how algorithm-driven content feeds can expose young users to harmful material or encourage excessive screen use. Under the new regulatory push, companies have been given a deadline to show how they will improve age verification systems and prevent underage users from accessing services that are not designed for them. Authorities also want platforms to restrict contact between children and unknown adults and to stop testing new digital products on minors.
Regulators argue that technology companies have the tools needed to implement stronger protections but have not used them effectively. Officials say modern digital identification and age assurance technologies are widely available and can help platforms verify the age of users more accurately. The new requirements are designed to ensure that children under the age of thirteen are unable to access services that are restricted to older users. Lawmakers and regulators say these measures are necessary to address growing concerns about online safety and the psychological impact of social media on young audiences.
Senior officials have warned that enforcement action could follow if companies fail to comply with the new requirements. Ofcom has made clear that it expects major technology firms to demonstrate concrete progress in improving safety standards. The regulator has the authority to impose significant financial penalties on companies that fail to meet their legal obligations under the Online Safety Act. In some cases, fines can reach ten percent of a company’s global annual revenue, making compliance a major priority for technology firms operating in the British market.
Social media companies have responded by defending their current safety systems while acknowledging the importance of protecting younger users. Representatives from Meta said the company already uses artificial intelligence to identify underage accounts and places teenage users in specially designed environments that limit exposure to certain features. Company officials also suggested that stronger age verification could be implemented through mobile app stores rather than requiring individual platforms to collect additional personal information from families.
Other technology companies have highlighted their own safety efforts. YouTube said it already provides age-appropriate experiences for younger audiences and emphasized that enforcement should focus primarily on services that pose the highest risks. Roblox, a platform widely used by children and teenagers, noted that it had introduced more than one hundred safety improvements over the past year. These changes include mandatory age checks for certain communication features, intended to prevent adults from contacting young users inappropriately.
The regulatory crackdown is taking place as governments around the world consider stronger rules for protecting children online. Britain has been examining whether to introduce broader restrictions on social media access for young users, including the possibility of banning children under sixteen from using some platforms. Similar policies have been debated or introduced in several countries, reflecting growing global concern about how digital platforms influence youth behavior, mental health and online safety.
Recent enforcement actions have already signaled that regulators are willing to penalize companies that fail to meet child protection standards. Earlier this year, the UK privacy watchdog issued a major financial penalty against an online platform for failing to introduce effective age verification and for mishandling children’s personal data. Regulators say such actions demonstrate that companies must take their legal obligations seriously. The latest warnings to major social media firms suggest that authorities intend to push for faster and more comprehensive changes across the industry.