
Security
EU Opens Formal Investigation Into Snapchat Over Child Safety Failings
March 27, 2026
Read Original: Euronews

The European Commission opened formal proceedings against Snapchat on March 26, marking the platform's first investigation under the Digital Services Act. The Commission said it suspects Snapchat is failing to meet the DSA's child protection standards across five areas: age assurance, protection from grooming and criminal recruitment, default account settings that expose children to unsafe contact, the sale of illegal and age-restricted products on the platform, and a reporting system for illegal content that is neither easy to find nor easy to use. Snapchat has about 97 million monthly active users in the EU, a large portion of whom are teenagers and young adults.
EU Executive Vice-President Henna Virkkunen said Snapchat "appears to have overlooked that the Digital Services Act demands high safety standards for all users." The Commission specifically suspects that Snapchat's age assurance relies on self-declaration, which it considers insufficient to keep under-13s off the platform. It also suspects that the platform does not adequately distinguish users under 17 for age-appropriate content filtering, and that adults are able to pose as minors to contact children. The Dutch Authority for Consumers and Markets had already opened a separate investigation into vape sales on Snapchat last September. That probe is now being incorporated into the EU's broader investigation. Snapchat said it has been cooperating with regulators and that user safety is a top priority.
The DSA investigation of Snapchat follows earlier probes of TikTok for addictive design features, Facebook and Instagram for child protection shortcomings, and now four major pornographic websites for failing to prevent minors from accessing adult content. The pattern is clear: European regulators are moving through the major platforms one by one, using the DSA as a tool to force structural changes in how platforms handle underage users. Non-compliance risks fines of up to 6% of global annual revenue, and in extreme cases, a platform ban.
For social media marketers in Nigeria managing campaigns on Snapchat, this investigation signals incoming platform changes. Age-gating, content restrictions, and moderation adjustments tied to DSA compliance often roll out globally, not just in Europe. Snapchat features aimed at younger users may be restricted or removed as the company responds to regulatory pressure.
Understanding the regulatory direction tells you where platform capabilities are heading before the changes arrive.
Source: Euronews