As the European Commission prepares its pilot age verification tool, Meta calls for a harmonised EU framework giving parents control over minors' access to digital platforms.

Meta has come out in support of a European Union-wide digital majority age that would require parental approval for younger teenagers to access digital services, including social media.

The company chose a pivotal moment for the announcement. On Tuesday, Denmark assumed the presidency of the Council of the EU, placing age verification and the digital safety of minors among the key priorities of its six-month term. Just days earlier, eleven member states sent a letter to the Commission urging action to restrict minors' access to social media platforms. The call also coincides with the European Commission's upcoming pilot project on age verification, likely to be presented later this month.

Parental control over bans

Meta released the statement on 3 July, positioning the company firmly against age-based social media bans, which have gained traction in some EU member states. The company argues that bans “take away parental authority” and narrowly target one category of service while leaving others unregulated.  

To be clear, our support for an EU-wide digital majority age is not an endorsement of government mandated social media bans. – Meta

Instead, Meta backs a model requiring explicit parental approval for app access by teens under 16. Citing a 2024 Morning Consult poll, the company notes that three-quarters of EU parents support this type of mechanism. 

Sector-wide approach vs. platform-specific rules

One of Meta’s key arguments is that regulation should not focus solely on social media. The company points to internal data suggesting that teens in Europe use more than 40 different apps per week, including gaming and streaming. A regulatory focus on social platforms alone, it warns, risks pushing teens toward less-regulated or unmoderated services, undermining the very protections policymakers aim to strengthen.

This cross-sectoral logic mirrors long-standing EU efforts to ensure a level playing field across the digital single market. However, any expansion of scope beyond social media would also raise new enforcement and implementation challenges for smaller platforms, many of which lack the compliance infrastructure of large tech firms.

European age verification system

The European Commission is expected to present its pilot project on age verification later this month, in line with the objectives of the Digital Services Act (DSA) and the AI Act. At a press briefing on 4 July, Commission spokesperson Thomas Regnier noted that the initiative lies with EU institutions, not individual companies:

This is not Meta’s competence. What we ask from Meta is to comply with the digital legislation, and this of course includes strong safeguards when it comes to the protection of minors. The pilot project on age verification will be delivered this July. – Thomas Regnier, Commission spokesperson

As pressure mounts from member states and civil society groups for stronger regulation, the question now is how to build consensus around a harmonised digital majority age. Supporters argue that an EU-wide standard would eliminate legal fragmentation, help enforce platform obligations under the DSA, and offer more consistent safeguards for teens across borders.

But challenges remain, from technical interoperability to data privacy and fundamental rights. The debate will likely intensify once the Commission publishes the pilot project and member states begin aligning, or diverging, on the path forward.

For now, Meta’s message is clear: parents should have the final say, and regulations must reflect the realities of the modern digital ecosystem, not just one subset of it.