From Instagram to Snapchat to X, social media are addictive, manipulative, and generally dangerous dark corners of cyberspace, the Union’s lawmakers contend. They want to limit the risks by banning European children from using online platforms before their 16th birthday.
The Internal Market and Consumer Protection (IMCO) Committee adopted its own-initiative report on Thursday 16 October, calling for new EU measures to make online platforms safer for children. The initiative, led by Danish Socialist MEP Christel Schaldemose, proposes a European minimum digital age of 16 for social media, video-sharing platforms and AI companions, unless parents give their consent. At the same time, the MEPs propose a minimum age of 13 for all social media use, below which access would be barred even with parental consent.
The motion passed with 32 votes in favour, five against and nine abstentions. The result signals growing concern among lawmakers that large digital platforms are failing to protect young users from addictive design, harmful content and manipulative algorithms.
A minimum digital age
Ms Schaldemose’s report builds on a draft she tabled in June and the compromise amendments later negotiated with other political groups. Her proposal calls for tougher enforcement of the Digital Services Act (DSA) and new EU rules against addictive design, dark patterns and gambling-like features. The final version, now backed by a committee majority, transforms those ideas into a call for binding EU legislation.
“Firstly, we need a higher bar for access to social media, which is why we propose an EU-wide minimum age of 16”, Ms Schaldemose said. “Secondly, we need stronger safeguards for minors using online services. My report calls for mandatory safety-by-design and for a ban on the most harmful engagement mechanisms for minors.”
The committee’s vote comes at a moment when the idea of a harmonised “digital age of consent” is being floated across Europe. Earlier this year, Commission President Ursula von der Leyen applauded Australia’s under-16 social-media ban, stating that “parents, not algorithms, should be raising our children”, in an echo of the Parliament’s calls.
Age verification is expected to rely on the EU Digital Identity Wallet, which is currently being piloted in five member states. The system would confirm a user’s age without sharing any personal data: the platform would receive only a simple “yes” or “no”.
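To make that privacy claim concrete, here is a minimal sketch of such a yes/no attestation flow, assuming a wallet that issues a signed over-16 claim. All names here (AgeAttestation, issue_attestation, verify_age) are hypothetical, not the real EUDI Wallet API, and a real deployment would use public-key signatures rather than the shared secret used below for brevity.

```python
# Illustrative sketch only: not the actual EUDI Wallet protocol.
import hmac
import hashlib
from dataclasses import dataclass

ISSUER_KEY = b"wallet-provider-secret"  # stand-in for the issuer's signing key

@dataclass(frozen=True)
class AgeAttestation:
    over_16: bool       # the only fact disclosed to the platform
    signature: bytes    # binds the claim to a trusted issuer

def issue_attestation(birth_year: int, current_year: int) -> AgeAttestation:
    """Wallet side: reduces the birth date it holds to a bare yes/no.
    (Year arithmetic only, for simplicity; a real check uses the full date.)"""
    over_16 = (current_year - birth_year) >= 16
    sig = hmac.new(ISSUER_KEY, str(over_16).encode(), hashlib.sha256).digest()
    return AgeAttestation(over_16, sig)

def verify_age(att: AgeAttestation) -> bool:
    """Platform side: checks the signature and learns only 'yes' or 'no'.
    No birth date, name, or ID number ever reaches the platform."""
    expected = hmac.new(ISSUER_KEY, str(att.over_16).encode(), hashlib.sha256).digest()
    return hmac.compare_digest(att.signature, expected) and att.over_16

print(verify_age(issue_attestation(birth_year=2012, current_year=2025)))  # False
print(verify_age(issue_attestation(birth_year=2005, current_year=2025)))  # True
```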
Addictive design, dark patterns
The report calls for a default ban on “addictive design”: features intentionally engineered to keep users online for hours on end, such as infinite scroll, autoplay, disappearing stories, reward streaks and constant notifications.
In the same spirit, MEPs want to outlaw gambling-like mechanics in video games, such as loot boxes, randomised reward packages and game-only virtual currencies. Lawmakers argue that these features mimic betting behaviour and encourage children to overspend. The proposed bans would be written into the forthcoming Digital Fairness Act, a new EU law expected to take effect in 2026 that is meant to close gaps left by the DSA.
For Ms Schaldemose, such measures are overdue. “Platforms spend billions to make us addicted”, she told EU Perspectives in an earlier interview. “Blaming parents is simply unfair. The obligation must clearly fall on the platforms”. Her stance reframes the debate: rather than placing the burden on families, the report demands that companies take responsibility for redesigning their products.
During debates on the report, the Danish MEP sometimes clashed with conservative and far-right colleagues, who argued that online safety should remain primarily a parental responsibility. Lawmakers from the ECR and ID groups warned against Brussels deciding how families raise their children.
Ban non-compliant platforms
Beyond calling for new legislation, MEPs are pressing the European Commission to make full use of its existing powers under the DSA, including investigating, fining or even banning non-compliant platforms. The report also suggests personal liability for senior managers in cases of serious breaches, to bring greater accountability to the tech sector.
Last week, the Commission formally requested information from Snapchat, YouTube, the Apple App Store and Google Play on how they protect children from harmful or illegal content under DSA rules. Since late 2023, Brussels has launched five formal DSA investigations targeting X, TikTok, AliExpress, Meta, and Temu. The TikTok Lite case was closed after the company withdrew its rewards scheme; the other investigations remain ongoing, with outcomes still pending.
Digital habits of Europe’s youth
The timing of the IMCO vote coincided with a new Eurobarometer survey on social media use. The study shows how deeply social media has become embedded in young Europeans’ lives.
Among Europeans aged 15 to 24, 65 per cent say social media is their main source of news and information, far ahead of television. Nearly three-quarters (74 per cent) follow influencers or content creators, often engaging with reviews, commentary and lifestyle content.
The survey also found that two-thirds of Europeans believe they were exposed to some form of disinformation in the past week. These findings underscore Parliament’s concern about what minors see online and how algorithms shape their worldview.
November plenary vote
The Parliament will debate and vote on the final resolution at its 24-27 November plenary session. If adopted, the text will set the stage for new Commission proposals under the Digital Fairness Act, addressing manipulative interfaces, influencer marketing, and gambling-like features such as loot boxes.