The European Parliament voted on Wednesday, 26 November, that children across Europe should be at least sixteen years old to access social media. MEPs backed a set of recommendations to regulate the risks minors face online, from addictive design to deepfakes and nudify apps.
The non-legislative report was adopted by 483 votes in favour, 92 against and 86 abstentions. Lawmakers voiced "grave concern" over the physical and mental health effects of digital platforms on young users. Citing research that one in four minors displays signs of smartphone addiction, the Parliament urged the Commission to move quickly on new rules to protect children online.
Age-verification app as the key tool
In particular, the Parliament demands an EU-wide minimum age of 16 for access to social media platforms, video-sharing services and AI companionship tools. Teenagers aged between 13 and 16 would still be able to use such services, but only with parental consent.
The move builds on previous work by the Internal Market and Consumer Protection Committee (IMCO). MEPs again stressed that age verification must protect privacy, supporting the Commission's development of an EU age-verification app and the upcoming European Digital Identity Wallet. A pilot project on age verification is currently running in five member states. Nevertheless, lawmakers warned that age checks cannot become a substitute for platforms redesigning their services to be safe for minors by default.
The file's rapporteur, MEP Christel Schaldemose (S&D, DEN), led Parliament's push for a higher digital age of consent. She said the vote showed lawmakers were united in drawing a line under what she described as years of unchecked experiments on children.
“I am proud of this Parliament, that we can stand together in protecting minors online”, Ms Schaldemose told MEPs. “We are finally drawing a line. We are saying clearly to platforms: your services are not designed for children. And the experiment ends here.”
Earlier this year, Commission President Ursula von der Leyen applauded Australia's under-16 social-media ban, stating that "parents, not algorithms, should be raising our children". To hold companies accountable, lawmakers also proposed making senior managers personally liable for serious and repeated breaches of EU digital-safety rules, particularly failures in age verification.
Bans on addictive design and algorithms
The Parliament wants the Commission to take aim at what it calls "the most harmful addictive practices". These include infinite scrolling, autoplay, reward loops, pull-to-refresh mechanisms and other forms of gamified design that keep young users online longer than intended.
MEPs also urged the Commission to outlaw loot boxes and similar randomised in-game features, including in-app currencies and "fortune wheel" rewards, citing their gambling-like effects on children. These proposals are expected to feed into the forthcoming Digital Fairness Act.
The Parliament also urged action on generative-AI tools. These include deepfakes, AI companionship chatbots, AI agents and AI-powered nudify apps, systems that create non-consensual manipulated images that appear to undress people, usually women.
No financial incentives for kid influencers
The report also calls for stronger safeguards against platforms encouraging children to act as influencers. It proposes banning financial incentives for minors who produce sponsored content, citing their vulnerability to commercial exploitation.
Parliament also called for action against targeted advertising, influencer marketing, and other persuasive technologies that shape young users’ online behaviour. MEPs asked the Commission to address these practices in the Digital Fairness Act.
Although non-binding, the Parliament's vote increases pressure on the Commission ahead of the Digital Fairness Act, expected next year, and adds political weight to discussions on age verification, addictive design and the responsibilities of digital platforms towards minors.