The S&D 6 principles for an EU approach on age assurance mechanisms

Protection of minors online

Today, many children spend hours every day in a digital world, facing risks that no previous generation has faced. Far too many children and teens experience stress, anxiety, sleep problems and declining well-being from extensive screen time and use of social media, where addictive feeds, harmful content, and endless scrolling are not the exception but the norm.

Parents, educators, civil society, researchers and doctors have long been urging policy makers to act on the protection of minors online. It is high time to answer this call! The S&D Group believes that age verification mechanisms and a minimum age for online social media platforms are legitimate tools to provide a safer online experience for minors, provided that their implementation remains proportionate and respects fundamental rights.

Despite the strong protections enshrined in EU legislation such as the DSA, the AVMSD, the ePrivacy Directive and the GDPR, the European Union must go a step further to ensure an equal level of protection for minors across the Union. A healthy digital society is one where platforms serve people, not the other way around.

To ensure children’s well-being online, we must also encourage ethical business models, platform accountability and safe algorithms. It is the primary responsibility of online platforms and other digital services to ensure their products and services are safe for their users, in particular for minors. Age verification tools must go hand in hand with a broader set of measures, such as effective content moderation obligations, algorithm transparency, and the prohibition of addictive design features (e.g. infinite scroll, autoplay, streaks), dark patterns, and targeted advertising. Online platforms must be held accountable for how their algorithms shape user behaviour and, in this context, ensure access to age-appropriate information and learning resources online for all children, including support resources that empower children’s understanding of the world they live in.

Consistent with the UN Convention on the Rights of the Child, the S&D Group underlines that the responsibility for ensuring children's safety online must not fall solely on parents, even if they play an important role in guiding their children's digital experiences. In this regard, it is important that platforms provide parents with voluntary, easy-to-find, effective, and rights-respecting parental control tools. Furthermore, no parent truly knows how their child’s data is being processed, which algorithms shape their online experience, whether their data is sold to third parties, or even whether it is transferred abroad without consent. Parents need transparency and education, and platforms should be required to provide plain-language explanations of how algorithms recommend content, what data is collected and how risk mitigation systems operate, and to offer child-friendly feed options.

Empowering children through education is a cornerstone of sustainable, rights-based online protection, and must go hand in hand with platform accountability and supportive parental engagement. The S&D Group also stresses the need to invest in comprehensive digital literacy programmes as part of national education and community strategies.

Finally, minors’ mental and physical health issues linked to online activities are already clearly visible. A stronger EU Mental Health Action Plan should tackle the growing impact of digital technologies on young people and propose clear policies to address the related mental and physical harms.

It is urgent to protect our children online. For the S&D Group, in light of recent technological developments, age verification tools can provide an additional layer of security provided that they respect a number of principles.

The S&D Group’s 6 principles for age verification tools

  • Firstly, age assurance mechanisms are not a silver bullet and must be part of a holistic approach to protect minors online together with additional measures;
  • Secondly, age verification mechanisms should be mandatory where strictly necessary and proportionate: to access goods or services that are restricted by law for minors (e.g. pornography, gambling, alcohol, tobacco), and to access online social media platforms, video-sharing platforms, AI companion applications and chatbots in the specific cases laid down in principle 5;
  • Thirdly, any age verification tool deployed should guarantee the highest level of accuracy and robustness, and should fulfil strict data protection and cybersecurity criteria to ensure its compliance with our fundamental rights; we strongly oppose techniques such as biometric categorisation, including facial recognition, and behavioural profiling, as they pose unacceptable risks to individuals’ right to privacy;
  • Fourthly, a harmonised EU-wide age assurance framework is needed following a risk-based approach with clear definitions of high-risk online platforms and digital services based on the risks they pose for minors’ health, well-being, privacy, rights and freedoms;
  • Fifthly, an EU-wide minimum age of 13 years should be established for children to access online social media platforms. In addition, a higher EU-wide minimum age of 16 years should be introduced, following a risk-based approach, to access social media platforms, video-sharing platforms and AI chatbots that are not merely ancillary features and that pose significant risks to minors’ health, well-being, privacy, rights and freedoms. Member States may choose to set a lower minimum age, between 13 and 16 years, as a derogation from the EU-wide 16-year age limit;
  • Sixthly, age verification mechanisms must not lead to digital exclusion, and Member States should be required to offer alternative age verification mechanisms besides the eID solution.