
The European Commission has published the long-awaited guidelines clarifying how online platforms such as social media services, online marketplaces, app stores and other content-sharing services should protect minors under Article 28(1) of the Digital Services Act (“DSA”).1 Published on 14 July 2025, these guidelines form part of the broader EU BIK+ Strategy to foster a safer digital environment for children.2 While not legally binding, the guidelines reflect both the Commission's exercise of discretion and the outcome of extensive consultations and stakeholder input. As such, they are widely seen as the de facto gold standard for compliance. Given that minors are particularly vulnerable to online risks such as grooming, cyberbullying, illegal content, and manipulative commercial practices, the guidelines offer practical approaches for platforms accessible to minors. They clarify what it means to ensure a “high level of privacy, safety and security” for young users, as required by Article 28(1) DSA.
However, given the level of detail in these guidelines, platforms face a complex challenge in identifying which rules apply to their services and how to integrate them into a coherent compliance framework that can withstand both regulatory scrutiny and private enforcement (for example by consumer protection associations). As a first step towards compliance, this article therefore outlines the key takeaways for businesses. If you would like to discuss what the guidelines specifically mean for your organisation, our DSA practitioners are happy to assist.
The guidelines apply to all online platforms – as defined in Article 3(i) DSA – that are accessible to minors. Micro and small enterprises are exempt3, unless they are designated as a Very Large Online Platform (“VLOP”) or a Very Large Online Search Engine (“VLOSE”). A platform is considered accessible to minors if, for example, its terms and conditions permit minors to use the service, the service is directed at or predominantly used by minors, or the provider is otherwise aware that some of its users are minors.
Importantly, a platform cannot avoid these obligations by simply stating that minors are not permitted: effective access restrictions must be in place.
It is also important to note that these guidelines do not aim to interpret the additional obligations set out in Section 5 of Chapter III of the DSA, which apply to VLOPs and VLOSEs. According to the Commission, VLOPs and VLOSEs should not assume that merely implementing the measures outlined in these guidelines will be sufficient for compliance. They may need to adopt additional measures beyond those specified here.
The Commission's guidelines are built on four foundational principles, which are interrelated and should always be considered by online platforms: measures must be appropriate and proportionate to the platform's context; children's rights must be a primary consideration; privacy, safety and security must be embedded by design; and services must be tailored to the developmental needs of minors.
Building on these principles, the Commission outlines a series of practical measures that platforms should implement to translate these values into day-to-day operations.
Given the diversity of online platforms, a one-size-fits-all approach to child safety is not effective. Each platform is therefore required to carry out a structured and recurring risk assessment that identifies how minors engage with the service and the risks they may encounter. The European Commission sets out minimum requirements that providers must consider, including the likelihood that minors will access the service, the types of risks they may face (based on the “5Cs” typology: content, conduct, contact, consumer, and cross-cutting risks) and the effectiveness of existing safeguards.
Platforms are expected to establish metrics to monitor these risks over time and to evaluate whether the measures in place disproportionately restrict children's rights, such as their freedom of expression or participation. This assessment must be updated annually or whenever significant changes occur, and should include input from minors, their guardians, and relevant experts. For VLOPs and VLOSEs, this process may be incorporated into the broader systemic risk assessments required under Article 34 of the DSA.
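By way of illustration only, the sketch below shows one way a platform's engineering or compliance team might keep such a recurring assessment as a structured risk register built around the 5C categories. The field names, scoring scale and example entries are our own assumptions and are not prescribed by the guidelines.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum


class RiskCategory(Enum):
    """The '5C' risk typology referenced in the guidelines."""
    CONTENT = "content"
    CONDUCT = "conduct"
    CONTACT = "contact"
    CONSUMER = "consumer"
    CROSS_CUTTING = "cross-cutting"


@dataclass
class RiskEntry:
    description: str
    category: RiskCategory
    likelihood: int  # 1 (rare) to 5 (almost certain) -- hypothetical scale
    impact: int      # 1 (minor) to 5 (severe) -- hypothetical scale
    mitigations: list[str] = field(default_factory=list)
    last_reviewed: date = field(default_factory=date.today)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

    def review_due(self) -> bool:
        """Flag entries not reviewed within the last year (annual update)."""
        return date.today() - self.last_reviewed > timedelta(days=365)


register = [
    RiskEntry("Unsolicited contact from unknown adults", RiskCategory.CONTACT, 3, 5,
              mitigations=["private-by-default accounts",
                           "contact requests limited to approved users"]),
    RiskEntry("Gambling-like loot boxes", RiskCategory.CONSUMER, 2, 4,
              mitigations=["loot boxes disabled for minor accounts"]),
]

# Review the highest-scoring risks first and flag overdue entries.
for entry in sorted(register, key=lambda e: e.score, reverse=True):
    flag = " (review overdue)" if entry.review_due() else ""
    print(f"[{entry.category.value}] score={entry.score}: {entry.description}{flag}")
```

Sorting by score and flagging entries that have not been reviewed within a year mirrors the expectation that the assessment is kept current and updated at least annually.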
Age assurance – measures that enable online platforms to determine or estimate the age of their users – is an effective way to create safer digital environments. These measures help providers enforce age-based access restrictions and protect minors from inappropriate content or interactions.
The guidelines stress that age assurance must be tailored to the nature of the service and the specific risks it presents. Platforms are therefore expected to assess whether age-based restrictions are necessary and, if so, to implement suitable methods. These may range from age estimation tools, which assess whether a user is likely to meet a minimum age threshold, to more robust forms of age verification, such as the EU Digital Identity Wallet. Where feasible, platforms should offer multiple methods and ensure that users have access to redress mechanisms in cases of incorrect assessments.
The chosen method must be proportionate to the level of risk and comply with data protection principles. It should be accurate, reliable, robust enough to prevent circumvention, non-intrusive in terms of its impact on users' rights and freedoms, and non-discriminatory. The Commission explicitly states that self-declaration does not meet these standards and is therefore insufficient.
In addition to the above, if a platform determines that age assurance is necessary, the Commission views the registration process as a first opportunity to implement it in a proportionate way.
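Purely as an illustration of proportionality in practice, the following sketch shows how a registration flow might pick an age assurance method based on the risk tier identified in the risk assessment. The method labels and risk tiers are hypothetical; self-declaration is deliberately not an option, in line with the Commission's position.

```python
from enum import Enum


class RiskLevel(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3  # e.g. services whose risk assessment requires reliable age gating


# Ordered from least to most intrusive. Self-declaration is deliberately absent,
# as the Commission considers it insufficient.
AGE_ASSURANCE_METHODS = {
    RiskLevel.LOW: "age_estimation",      # hypothetical label for an estimation tool
    RiskLevel.MEDIUM: "age_estimation",
    RiskLevel.HIGH: "age_verification",   # e.g. the EU Digital Identity Wallet
}


def select_age_assurance(risk_level: RiskLevel, minimum_age: int) -> dict:
    """Return the assurance step a hypothetical registration flow would trigger."""
    return {
        "minimum_age": minimum_age,
        "method": AGE_ASSURANCE_METHODS[risk_level],
        "redress_available": True,  # users can contest an incorrect assessment
    }


print(select_age_assurance(RiskLevel.HIGH, minimum_age=18))
```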
Where registration is required or offered, the process must be accessible and understandable for minors. This means, among other things, that platforms should clearly explain the benefits and risks of registration in child-friendly language, discourage underage sign-ups, and ensure that minors can easily delete their accounts. Children must not be encouraged to share more personal information than is strictly necessary, and parental consent should be obtained where required.
Once registered, minors' accounts must be configured by default to the highest level of protection. This includes limiting interactions such as likes and comments to approved contacts, disabling location sharing and tracking features unless explicitly enabled, and preventing visibility of minors' activity to unknown users. Features that promote excessive use – such as autoplay, streaks, or push notifications during sleep hours – should be turned off by default. In addition, filters that may negatively impact body image or mental health should also be disabled.
Platforms may choose to go beyond these minimum standards, especially for younger minors. They should consider whether certain settings ought to be made unchangeable for younger users and ensure that minors are not nudged into lowering their privacy protections. Where settings are adjustable, minors must be able to reset them easily and receive clear warnings when making changes.
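As a concrete, purely illustrative example, “highest level of protection by default” could translate into a configuration such as the one below, applied to every newly registered minor account. All setting names and the age threshold for locked settings are hypothetical assumptions rather than terms taken from the guidelines.

```python
from copy import deepcopy

# Every key below is a hypothetical setting name chosen to mirror the measures
# described above; it is not a list taken from the guidelines.
MINOR_ACCOUNT_DEFAULTS = {
    "profile_visibility": "approved_contacts_only",
    "likes_and_comments": "approved_contacts_only",
    "location_sharing": False,                       # off unless explicitly enabled
    "activity_visible_to_unknown_users": False,
    "autoplay": False,                               # features that promote excessive use
    "streaks": False,
    "push_notifications_during_sleep_hours": False,
    "appearance_altering_filters": False,
    "ai_chatbot": False,                             # AI features are never on by default
}

# Settings a platform might decide to lock entirely for younger minors.
LOCKED_FOR_YOUNGER_MINORS = {"location_sharing", "appearance_altering_filters"}
YOUNGER_MINOR_AGE = 16  # hypothetical threshold for illustration only


def create_minor_account(age: int) -> dict:
    settings = deepcopy(MINOR_ACCOUNT_DEFAULTS)
    settings["locked_settings"] = (
        sorted(LOCKED_FOR_YOUNGER_MINORS) if age < YOUNGER_MINOR_AGE else []
    )
    return settings


print(create_minor_account(age=13))
```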
Because the design of a platform directly influences how minors experience and navigate it, the Commission expects platforms to empower young users to make informed choices and exercise control over their digital environment. Interfaces must be age-appropriate and avoid persuasive design features such as infinite scroll or urgency cues. Platforms are also required to offer effective time management tools, including reminders and nudges, and ensure that all features are accessible to all users, including those with disabilities.
If AI tools, such as chatbots or filters, are integrated into the platform, they must not be activated by default or promoted to minors. Platforms must assess the risks associated with these features, clearly indicate when users are interacting with AI, and provide visible warnings about their limitations. These warnings must be written in child-friendly language and remain visible throughout the interaction.
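The snippet below is a minimal sketch, under our own assumptions about feature and function names, of how an AI chat feature could be kept off by default for minors and carry a persistent, child-friendly disclosure on every response.

```python
AI_DISCLOSURE = (
    "You are chatting with a computer program, not a person. "
    "It can make mistakes, so check important things with an adult you trust."
)


def generate_model_response(prompt: str) -> str:
    """Placeholder for whatever model or service the platform actually uses."""
    return f"(model output for {prompt!r})"


def ai_reply(user_is_minor: bool, ai_enabled_by_user: bool, prompt: str) -> dict | None:
    """Return an AI response only if the user has explicitly switched the feature on."""
    if user_is_minor and not ai_enabled_by_user:
        return None  # off by default and never promoted to minors

    return {
        "message": generate_model_response(prompt),
        "labelled_as_ai": True,          # clearly indicate the user is talking to AI
        "disclosure": AI_DISCLOSURE,     # stays attached to every response
    }


print(ai_reply(user_is_minor=True, ai_enabled_by_user=False, prompt="hello"))  # None
```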
Recommender systems and search functionalities play an essential role in shaping what minors see and engage with online. Platforms must ensure that these systems do not expose minors to harmful or illegal content, as identified through their risk assessments. This requires regular testing and adaptation, with input from minors, their guardians, and independent experts. Platforms should prioritise explicit user preferences – such as selected interests or direct feedback – over behavioural profiling, and should only use minors' activity across or beyond the platform when it serves the best interests of the minor.
Search functionalities must be designed to block known harmful terms and to redirect minors to appropriate support resources when risky queries are detected. Minors should have the ability to reset their recommended feeds, adjust content preferences, and understand why specific content is being shown to them. A recommender option that does not rely on profiling should be made available and, where appropriate, set as the default.
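To make this more tangible, the sketch below illustrates, using hypothetical term lists and function names, a search interception that redirects risky queries to a support resource and a recommender option that defaults to ranking on explicitly selected interests rather than behavioural profiling.

```python
# Hypothetical term list and support resource; real lists would come from the
# platform's risk assessment and expert input.
BLOCKED_SEARCH_TERMS = {"example-harmful-term"}
SUPPORT_RESOURCE = "https://example.org/helpline"


def run_search(query: str) -> list[str]:
    return [f"result for {query!r}"]  # placeholder search backend


def search_for_minor(query: str) -> dict:
    """Intercept risky queries and point minors to support instead of results."""
    if any(term in query.lower() for term in BLOCKED_SEARCH_TERMS):
        return {"results": [], "redirect": SUPPORT_RESOURCE}
    return {"results": run_search(query), "redirect": None}


def recommend_for_minor(explicit_interests: list[str]) -> list[str]:
    """Default, non-profiling option: rank only on interests the minor selected."""
    return [f"items tagged '{interest}'" for interest in explicit_interests]


print(search_for_minor("example-harmful-term and more"))
print(recommend_for_minor(["astronomy", "football"]))
```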
Minors are particularly vulnerable to commercial practices online, especially when these are personalised, persuasive, or disguised. Platforms must ensure that their monetisation strategies do not compromise minors' privacy, safety, or security. This includes taking into account minors' age, vulnerabilities, and limited ability to critically assess advertising or to manage exposure to excessive volumes of advertising. All commercial content must be clearly labelled and age-appropriate, and platforms must prevent exposure to harmful content, such as gambling, dieting products, or adult services, as identified in the risk assessment.
Moreover, monetisation features that obscure real-world value, such as tokens or loot boxes, should be avoided. Purchases must be priced in national currency and must not be required to access core features of a service presented as “free”. Platforms should also consider implementing tools that allow guardians to set spending limits or approve purchases.
These measures are to be applied in addition to existing legal frameworks, including the Unfair Commercial Practices Directive (2005/29/EC), and Articles 25, 26, and 28(2) of the DSA.
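By way of example only, guardian spending controls of the kind described above might look like the following sketch; the limits, thresholds and field names are illustrative assumptions.

```python
from decimal import Decimal

# Hypothetical guardian-set controls, expressed in the national currency.
GUARDIAN_MONTHLY_LIMIT = Decimal("20.00")
APPROVAL_REQUIRED_ABOVE = Decimal("5.00")


def check_purchase(price_eur: Decimal, spent_this_month: Decimal) -> dict:
    """Evaluate a purchase request on a minor's account against guardian controls."""
    if spent_this_month + price_eur > GUARDIAN_MONTHLY_LIMIT:
        return {"allowed": False, "reason": "monthly spending limit reached"}
    return {
        "allowed": True,
        "display_price": f"EUR {price_eur:.2f}",  # real-world currency, not tokens
        "guardian_approval_required": price_eur > APPROVAL_REQUIRED_ABOVE,
    }


print(check_purchase(Decimal("7.99"), spent_this_month=Decimal("10.00")))
```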
Effective moderation reduces minors' exposure to harmful content and behaviour. The guidelines build on existing obligations under the DSA and emphasise that moderation must be guided by the best interests of the child. Importantly, these measures must not result in a general obligation to monitor all user content, in line with Article 8(1) of the DSA.
Firstly, platforms must clearly and transparently define and communicate what constitutes harmful or illegal conduct for minors. Moderation policies must be enforced, regularly reviewed, and made available in all official languages of the Member States. Content moderation mechanisms must be active and functioning 24/7. While automated tools can be effective, human oversight remains essential: at least one employee should always be available to respond to urgent requests.
AI-generated content must also be subject to moderation. Platforms should implement technical tools, such as prompt classifiers, to detect and prevent the dissemination of harmful material.
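For illustration, a prompt classifier of the kind referred to here could be wired into the generation flow along the lines of the sketch below. A real deployment would rely on trained models and human review rather than the simple keyword matching used in this hypothetical example.

```python
# Illustrative only: a production classifier would use trained models and human
# review rather than simple keyword matching.
DISALLOWED_PATTERNS = ("self-harm", "grooming", "explicit")  # hypothetical labels


def classify_prompt(prompt: str) -> dict:
    """Flag prompts that would lead an AI feature to generate harmful material."""
    hits = [p for p in DISALLOWED_PATTERNS if p in prompt.lower()]
    return {"blocked": bool(hits), "matched": hits}


def generate(prompt: str) -> str:
    return f"(generated content for {prompt!r})"  # placeholder generator


def handle_generation_request(prompt: str) -> str:
    verdict = classify_prompt(prompt)
    if verdict["blocked"]:
        # Refuse generation and queue the request for human review.
        return "This request cannot be completed."
    return generate(prompt)


print(handle_generation_request("a drawing of a cat"))
```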
To ensure a high level of privacy, safety, and security for minors, they must be able to report harmful content or behaviour easily and safely. Reporting tools must be visible, age-appropriate, and accessible to all users. Adults should also be able to report content they consider inappropriate for minors. Anonymous reporting should be the default, and reports concerning minors must be prioritised. Feedback mechanisms – such as “show me less” or “this makes me uncomfortable” – should directly influence content visibility and recommender systems.
In addition to these tools, platforms must provide accessible support features that help minors navigate online environments safely and seek help when needed. These support tools must be clearly visible and connect minors to trusted resources, such as national helplines. While AI tools can assist in this regard, they should not be the primary means of support. Platforms should offer proactive safety prompts when risky content is accessed and allow minors to block or mute other users. Minors must also be able to control who can comment on their content and whether they wish to join group chats or communities. Group invitations should always require explicit consent.
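As a purely illustrative sketch, the snippet below shows how reports might be modelled so that anonymity is the default and reports concerning minors are handled first; the class and field names are our own assumptions.

```python
from dataclasses import dataclass, field
import itertools

_report_ids = itertools.count(1)


@dataclass(order=True)
class Report:
    priority: int                                         # lower is handled first
    report_id: int = field(compare=False)
    concerns_minor: bool = field(compare=False)
    anonymous: bool = field(compare=False, default=True)  # anonymity is the default
    reason: str = field(compare=False, default="")


def file_report(reason: str, concerns_minor: bool) -> Report:
    """Create a report; reports concerning minors jump the queue."""
    return Report(
        priority=0 if concerns_minor else 1,
        report_id=next(_report_ids),
        concerns_minor=concerns_minor,
        reason=reason,
    )


queue = sorted([
    file_report("spam", concerns_minor=False),
    file_report("bullying of a minor", concerns_minor=True),
])
print([r.reason for r in queue])  # the minor-related report is handled first
```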
Lastly, guardians play an important role in supporting minors' online safety, which is why the Commission encourages the use of guardian tools. These must be treated as complementary to, not a substitute for, platform-level protections. Such tools should be easy to use, transparent to minors, and designed to support communication and autonomy rather than surveillance.
The European Commission underscores that online platforms must go beyond technical safeguards and embed child protection within their governance structures to ensure high standards of privacy, safety and security. This includes appointing a dedicated child safety lead, providing staff training, and monitoring compliance with Article 28(1). Platforms are expected to collect and analyse data on risks and harms, and to engage in cross-platform collaboration to share best practices. In addition, they must continuously monitor and evaluate the effectiveness of their safety features, adapting them in response to stakeholder input, technological developments and evolving risk profiles.
Last but not least, as already required under Article 14(3) of the DSA, terms and conditions must be written in language that minors can understand. The guidelines further specify that these terms should explain how the platform operates, what positive behaviour is expected from users, and how safety measures protect minors. Importantly, platforms must not only publish these terms but also uphold and implement them in practice.
Transparency is a key principle of the DSA, and platforms must clearly communicate all relevant measures, including age assurance methods, recommender systems, complaint procedures and AI tools. This information must be easy to locate, regularly updated, and presented in the official language(s) of the Member State where the service is offered.
Authored by I.S. Toepoel.
Final thoughts
These guidelines clarify the requirements under Article 28(1) of the DSA and signal a significant shift in how online platforms must address child safety. For some businesses, this may necessitate a fundamental redesign of services to better accommodate the needs of minors.
Again, if you would like to explore or discuss what these guidelines mean for your organisation, our DSA specialists are available to support you.
References