Communication from the Commission – Guidelines on measures to ensure a high level of privacy, safety and security for minors online, pursuant to Article 28(4) of Regulation (EU) 2022/2065
C/2025/6826
OJ C, C/2025/5519, 10.10.2025, ELI: http://data.europa.eu/eli/C/2025/5519/oj (BG, ES, CS, DA, DE, ET, EL, EN, FR, GA, HR, IT, LV, LT, HU, MT, NL, PL, PT, RO, SK, SL, FI, SV)
COMMUNICATION FROM THE COMMISSION
Guidelines on measures to ensure a high level of privacy, safety and security for minors online, pursuant to Article 28(4) of Regulation (EU) 2022/2065
(C/2025/5519)
1. INTRODUCTION
1.
Online platforms are increasingly accessed by minors (1) and can provide several benefits to them. For example, online platforms may provide access to a wealth of educational resources, helping minors to learn new skills and expand their knowledge. Online platforms may also offer minors opportunities to express their views and connect with others who share similar interests, helping minors to build social skills, confidence and a sense of community. By playing on and exploring the online environment, minors can also foster their natural curiosity, engaging in activities that encourage creativity, problem solving, critical thinking, agency and entertainment. |
2.
There is, however, wide consensus among policy makers, regulatory authorities, civil society, researchers, educators and guardians (2) that the current level of privacy, safety and security of minors online is often inadequate. The design and features of the wide variety of online platforms and the services offered by providers of online platforms accessible to minors may create risks to minors’ privacy, safety and security and exacerbate existing risks. These risks include, for example, exposure to illegal content (3) and to harmful content that undermines minors’ privacy, safety and security or that may impair the physical or mental development of minors. They also include cyberbullying or contact from individuals seeking to harm minors, such as those seeking to sexually abuse or extort minors, human traffickers and those seeking to recruit minors into criminal gangs or promote violence, radicalisation, violent extremism and terrorism. Minors may also face risks as consumers as well as risks related to extensive use or overuse of online platforms and exposure to inappropriate or exploitative practices, including in relation to gambling and gaming. The increasing integration of artificial intelligence (‘AI’) chatbots and companions into online platforms, as well as AI-driven deepfakes, may also affect how minors interact with online platforms, exacerbate existing risks, and pose new ones that can negatively affect a minor’s privacy, safety and security (4). These risks can originate from the direct experience of the minor with the platform and/or from the actions of other users on the platform.
3.
These guidelines aim to support providers of online platforms in addressing these risks by providing a set of measures that the Commission considers will help these providers to ensure a high level of privacy, safety and security of minors on their platforms, thereby contributing to the protection of minors, which is an important policy objective of the Union. These guidelines also aim to help the Digital Services Coordinators (DSCs) and competent national authorities when applying and interpreting Article 28 of Regulation (EU) 2022/2065. For instance, making minors’ accounts more private will, among other things, help providers of online platforms reduce the risk of unwanted or unsolicited contact. Implementing age assurance measures (5) may, among other things, help providers reduce the risk of minors being exposed to services, content, conduct, contacts or commercial practices that undermine their privacy, safety and security. Adopting these and other measures – on matters ranging from recommender systems and governance to user support and reporting – may help providers of online platforms make online platforms safer, more secure and more privacy-preserving for minors.
2. SCOPE OF THE GUIDELINES
4.
It is in the light of the aforementioned risks that the Union legislature enacted Article 28 of Regulation (EU) 2022/2065 of the European Parliament and of the Council (6). Paragraph 1 of this provision obliges providers of online platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. Paragraph 2 of Article 28 of Regulation (EU) 2022/2065 prohibits providers of online platforms from presenting advertisements on their interface based on profiling, as defined in Article 4, point (4), of Regulation (EU) 2016/679 (7), using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor. Paragraph 3 of Article 28 of Regulation (EU) 2022/2065 specifies that compliance with the obligations set out in that Article shall not oblige providers of online platforms accessible to minors to process additional personal data in order to assess whether the recipient of the service is a minor. Paragraph 4 of Article 28 of Regulation (EU) 2022/2065 provides that the Commission, after consulting the European Board for Digital Services (‘the Board’), may issue guidelines to assist providers of online platforms in the application of paragraph 1.
5.
These guidelines describe the measures that the Commission considers that providers of online platforms accessible to minors should take to ensure a high level of privacy, safety and security for minors online, in accordance with Article 28(1) of Regulation (EU) 2022/2065. The obligation laid down in that provision is addressed to providers of online platforms whose services are accessible to minors (8). Recital 71 of that Regulation further clarifies that ‘[a]n online platform can be considered accessible to minors when its terms and conditions permit minors to use the service, when its service is directed at or predominantly used by minors, or where the provider is otherwise aware that some of the recipients of its service are minors’. |
6.
As regards the first scenario described in that recital, the Commission considers that a provider of an online platform cannot rely solely on a statement in its terms and conditions prohibiting access to minors to argue that the platform is not accessible to them. If the provider of the online platform does not implement effective measures to prevent minors from accessing its service, it cannot claim that its online platform falls outside the scope of Article 28(1) of Regulation (EU) 2022/2065 based on that declaration. For example, providers of online platforms that host and disseminate adult content, such as online platforms disseminating pornographic content, and that therefore restrict, in their terms and conditions, the use of their service to users over the age of 18, will be considered accessible to minors within the meaning of Article 28(1) of Regulation (EU) 2022/2065 when no effective measures have been put in place to prevent minors from accessing their service.
7.
As regards the third scenario, recital 71 of Regulation (EU) 2022/2065 clarifies that one example of a situation in which a provider of an online platform should be aware that some of the recipients of its service are minors is where that provider already processes the personal data of those recipients revealing their age for other purposes, such as during registration in the relevant service, and this reveals that some of those recipients are minors. Other examples of situations in which a provider can reasonably be expected to be aware that minors are amongst the recipients of its service include those in which the online platform is known to appeal to minors; the provider of the online platform offers similar services to those used by minors; the online platform is promoted to minors; the provider of the online platform has conducted or commissioned research that identifies minors as recipients of the services; or where such identification results from independent research.
8.
Pursuant to Article 19 of Regulation (EU) 2022/2065, the obligation laid down in Article 28(1) of Regulation (EU) 2022/2065 does not apply to providers of online platforms that qualify as micro or small enterprises, except where their online platform has been designated by the Commission as a very large online platform in accordance with Article 33(4) of that Regulation (9). |
9.
Other provisions of Regulation (EU) 2022/2065 also aim at ensuring the protection of minors online (10). These include, among others, several provisions in Section 5 of Chapter III of Regulation (EU) 2022/2065, which imposes additional obligations on providers of very large online platforms (‘VLOPs’) and very large online search engines (‘VLOSEs’) (11). These guidelines do not aim to interpret those provisions and providers of VLOPs and VLOSEs should not expect that adopting the measures described below, either partially or in full, suffices to ensure compliance with their obligations under Section 5 of Chapter III of Regulation (EU) 2022/2065, as those providers may need to put in place additional measures which are not set out in these guidelines and which are necessary for them to comply with the obligations stemming from those provisions (12). |
10.
Article 28(1) of Regulation (EU) 2022/2065 should also be seen in the light of other Union legislation and non-binding instruments which aim to address the risks to which minors are exposed online (13). Those instruments also contribute to achieving the objective of ensuring a high level of privacy, safety and security of minors online, and thus complement the application of Article 28(1) of Regulation (EU) 2022/2065. These guidelines should not be understood as interpreting or pre-empting any obligations arising under those instruments or under Member State legislation. Supervision and enforcement of those instruments remain the sole responsibility of the competent authorities under those legal frameworks. In particular, as clarified in recital 10 of Regulation (EU) 2022/2065, that Regulation is without prejudice to other acts of Union law regulating the provision of information society services in general, regulating other aspects of the provision of intermediary services in the internal market or specifying and complementing the harmonised rules set out in Regulation (EU) 2022/2065, such as Directive 2010/13/EU, as well as Union law on consumer protection and on the protection of personal data, in particular Regulation (EU) 2016/679. |
11.
While these guidelines set out measures that aim to ensure a high level of privacy, safety and security for minors online, providers of online platforms are encouraged to adopt those measures for the purposes of protecting all users, and not just minors. Creating a privacy-preserving, safe and secure online environment for all users will inherently result in more privacy, safety and security for minors online, alongside measures adopted in line with Article 28 of Regulation (EU) 2022/2065 to ensure that minors’ specific rights and needs are respected.
12.
By adopting these guidelines, the Commission declares that it will apply these guidelines to the cases described therein and thus impose a limit on the exercise of its discretion whenever applying Article 28(1) of Regulation (EU) 2022/2065. These guidelines may therefore be considered a significant and meaningful benchmark on which the Commission will base itself when applying Article 28(1) of Regulation (EU) 2022/2065 and determining the compliance of providers of online platforms accessible to minors with that provision (14). The Digital Services Coordinators and competent national authorities may also draw inspiration from these guidelines when applying and interpreting Article 28(1) of Regulation (EU) 2022/2065. Nevertheless, adopting and implementing the measures set out in these guidelines, either partially or in full, shall not automatically entail compliance with that provision.
13.
Any authoritative interpretation of Article 28(1) of Regulation (EU) 2022/2065 may only be given by the Court of Justice of the European Union, which, amongst other things, has jurisdiction to give preliminary rulings concerning the validity and interpretation of EU acts, including Article 28(1) of Regulation (EU) 2022/2065.
14.
Throughout the development of the guidelines the Commission has consulted with stakeholders (15), including with the Board and its working group on protection of minors. In accordance with Article 28(4) of Regulation (EU) 2022/2065, the Commission consulted the Board on a draft of these guidelines prior to their adoption on 2 July 2025. |
15.
The measures described in Sections 5 to 8 of these guidelines are not exhaustive. Other measures may also be deemed appropriate and proportionate to ensure a high level of privacy, safety and security for minors in accordance with Article 28(1) of Regulation (EU) 2022/2065, such as measures resulting from compliance with other pieces of Union legislation (16) or adherence to national guidance on the protection of minors or technical standards (17). In addition, new measures may be identified in the future that enable providers of online platforms accessible to minors to better comply with their obligation to ensure a high level of privacy, safety and security of minors on their service. |
3. STRUCTURE
16.
Section 4 of these guidelines sets out the general principles which should govern all measures that providers of online platforms accessible to minors put in place to ensure a high level of privacy, safety, and security of minors on their service. Sections 5 to 8 of these guidelines set out the main measures that the Commission considers that such providers should put in place to ensure such a high level of privacy, safety and security. These include Risk review (Section 5), Service design (Section 6), Reporting, user support and tools for guardians (Section 7) and Governance (Section 8). |
4. GENERAL PRINCIPLES
17.
The present guidelines are based on the following general principles, which are interrelated and should be considered holistically in all the activities by providers of online platforms that are in scope of these guidelines. The Commission considers that any measure that a provider of an online platform accessible to minors puts in place to comply with Article 28(1) of Regulation (EU) 2022/2065 should adhere to the following general principles.
5. RISK REVIEW
18.
The heterogeneous nature of online platforms and the diversity of contexts may require distinct approaches, with certain measures being better suited to some platforms than to others. Where a provider of an online platform accessible to minors is deciding how to ensure a high level of privacy, safety and security for minors on its platform, and determining the appropriate and proportionate measures for that purpose, the Commission considers that that provider should, at a minimum, identify and take into account:
19.
When conducting this review, providers of online platforms accessible to minors should take into account the best interests of the child as a primary consideration (25), in line with the Charter and other UNCRC principles (26), as well as other relevant Union guidance on the matter (27). They should include the perspectives of children by seeking their participation, as well as that of guardians, representatives of other potentially impacted groups and other relevant experts and stakeholders.
20.
Providers should consider the most up-to-date available information and insight from scientific and academic sources, including by leveraging other relevant assessments conducted by the provider. They should adhere to the precautionary principle when there is reasonable indication that a particular practice, feature or design choice poses risks to children, taking measures to prevent or mitigate such risks until there is evidence that its effects are not harmful to children. |
21.
Providers should carry out the review periodically, and at least on an annual basis or whenever they make significant changes to the platform’s design (28) or become aware of other circumstances that affect the platform’s design and operation relevant for ensuring a high level of privacy, safety and security of minors on their online platform. Providers should make the risk review available to the relevant supervisory authorities and publish its outcomes, without disclosing sensitive operational or security-related information, at the latest before the following review is performed, and should also consider submitting it for review by independent experts or relevant stakeholders.
22.
Existing standards and tools to carry out child rights impact assessments can support providers in carrying out this review. These include, for example, the templates, forms and other guidance provided by UNICEF (29), the Dutch Ministry of the Interior and Kingdom Relations (BZK) (30), or the European standardisation body CEN-CENELEC (31). The Commission may issue additional guidance or tools to support providers in carrying out the review, including through specific tools for child rights impact assessments. Until the publication of this guidance, providers can use existing tools and best practices for these assessments. |
23.
For providers of VLOPs and VLOSEs this risk review can also be carried out as part of the general assessment of systemic risks under Article 34 of Regulation (EU) 2022/2065, which will complement and go beyond the risk review pursued in accordance with the present guidelines. |
6. SERVICE DESIGN
6.1. Age assurance
6.1.1. Introduction and terminology
24.
In recent years, technology has developed rapidly, allowing providers of online platforms to assure themselves of the age of their users in ways that vary in accuracy, reliability and robustness. These measures are commonly referred to as ‘age assurance’ (32).
25.
The Commission considers measures restricting access based on age to be an effective means to ensure a high level of privacy, safety and security for minors on online platforms. For this purpose, age assurance tools can help providers to enforce access restrictions for users below a certain age, in order to protect minors from accessing age-inappropriate content online, such as gambling or pornography, or from being exposed to other risks such as grooming. |
26.
Age assurance tools can also help providers to prevent adults from accessing certain platforms that are designed for minors, except when they do so for legitimate parental, educational, or supervisory purposes, thus reducing the risk of adults posing as minors and/or seeking to harm minors.
27.
Finally, age assurance tools can be used to underpin the age-appropriate design of the service itself, thereby fostering safer and more child-suitable online spaces. In these instances, the tools can be used to ensure that children only have access to certain content, features or activities that are appropriate for their consumption, taking into account their age and evolving capacities. |
28.
It is important to distinguish between, on the one hand, the age restriction that limits access to the platform or to parts thereof to users below or above a certain age, and, on the other hand, the age assurance methods that are used to determine a user’s age. |
29.
The most common age assurance measures currently available and applied by online platforms fall into three broad categories: self-declaration, age estimation, and age verification.
30.
The main difference between age estimation and age verification measures is the level of accuracy. Whereas age verification provides certainty about the age of the user, age estimation provides an approximation of the user’s age. The accuracy of age estimation technologies may vary and improve as technology progresses. |
6.1.2. Determining whether to put in place access restrictions supported by age assurance measures
31.
Before deciding whether to put in place any access restrictions based on age, supported by age assurance methods, providers of online platforms accessible to minors should always conduct an assessment to determine whether such a measure is appropriate to ensure a high level of privacy, safety and security for minors on their service and whether it is proportionate, or whether such a high level may be achieved already by relying on other less far-reaching measures (34). In this regard, the Commission is of the view that providers should consider access restrictions based on age, supported by age assurance measures as a complementary tool to measures set out in other sections of these guidelines. In other words, access restrictions and age assurance alone cannot be substitutes for measures recommended elsewhere in these guidelines. |
32.
Such an assessment should ensure that any restriction on the exercise of fundamental rights and freedoms of the recipients, especially minors, is proportionate. Consequently, the Commission considers that providers of online platforms should make the result of such an assessment publicly available on the online interface of their services, whether the assessment concludes that no access restriction supported by age assurance is required or that such a restriction would be an appropriate and proportionate measure.
33.
The Commission notes that a lower accuracy of age estimation solutions does not automatically equate to a lower impact on the fundamental rights and freedoms of recipients, as less accurate solutions may process more personal data than more accurate ones. Owing to their lower level of accuracy, such solutions may also prevent some children from accessing online platforms that they would otherwise be able to access. Therefore, when considering age estimation methods that require the processing of personal data, providers of online platforms accessible to minors should ensure that data protection principles, especially data minimisation, are properly implemented and remain robust over time, and should take into account the European Data Protection Board (EDPB) statement on Age Assurance (35).
34.
The Commission is of the view that, in order to ensure a high level of privacy, safety and security of minors on their services, providers of online platforms accessible to minors that consider access restrictions based on age assurance methods necessary and proportionate should provide information about any age assurance solutions they identified and their adequacy and effectiveness. They should also provide an overview of the performance metrics used to measure this, such as false positive and false negative rates, and accuracy and recall rates. |
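As an illustration of the kind of performance metrics referred to in the preceding paragraph, the following minimal sketch (in Python, using purely hypothetical figures that do not originate from these guidelines) shows how false positive and false negative rates, accuracy and recall can be derived from the outcomes of an age assurance check, treating ‘user is a minor’ as the positive class:

    # Illustrative only: hypothetical confusion-matrix counts for an age assurance check,
    # where the positive class is "user is a minor".
    true_positives = 940     # minors correctly identified as minors
    false_negatives = 60     # minors incorrectly treated as adults
    true_negatives = 8820    # adults correctly identified as adults
    false_positives = 180    # adults incorrectly treated as minors

    false_positive_rate = false_positives / (false_positives + true_negatives)
    false_negative_rate = false_negatives / (false_negatives + true_positives)
    recall = true_positives / (true_positives + false_negatives)   # share of minors detected
    accuracy = (true_positives + true_negatives) / (
        true_positives + true_negatives + false_positives + false_negatives
    )

    print(f"False positive rate: {false_positive_rate:.2%}")  # 2.00%
    print(f"False negative rate: {false_negative_rate:.2%}")  # 6.00%
    print(f"Recall: {recall:.2%}")                            # 94.00%
    print(f"Accuracy: {accuracy:.2%}")                        # 97.60%

As the hypothetical figures illustrate, a high overall accuracy can coexist with a non-negligible false negative rate where minors make up only a small share of users, which is why reporting several metrics, rather than a single figure, gives a more meaningful picture of effectiveness.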
35.
Participation of children in the design, implementation and evaluation of age restrictions and age assurance methods should be provided for.
36.
Online platforms accessible to minors might have only some content, sections or functions that pose a risk to minors, or might have parts where the risk can be mitigated by other measures and/or parts where it cannot. In these cases, instead of age-restricting the service as a whole, providers of such online platforms should assess which content, sections or functions on their platform carry risks for minors and implement access restrictions supported by age assurance methods to reduce these risks for minors in proportionate and appropriate ways. For example, parts of social media services with content, sections or functions that may pose a risk to minors, such as adult-restricted sections of a social media service, or sections with adult-restricted commercial communications or adult-restricted product placements by influencers, should only be made available to adult users whose age has been verified accordingly.
6.1.3. Determining which age assurance methods to use
6.1.3.1. Age verification
37.
In the following circumstances, in view of the fact that the protection of minors constitutes an important policy objective of the Union to which Regulation (EU) 2022/2065 gives expression, as reflected in its recital 71, the Commission considers the use of access restrictions supported by age verification methods an appropriate and proportionate measure to ensure a high level of privacy, safety and security of minors:
38.
Age estimation methods can complement age verification technologies and can be used in addition to the latter, or as a temporary alternative, in particular in cases where verification measures that respect the criteria of effectiveness of age assurance solutions outlined in Section 6.1.4, with particular emphasis on protecting users’ right to privacy and data protection as well as accuracy, are not yet readily available. This transitory period should not extend beyond the first review of these guidelines (38). For example, platforms offering adult-restricted content may use ex ante age estimation methods if they can prove that such methods are comparable to age verification methods in respect of the criteria set out in Section 6.1.4, in the absence of effective age verification measures (39). The Commission may in due course supplement the present guidelines with a technical analysis of the main existing methods of age estimation that are currently available in view of the criteria outlined in Section 6.1.4.
6.1.3.2. Age verification technologies
39.
Age verification should be treated as a separate, distinct process that is not connected with other data collection activities exercised by online platforms. Age verification should not entitle providers of online platforms to store personal data beyond information about the user’s age group. |
40.
As further elaborated under Section 6.1.4, any age assurance method should be robust, thus not easily circumventable, to be considered appropriate and proportionate. A method that is easy for minors to circumvent will not be considered an effective age assurance measure. |
41.
Methods that rely on verified and trusted government-issued IDs, without providing the platform with additional personal data, may constitute an effective age verification method, in so far as they are based on anonymised age tokens (40). Such tokens should be issued after reliable verification of the person’s age, and they should be issued by an independent third-party rather than the provider of the online platform, especially when it offers access to adult content. The Commission considers that cryptographic protocols such as key rotation or zero-knowledge proofs (41) constitute a suitable basis for providing age assurance without transmitting personal data. |
42.
Member States are currently in the process of providing each of their citizens, residents and businesses with an EU Digital Identity Wallet (42). The upcoming EU Digital Identity Wallets will provide safe, reliable and private means of electronic identification within the Union. Once they are deployed, they may be used to share only specific information with a service, such as the fact that a person is over a specified age.
43.
To facilitate age verification before the EU Digital Identity Wallets become available, the Commission is currently testing an EU age verification solution as a standalone age verification measure that respects the criteria of effectiveness of age assurance solutions outlined in Section 6.1.4. Once finalised, the EU age verification solution will provide a compliance example and a reference standard for a device-based method of age verification. Providers of online platforms that are expected to use age verification solutions for their services are therefore encouraged to participate in available testing of early versions of the EU age verification solution, which may inform those providers as to the best means of ensuring compliance with Article 28 of Regulation (EU) 2022/2065.
44.
Implementation of the reference standard (43) set by the EU age verification solution can be offered through apps published by public or private entities or integrated in the upcoming EU Digital Identity Wallets. Implementation of this standard will constitute an age verification technology that is privacy-preserving, data-minimising, non-traceable and interoperable, in compliance with the criteria of effectiveness of age assurance solutions outlined in Section 6.1.4.
45.
Providers of online platforms accessible to minors may use other age verification methods to ensure a high level of privacy, safety, and security of minors, provided that they are compatible with the EU reference standard (as described in paragraphs 43 and 44 above) and meet the criteria outlined in Section 6.1.4. The EU age verification solution is an example of a method meeting those criteria. |
46.
To ensure compliance with the principles of data minimisation, purpose limitation and user trust, providers of online platforms are encouraged to adopt double-blind age verification methods. A double-blind method ensures that (i) the online platform does not receive additional means to identify the user and instead only receives information allowing it to confirm whether the user meets the required age threshold, and that (ii) the age verification provider does not obtain knowledge of the services for which the proof of age is used. Such methods may rely on local device processing, anonymised cryptographic tokens, or zero-knowledge proofs (44).
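The double-blind principle described in the preceding paragraph can be illustrated with a minimal sketch. The sketch below is written in Python using the Ed25519 signature primitives of the third-party ‘cryptography’ package; the token format, the function names and the overall flow are illustrative assumptions made for this example and do not describe the EU age verification solution or any particular product. An independent issuer signs an attestation containing only an over-18 flag, without learning where the proof will be presented, and the online platform verifies the issuer’s signature and learns nothing about the user beyond that flag:

    # Illustrative sketch only; not the EU age verification solution.
    # Requires the third-party "cryptography" package (pip install cryptography).
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # --- Independent age verification provider (issuer) ---
    # The issuer checks the user's age once (e.g. against an identity document)
    # and signs a minimal attestation. It never learns which platform will consume it.
    issuer_key = Ed25519PrivateKey.generate()
    issuer_public_key = issuer_key.public_key()

    def issue_age_token(is_over_18: bool) -> tuple[bytes, bytes]:
        """Return (payload, signature); the payload carries only the age claim."""
        payload = json.dumps({"age_over_18": is_over_18}).encode()
        return payload, issuer_key.sign(payload)

    # --- Online platform (relying party) ---
    # The platform holds only the issuer's public key. From the token it learns
    # whether the age threshold is met, and nothing else about the user.
    def platform_accepts(payload: bytes, signature: bytes) -> bool:
        try:
            issuer_public_key.verify(signature, payload)
        except InvalidSignature:
            return False
        return json.loads(payload).get("age_over_18") is True

    # --- User's device presents the token to the platform ---
    payload, signature = issue_age_token(is_over_18=True)
    print(platform_accepts(payload, signature))  # True

In practice, such a flow would additionally need replay protection (for example, binding the token to a platform-chosen nonce on the user’s device) and unlinkability safeguards such as the key rotation or zero-knowledge proofs referred to in paragraph 41, so that neither the issuer nor colluding platforms can trace where a given proof of age is used.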
6.1.3.3. Age estimation
47.
The Commission considers the use of age estimation methods, when provided by an independent third party or through systems appropriately and independently audited notably for security and data protection compliance, as well as when done ex ante if necessary to ensure the effectiveness of the measure, to be an appropriate and proportionate measure to ensure a high level of privacy, safety, and security of minors in the following circumstances:
48.
Where the provider of an online platform accessible to minors has determined that access restrictions supported by age assurance are necessary to achieve a high level of privacy, safety and security for minors on their service, the Commission considers that it should offer on its platform more than one age assurance method, to provide the user with a choice between methods, provided that any such method meets the criteria outlined in Section 6.1.4. This will help to avoid the exclusion of users who, despite being eligible to access an online platform, cannot avail themselves of a specific age assurance method. In order to increase effectiveness and user-friendliness, the appropriate age assurance method should be carried out, where possible at account creation, and the age information then used to contribute to an age-appropriate experience on the platform, in addition to other protective measures mentioned in these guidelines. Furthermore, providers of online platforms should provide a redress mechanism for users to complain about any incorrect age assessments by the provider (49).
6.1.4. Assessing the appropriateness and proportionality of any age assurance method
49.
Before deciding whether to put in place a specific age verification or estimation method supporting access restrictions, providers of online platforms accessible to minors should consider the following features of that method:
50.
Where age assurance measures do not meet the criteria set out above, they cannot be deemed to be appropriate and proportionate.
51.
Age assurance solutions which can be easily circumvented should not be considered as ensuring a high level of privacy, safety and security for minors. That assessment should take into account the impact that the platform may have on the privacy, safety and security of minors. The storage of a proof of age should also depend on the risks associated with the relevant platforms. For example, adult-restricted online platforms should not allow the sharing of user account credentials and should therefore conduct age assurance each time their service is accessed.
52.
The Commission considers that self-declaration (54) does not meet all the requirements above, in particular the requirement for robustness and accuracy. Therefore, it does not consider self-declaration to be an appropriate age assurance method to ensure a high level of privacy, safety, and security of minors in accordance with Article 28(1) of Regulation (EU) 2022/2065. |
53.
Furthermore, the Commission considers that the fact that a third party is used to carry out age assurance should be explained to minors – as in any case – in an accessible, visible way and in a child-friendly language (see Section 8.4 on Transparency). In addition, it remains the responsibility of the provider to ensure that the method used by the third party is effective, in line with the considerations set out above. This includes, for example, where the provider intends to rely on solutions provided by operating systems or device operators. |
6.2. Registration
54.
Registration or authentication may influence whether and how minors are able to access and use a given service in a safe, age-appropriate and rights-preserving way. The Commission is of the view that, when it has been determined that age assurance is necessary in order to provide a high level of privacy, safety and security, as well as to provide an age-appropriate experience, registration or authentication can be a first point at which to carry out such a process in a proportionate way.
55.
Where registration is not required, and cognizant of the fact that any unregistered user could be a minor below the minimum age required by the online platform to access the service and/or age-inappropriate content on the service, the provider of the relevant online platform accessible to minors should configure the settings of any unregistered users in a way which guarantees the highest levels of privacy, safety and security, considering in particular the recommendations set out in Sections 6.3.1 and 6.3.2 and treating the best interests of the child as a primary consideration, including having regard to contact risks associated with an adult potentially posing as a child. |
56.
Where registration is required or offered as a possibility to access an online platform accessible to minors, the Commission considers that the provider of that platform should:
6.3. Account settings
6.3.1. Default settings
57.
Default settings are an important tool that providers of online platforms accessible to minors may use to mitigate risks to minors’ privacy, safety and security, such as, for example, the risk of unwanted contact by individuals seeking to harm minors. Evidence suggests that users tend not to change their default settings, which means that the default settings remain for most users and thus become crucial in driving behaviour (57). The Commission therefore considers that providers of online platforms accessible to minors should:
58.
Where minors change their default settings or opt into features that put their privacy, safety or security at risk, the Commission considers that the provider of the online platform should:
6.3.2. Availability of settings, features and functionalities
59.
The Commission considers that providers of online platforms accessible to minors should:
6.4. Online interface design and other tools
60.
The Commission considers that measures allowing minors to take control of their online experiences are an effective means of ensuring a high level of privacy, safety and security of minors for the purposes of Article 28(1) of Regulation (EU) 2022/2065. |
61.
Without prejudice to the obligations of providers of VLOPs and VLOSEs under Section 5 of Chapter III of Regulation (EU) 2022/2065 and independently of the providers of online platforms’ obligations as regards the design, organisation and operation of their online interfaces deriving from Article 25 of that Regulation, the Commission considers that providers of online platforms accessible to minors should adopt and implement functionalities allowing minors to decide how to engage with their services. These functionalities should provide the right balance between child agency and an adequate level of privacy, safety and security. This should include, for example:
6.5. Recommender systems and search features
62.
Recommender systems (62) determine the manner in which information is prioritised, optimised and displayed to minors. As a result, such systems have an important impact on whether and to what extent minors encounter certain types of content, contacts or conduct online. Recommender systems may pose and exacerbate risks to minors’ privacy, safety and security online by, for example, amplifying content that can have a negative impact on minors’ safety and security (63).
63.
The Commission recalls the obligations for providers of all categories of online platforms concerning recommender system transparency under Article 27 of Regulation (EU) 2022/2065 and the additional requirements for providers of VLOPs and VLOSEs under Articles 34(1), 35(1) and 38 of Regulation (EU) 2022/2065 in this respect (64).
64.
In order to ensure a high level of privacy, safety and security specifically for minors, as required under Article 28(1) of Regulation (EU) 2022/2065, the Commission considers that providers of online platforms accessible to minors should put in place the following measures:
6.5.1. Testing and adaptation of the design and functioning of recommender systems for minors
65.
Providers of online platforms accessible to minors that use recommender systems in the provision of their service should:
6.5.2. User control and empowerment
66.
Providers of online platforms accessible to minors that use recommender systems in the provision of their service should adopt the following measures to ensure a high level of privacy, safety and security of minors:
67.
In addition to the obligations set out in Article 27(1) of Regulation (EU) 2022/2065, and, for providers of VLOPs and VLOSEs, the enhanced due diligence obligations laid down in Articles 34, 35 and 38 of that Regulation, the Commission considers that providers of online platforms accessible to minors should:
6.6. Commercial practices
68.
Minors are particularly exposed to the persuasive effects of commercial practices and have a right to be protected against economically exploitative practices (68) by online platforms. They are confronted with commercial practices by online platforms, facing diverse, dynamic and personalised persuasive tactics through, for example, advertisement, product placements, the use of in-app currencies, influencer marketing, sponsorship or AI-enhanced nudging (69) (70). This can have a negative effect on minors’ privacy, safety and security when using the services of an online platform. |
69.
In line with, and without prejudice to, the existing horizontal legal framework, in particular the Unfair Commercial Practices Directive 2005/29/EC, which is fully applicable to all commercial practices, including those directed towards minors (71), and the more specific rules in Regulation (EU) 2022/2065 on advertising (Articles 26, 28(2) and 39) and dark patterns (Article 25), the Commission considers that providers of online platforms accessible to minors should adopt the following measures to ensure a high level of privacy, safety and security of minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065:
6.7. Moderation
70.
Moderation can reduce minors’ exposure to content and behaviour that is harmful to their privacy, safety and security, including illegal content or content that may impair their physical or mental development, and it can contribute to crime prevention. |
71.
The Commission recalls the obligations related to: terms and conditions set out in Article 14 of Regulation (EU) 2022/2065; transparency reporting provided in Article 15 of that Regulation for providers of intermediary services, which includes providers of online platforms; notice and action mechanisms and statements of reasons provided respectively in Article 16 and 17 of that Regulation for providers of hosting services, including online platforms; the obligations related to trusted flaggers (85) for providers of online platforms set out in Article 22 of that Regulation. It also recalls the 2025 Code of Conduct on Countering Illegal Hate Speech Online+ and the Code of Conduct on Disinformation which constitute Codes of Conduct within the meaning of Article 45 of Regulation (EU) 2022/2065. |
72.
In addition to those obligations, the Commission considers that providers of online platforms accessible to minors should put in place the following measures to ensure a high level of privacy, safety and security of minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065, while taking the best interests of the child as a primary consideration:
73.
Providers of online platforms accessible to minors should share metrics on content moderation, for example, how often they receive user reports, how often they proactively detect content and conduct violations, the types of content and conduct being reported and detected, and how the platform responded to these issues.
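As an illustration of the kind of content moderation metrics referred to in the preceding paragraph, the following minimal sketch (in Python; the event records, field names and categories are purely hypothetical assumptions made for this example) aggregates a log of moderation events into a simple summary of report volumes, proactive detections, categories and outcomes:

    # Illustrative sketch only: aggregating hypothetical moderation events into
    # the kind of summary metrics described above.
    from collections import Counter

    moderation_events = [  # hypothetical records
        {"source": "user_report", "category": "bullying", "action": "content_removed"},
        {"source": "user_report", "category": "unwanted_contact", "action": "account_restricted"},
        {"source": "proactive_detection", "category": "adult_content", "action": "content_removed"},
        {"source": "proactive_detection", "category": "bullying", "action": "no_action"},
    ]

    summary = {
        "user_reports": sum(1 for e in moderation_events if e["source"] == "user_report"),
        "proactive_detections": sum(1 for e in moderation_events if e["source"] == "proactive_detection"),
        "reports_and_detections_by_category": dict(Counter(e["category"] for e in moderation_events)),
        "responses_by_action": dict(Counter(e["action"] for e in moderation_events)),
    }
    print(summary)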
74.
None of the above measures should result in a general obligation to monitor content which providers of online platforms accessible to minors either transmit or store (87).
7. REPORTING, USER SUPPORT AND TOOLS FOR GUARDIANS
7.1. User reporting, feedback and complaints
75.
Effective, visible and child-friendly user reporting, feedback and complaint tools enable minors to raise and address concerns about features of online platforms that may negatively affect the level of their privacy, safety and security.
76.
The Commission recalls the obligations laid down in Regulation (EU) 2022/2065, including the obligations to put in place notice and action mechanisms in Article 16, to provide a statement of reasons in Article 17, to notify suspicions of criminal offence in Article 18, to put in place an internal complaint-handling system in Article 20 and out of court dispute settlement in Article 21, as well as the rules on trusted flaggers in Article 22. |
77.
In addition to those obligations, the Commission considers that providers of online platforms accessible to minors should put in place the following measures to ensure a high level of privacy, safety and security of minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065:
7.2. User support measures
78.
Putting in place features on online platforms accessible to minors to assist minors in navigating their services and seeking support where needed is an effective means to ensure a high level of privacy, safety and security for minors. The Commission therefore considers that providers of online platforms accessible to minors should:
7.3. Tools for guardians
79.
Tools for guardians are software, features, functionalities, or applications designed to help guardians accompany their minor’s online activity, privacy, safety and well-being, while respecting children’s agency and privacy. |
80.
The Commission considers that tools for guardians should be treated as complementary to safety by design and default measures and to any other measures put in place to comply with Article 28(1) of Regulation (EU) 2022/2065, including those described in these guidelines. Compliance with the obligation of providers of online platforms accessible to minors to ensure a high level of privacy, safety and security on their services must never rely exclusively on tools for guardians. Tools for guardians should not be used as the sole measure to ensure a high level of privacy, safety and security of minors on online platforms, nor be used to replace any other measures put in place for that purpose. Such tools may fail to reflect the realities of children’s lives, particularly in cases of split custody, foster care, or where guardians are absent or disengaged. Moreover, the effectiveness of parental consent is limited when the identity or legal authority of the consenting adult is not reliably verified. Providers of online platforms accessible to minors must therefore implement appropriate measures to protect minors and should not limit themselves to relying on parental oversight. Nevertheless, the Commission notes that, when used in combination with other measures, tools for guardians may contribute to such a high level.
81.
Therefore, the Commission considers that providers of online platforms accessible to minors should put in place guardian control tools for the purposes of Article 28(1) of Regulation (EU) 2022/2065, which should:
82.
Tools for guardians may include features for managing default settings, setting screen time limits (see Section 6.4 on Online interface design and other tools), seeing the accounts that the minor communicates with, managing account settings, setting spending limits for the minor by default where applicable, or other features to supervise uses of the online platforms that may be detrimental to the minor’s privacy, safety and security. |
8. GOVERNANCE
83.
Good platform governance is an effective means to ensure that the protection of minors is duly prioritised and managed across the platform, thus contributing to ensuring the required high level of privacy, safety and security of minors. |
8.1. Governance (general)
84.
The Commission considers that providers of online platforms accessible to minors should put in place effective governance practices as a means of ensuring a high level of privacy, safety and security for minors on their services for the purposes of Article 28(1) of Regulation (EU) 2022/2065. This includes, but is not limited to:
8.2. Terms and conditions
85.
Terms and conditions provide a framework for governing the relationship between the provider of the online platform and its users. They set out the rules and expectations for online behaviour and play an important role in establishing a safe, secure and privacy respecting environment (96). |
86.
The Commission recalls the obligations for all providers of intermediary services as regards terms and conditions set out in Article 14 of Regulation (EU) 2022/2065, which includes the obligation for providers of intermediary services to explain the conditions for, and any restrictions on, the use of the service in clear, plain, intelligible, user-friendly and unambiguous language. In addition, Article 14(3) of that Regulation specifies that providers of intermediary services primarily directed at minors or predominantly used by them should provide this information in a way that minors can understand (97) (98).
87.
Moreover, the Commission considers that providers of online platforms accessible to minors should ensure that the terms and conditions of the service they provide:
88.
In addition, the Commission considers that providers of online platforms accessible to minors should ensure that changes to the terms and conditions are logged and published (99).
8.3. Monitoring and evaluation
89.
The Commission considers that providers of online platforms accessible to minors should adopt effective monitoring and evaluation practices to ensure a high level of privacy, safety and security for minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065. This includes, but is not limited to:
8.4. Transparency
90.
The Commission recalls the transparency obligations under Articles 14, 15 and 24 of Regulation (EU) 2022/2065. In view of minors’ developmental stages and evolving capacities, additional considerations concerning the transparency of an online platform’s functioning are required to ensure compliance with Article 28(1) of that Regulation. |
91.
The Commission considers that providers of online platforms accessible to minors should make all necessary and relevant information on the functioning of their services easily accessible for minors to ensure a high level of privacy, safety and security on their services. It considers that providers of online platforms should make available to minors and, where relevant, their guardians, on an accessible interface on their online platforms the following information:
9. REVIEW
92.
The Commission will review these guidelines as soon as necessary, and at the latest after a period of 12 months, in view of practical experience gained in the application of Article 28(1) of Regulation (EU) 2022/2065 and the pace of technological, societal and regulatory developments in this area.
93.
The Commission will encourage providers of online platforms accessible to minors, Digital Services Coordinators, national competent authorities, the research community and civil society organisations to contribute to this process. Following such a review, the Commission may, in consultation with the European Board for Digital Services, decide to amend these guidelines. |
(1) In the present guidelines, ‘child’, ‘children’ and ‘minor’ refer to a person under the age of 18.
(2) In the present guidelines, ‘guardians’ refers to persons holding parental responsibilities.
(3) Illegal content includes but is not limited to content depicting illicit drug trafficking, terrorist and violent extremist content and child sexual abuse material. What constitutes illegal content is not defined by Regulation (EU) 2022/2065 (the Digital Services Act) but by other laws either at EU level or at national level.
(4) A typology of risks to which minors are exposed when accessing online platforms, based on a framework developed by the OECD, is included in Annex I to these guidelines.
(5) See Section 6.1 on age assurance.
(6) Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) (OJ L 277, 27.10.2022, p. 1, ELI: http://data.europa.eu/eli/reg/2022/2065/oj).
(7) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (GDPR) (OJ L 119, 4.5.2016, p. 1, ELI: http://data.europa.eu/eli/reg/2016/679/oj).
(8) Article 3 of Regulation (EU) 2022/2065 defines ‘online platform’ as a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
(9) Recommendation 2003/361/EC defines a small enterprise as an enterprise which employs fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million. A microenterprise is defined as an enterprise which employs fewer than 10 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 2 million. The Commission recalls here recital 10 of Regulation (EU) 2022/2065, which states that Regulation (EU) 2022/2065 is without prejudice to Directive 2010/13/EU. That Directive requires all video-sharing platform (VSP) providers, regardless of their qualification as micro or small enterprises, to establish and operate age verification systems for users of video-sharing platforms with respect to content which may impair the physical or mental development of minors.
(10) This includes the obligations contained in the following provisions of Regulation (EU) 2022/2065: Article 14 on Terms and Conditions, Articles 16 and 17 on Notice and action mechanisms and Statement of Reasons, Article 25 on Online interface design and organisation, Articles 15 and 24 on Transparency, Article 26 on Advertisements, Article 27 on Recommender systems and Article 44 on Standards.
(11) This includes, but is not limited to, the following provisions of Regulation (EU) 2022/2065: Articles 34 and 35 on Risk assessment and Mitigation of risks, Article 38 on Recommender systems, Article 40 on Data access and scrutiny and Article 44 (j) on standards for targeted measures to protect minors online.
(12) This includes, but is not limited to, Articles 34 and 35 on Risk assessment and Mitigation of risks, Article 38 on Recommender systems and Article 40 on Data access and scrutiny.
(13) This includes the Better Internet for Kids strategy (BIK+), Directive 2010/13/EU (‘the Audiovisual Media Services Directive’), Regulation (EU) 2024/1689 (‘the AI Act’), Regulation (EU) 2016/679 (‘GDPR’), the Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children, the Directive 2005/29/EC on unfair commercial practices (the ‘UCPD’), the EU Digital Identity Wallet and the short-term age verification solution, the forthcoming action plan against cyberbullying, the EU-wide inquiry on the broader impacts of social media on well-being, the ProtectEU Strategy, the EU Roadmap to fight drug trafficking and organised crime, the EU Internet Forum, the EU Strategy for a more effective fight against child sexual abuse, the EU Strategy combating trafficking in human beings 2021-2025. Further, Regulation (EU) 2022/2065 is without prejudice to Union law on consumer protection and product safety, including Regulations (EU) 2017/2394 and (EU) 2019/1020 and Directives 2001/95/EC and 2013/11/EU. Directive 2005/29 on unfair commercial practices, notably Articles 5 to 9 also protect minors and, e.g., point 28 of Annex I prohibits, in an advertisement, a direct exhortation to children to buy advertised products or persuade their parents or other adults to buy advertised products for them. The Commission also recalls the European Commission Fitness Check of EU consumer law on digital fairness.
(14) Adopting and implementing any of the measures set out in these guidelines does not entail compliance with the GDPR or any other applicable data protection law. In determining compliance with Article 28(1) of Regulation (EU) 2022/2065, responsible authorities are therefore encouraged to cooperate with data protection authorities.
(15) The Commission has developed the guidelines by conducting thorough desk research and gathering stakeholder feedback through a call for evidence, workshops and a targeted public consultation. The Commission also relied on the expertise of the European Centre for Algorithmic Transparency throughout the process. Moreover, the Commission consulted young people, including Better Internet for Kids youth ambassadors, and organised focus groups with children in seven Member States, with the support of the Safer Internet Centres.
(16) This includes, for example, the Directives and Regulations cited in footnote 13 and the forthcoming guidelines by the European Data Protection Board (EDPB) on the processing of minors’ personal data in accordance with Regulation (EU) 2016/679 (GDPR).
(17) An Coimisiún um Chosaint Sonraí. (2021). Fundamentals for a child-oriented approach to data processing. Available: https://www.dataprotection.ie/sites/default/files/uploads/2021-12/Fundamentals%20for%20a%20Child-Oriented%20Approach%20to%20Data%20Processing_FINAL_EN.pdf; Coimisiún na Meán. (2024). Online safety code. Available: https://www.cnam.ie/app/uploads/2024/11/Coimisiun-na-Mean-Online-Safety-Code.pdf; IMY (Swedish Authority for Privacy Protection). (2021). The rights of children and young people on digital platforms. Available: https://www.imy.se/en/publications/the-rights-of-children-and-young-people-on-digital-platforms/; Dutch Ministry of the Interior and Kingdom Relations. (2022). Code for children's rights. Available: https://codevoorkinderrechten.waag.org/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf
CNIL. (2021). CNIL publishes 8 recommendations to enhance protection of children online. Available: https://www.cnil.fr/en/cnil-publishes-8-recommendations-enhance-protection-children-online; Unabhängiger Beauftragter für Fragen des sexuellen Kindesmissbrauchs. (n.d.). Rechtsfragen Digitales. Available: https://beauftragte-missbrauch.de/themen/recht/rechtsfragen-digitales; CEN-CENELEC. (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework. Available: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf; OECD. (2021). Children in the digital environment – Revised typology of risks. Available: https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html.
(18) These rights are elaborated by the United Nations Committee on the Rights of the Child as regards the digital environment in its General Comment No. 25. Office of the High Commissioner for Human Rights. (2021). General Comment No. 25 (2021) on children's rights in relation to the digital environment. Available: https://www.ohchr.org/en/documents/general-comments-and-recommendations/general-comment-no-25-2021-childrens-rights-relation.
(19) Children shall have the right to such protection and care as is necessary for their well-being (Article 24 of the Charter).
(20) They may express their views freely. Such views shall be taken into consideration on matters which concern them in accordance with their age and maturity (Article 24 of the Charter).
(21) In this regard, the Commission recalls the importance of accessibility, including as regulated in Directive (EU) 2016/2102 of the European Parliament and of the Council of 26 October 2016 on the accessibility of the websites and mobile applications of public sector bodies (‘Web Accessibility Directive’), as well as of child participation throughout the design, implementation and evaluation of all safety, security and privacy measures concerning children online.
(22) According to Article 25 GDPR, operators processing minors’ personal data must already implement appropriate organisational and technical measures to protect the rights of data subjects (data protection by design and by default). This obligation is enforced by the competent data protection authorities in line with Article 51 GDPR. See EDPB Guidelines 4/2019 on Article 25 Data Protection by Design and by Default. Available: https://www.edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-42019-article-25-data-protection-design-and_en.
(23) OECD (2024), Towards Digital Safety by Design for Children. Available: https://doi.org/10.1787/c167b650-en.
(24) This requires prioritising features, functionality, content or models that are compatible with children’s evolving capacities, as well as taking into consideration socio-cultural differences. Age-appropriate design is crucial for the privacy, safety and security of children: e.g. without age-appropriate information about it, children may be unable to understand, use or enjoy privacy or safety features, settings or other tools. CEN-CENELEC. (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework. Available: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf; a table of ages and developmental stages is available, among others, as an Annex to the Dutch Children’s Code. Available: https://codevoorkinderrechten.waag.org/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf.
(25) Article 3 of the UNCRC; Article 24 of the Charter: The right of the child to have his or her best interests assessed and taken as a primary consideration when different interests are being considered, in order to reach a decision on the issue at stake concerning a child, a group of identified or unidentified children or children in general. Best interests determinations, when necessary, should not be conducted by the companies, but based on competent authorities’ action. LSE Digital Futures for Children (2024), The Best interests of the child in the digital environment. Available: https://www.digital-futures-for-children.net/digitalfutures-assets/digitalfutures-documents/Best-Interests-of-the-Child-FINAL.pdf.
(26) Non-discrimination: children’s rights apply to any child, without any discrimination, as per Article 21 of the Charter. The rights of the child, as per Article 24 of the Charter, include children’s right to such protection and care as is necessary for their well-being; children may express their views freely, and such views shall be taken into consideration on matters which concern them in accordance with their age and maturity; in all actions relating to children, whether taken by public authorities or private institutions, the child's best interests must be a primary consideration. Children also enjoy, as everyone does, the right to life, as per Article 2 of the Charter, and the right to respect for their physical and mental integrity, as per Article 3 of the Charter. The rights of the child are also enshrined and further detailed in the United Nations Convention on the Rights of the Child (‘the UNCRC’), which all Member States have ratified.
(27) The Commission recalls in particular the EDPB Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is ‘likely to result in a high risk’ for the purposes of Regulation (EU) 2016/679.
(28) Examples of significant changes are the introduction of new features affecting user interaction, modifications to recommender systems, account settings, moderation, reporting or other design features that would appreciably change children’s experience on the platform, changes in data collection practices, expansion to new user groups, integration of generative AI tools, or changes related to age assurance measures or their providers.
(29) UNICEF. (2024). Children's rights impact assessment: A tool to support the design of AI and digital technology that respects children's rights. Available: https://www.unicef.org/childrightsandbusiness/workstreams/responsible-technology/D-CRIA; UNICEF. (2021). MO-CRIA: Child Rights Impact Self-Assessment Tool for Mobile Operators. Available: https://www.unicef.org/reports/mo-cria-child-rights-impact-self-assessment-tool-mobile-operators.
(30) Dutch Ministry of the Interior and Kingdom Relations (BZK). (2024). Child Rights Impact Assessment (Fillable Form). Available: https://www.nldigitalgovernment.nl/document/childrens-rights-impact-assessment-fill-in-document/.
(31) See in particular chapter 14 of CEN-CENELEC. (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework. Available: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf.
(32) European Commission: Directorate-General for Communications Networks, Content and Technology, Center for Law and Digital Technologies (eLaw), LLM, Raiz Shaffique, M. and van der Hof, S. (2024). Mapping age assurance typologies and requirements – Research report. Available: https://data.europa.eu/doi/10.2759/455338.
(33) Ibid; CEN-CENELEC. (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework. Available: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf.
(34) The review of risks and child rights impact assessment tools outlined in Section 5 on Risk review can help providers of online platforms to conduct this assessment.
(35) See EDPB statement 1/2025 on Age Assurance. Available: https://www.edpb.europa.eu/our-work-tools/our-documents/statements/statement-12025-age-assurance_en.
(36) These risks can be identified via the review of risks set out in Section 5.
(37) In this context the Commission recalls the obligations on Member States stipulated by Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1, ELI: http://data.europa.eu/eli/dir/2015/1535/oj) and the relevant procedures for draft technical regulations established therein.
(38) The Commission is currently testing an EU age verification solution to facilitate age verification to the standard required in these guidelines, before the EU Digital Identity Wallet becomes available. Other solutions compatible with the standard set out in these guidelines may be available commercially, or in individual Member States but not in others. Providers of online platforms that demonstrate this circumstance should nevertheless start testing and using age verification methods that respect the criteria of Section 6.1.4 as soon as such methods become available. This transitory period may be adjusted in light of the roll-out of the EU age verification solution.
(39) An overview of different methods of age estimation is available in European Commission: Directorate-General for Communications Networks, Content and Technology, Center for Law and Digital Technologies (eLaw), LLM, Raiz Shaffique, M. and van der Hof, S. (2024). Mapping age assurance typologies and requirements – Research report. Available: https://data.europa.eu/doi/10.2759/455338.
(40) The service provider only needs to know whether the user is over or under an age threshold. This should be implemented by a tokenised approach based on the participation of a third-party provider, in which the service provider only sees the functional result of the age assurance process (e.g. ‘over’ or ‘under’ the age threshold). A third-party provider performs an age check and provides the user with an ‘age token’ that the user can present to the service provider without needing to prove their age again. The age token may contain different user attributes and information about when, where or how the age check was performed. See also EDPB statement 1/2025 on Age Assurance. Available: https://www.edpb.europa.eu/our-work-tools/our-documents/statements/statement-12025-age-assurance_en.
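For illustration only, the following minimal sketch (in Python) shows how such a tokenised flow could work: a hypothetical third-party age assurance provider issues a signed token carrying only the over/under result, and the platform verifies the token without ever receiving a date of birth or identity data. All names, fields and the shared-key signature scheme are assumptions made for this example; a real deployment would rely on the EU age verification solution or equivalent attestation mechanisms rather than this simplified construction.

import hmac
import hashlib
import json
import time

# Hypothetical shared key standing in for the cryptographic attestation a
# real age assurance provider would use (e.g. asymmetric signatures).
SHARED_KEY = b"demo-key-known-to-provider-and-platform"

def issue_age_token(is_over_threshold: bool, threshold: int) -> dict:
    # Third-party provider: performs the age check once and issues a token
    # carrying only the functional result, not the user's age or identity.
    claim = {"over_threshold": is_over_threshold,
             "threshold": threshold,
             "issued_at": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}

def platform_accepts(token: dict, threshold: int) -> bool:
    # Online platform: checks the token's integrity and reads only the
    # over/under result presented by the user.
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["signature"]):
        return False  # forged or tampered token
    claim = token["claim"]
    return claim["threshold"] == threshold and claim["over_threshold"]

# The user presents the token without having to prove their age again.
token = issue_age_token(is_over_threshold=True, threshold=18)
print(platform_accepts(token, threshold=18))  # True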
(41) A zero-knowledge proof is a protocol in which one party (the prover) can demonstrate to another party (the verifier) that a given statement is true, without conveying to the verifier any information beyond the mere fact of that statement's truth.
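As a purely illustrative aid to this definition, the toy Python sketch below implements a Schnorr-style zero-knowledge proof of knowledge: the prover convinces the verifier that it knows a secret exponent x with y = g^x mod p without revealing x. The tiny parameters and the protocol choice are assumptions made for readability; production age assurance systems would use standardised protocols and vetted cryptographic libraries.

import secrets

# Toy parameters only (NOT secure): p is a safe prime (p = 2q + 1) and g
# generates the subgroup of prime order q, so exponents can be reduced mod q.
p = 23
q = 11
g = 2

def prove_and_verify(x: int) -> bool:
    y = pow(g, x, p)             # public value; the secret x stays with the prover
    r = secrets.randbelow(q)     # prover's one-time randomness
    t = pow(g, r, p)             # commitment sent to the verifier
    c = secrets.randbelow(q)     # verifier's random challenge
    s = (r + c * x) % q          # prover's response
    # Verifier learns only that g^s == t * y^c (mod p) holds, i.e. that the
    # prover knows some x behind y, and nothing else about x itself.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

print(prove_and_verify(x=secrets.randbelow(q)))  # True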
(42) As provided for under Section 1 of Chapter II of Regulation (EU) No 910/2014, as amended by Regulation (EU) 2024/1183.
(43) The EU reference standard is available at https://ageverification.dev.
(44) Such methods are strongly aligned with the EDPB’s call in paragraph 34 of its Statement 1/2025 on Age Assurance for solutions that prevent linking and profiling. These privacy-preserving approaches are also favoured by academic research as scalable, inclusive, and effective for minimising risks to minors while respecting fundamental rights. Available: https://www.edpb.europa.eu/our-work-tools/our-documents/statements/statement-12025-age-assurance_en.
(45) Where age verification is used in these instances, it would be without prejudice to any separate obligations on the provider, e.g. requiring it to assess whether the minor as a consumer was old enough to legally enter into a contract. This depends on the applicable law of the Member State where the minor is resident.
(46) In some cases, it may be possible for the provider to verify that the minor was signed up by their guardians.
(47) These risks can be identified via the review of risks set out in Section 5.
(48) All good and poor practice examples in these guidelines refer to fictitious online platforms.
(49) The provider may wish to integrate this mechanism into their internal complaint-handling system under Article 20. See also Section 7.1 of this document.
(50) Inaccurate age assurance may lead to the exclusion of recipients who would otherwise be eligible to use a service, or allow ineligible recipients to access the service despite the age assurance measure in place.
(51) Inappropriate age assurance may create undue risks to recipients’ rights to data protection and privacy whereas blanket age assurance could limit access to services beyond what is actually necessary.
(52) See Recital 71 of Regulation (EU) 2022/2065 which highlights the need for providers to observe the data minimisation principle provided for in Article 5(1)(c) of Regulation (EU) 2016/679.
(53) See EDPB statement 1/2025 on Age Assurance, points 2.3 and 2.4. Available at: https://www.edpb.europa.eu/our-work-tools/our-documents/statements/statement-12025-age-assurance_en.
(54) European Commission: Directorate-General for Communications Networks, Content and Technology, Center for Law and Digital Technologies (eLaw), LLM, Raiz Shaffique, M. and van der Hof, S. (2024) Mapping age assurance typologies and requirements – Research report. Available: https://data.europa.eu/doi/10.2759/455338.
(55) This is without prejudice to additional requirements stemming from other laws, such as Article 12 of Regulation (EU) 2016/679.
(56) As outlined in Section 6.1, the Commission does not consider self-declaration to be an appropriate age assurance method to ensure a high level of privacy, safety, and security of minors in accordance with Article 28(1) of Regulation (EU) 2022/2065.
(57) Willis, L. E. (2014). Why not privacy by default? Berkeley Technology Law Journal, 29(1), 61. Available: https://www.btlj.org/data/articles2015/vol29/29_1/29-berkeley-tech-l-j-0061-0134.pdf; Cho, H., Roh, S., & Park, B. (2019). Of promoting networking and protecting privacy: Effects of defaults and regulatory focus on social media users’ preference settings. Computers in Human Behavior, 101, 1-13. Available: https://doi.org/10.1016/j.chb.2019.07.001. Examples of settings that may put minors’ privacy, safety or security at risk include, but are not limited to, enabling location sharing, switching to a public profile, allowing other users to view their contact or follower lists, allowing sharing of media files, and hosting or participating in a live stream.
(58) Minors experience different developmental stages and have different levels of maturity and understanding at different ages. This is recognised among others in the UN Committee on the Rights of the Child General Comment No. 25 on children’s rights in relation to the digital environment 2021, para. 19-21. A practical table on ages and developmental stages is available, among others as Annex to the Dutch Children’s Code. Available at: https://codevoorkinderrechten.waag.org/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf.
(59) The Commission recalls that Directive 2005/29/EC prohibits unfair commercial practices, including in its Annex I, point 7, falsely stating that a product will only be available for a very limited time, or that it will only be available on particular terms for a very limited time, in order to elicit an immediate decision and deprive consumers of sufficient opportunity or time to make an informed choice.
(60) The Commission recalls the obligation for providers of AI systems that are intended to interact directly with natural persons to ensure that these are designed and developed in such a way that the natural persons concerned are informed that they are interacting with an AI system, according to Article 50(1) of Regulation (EU) 2024/1689 (‘the AI Act’). Any measure taken upon this recommendation should be understood as being without prejudice to the measures taken to comply with Article 50(1) of the AI Act, including its own supervisory and enforcement regime.
(61) The Commission recalls the Guidelines on prohibited artificial intelligence practices established by Regulation (EU) 2024/1689 (AI Act).
(62) For the purpose of this Section, the Commission recalls that, in accordance with Article 3(s) of Regulation (EU) 2022/2065, recommender systems include systems deployed for content recommendations, product recommendations, advertisement recommendations, contact recommendation, search autocomplete and results.
(63) Munn, L. (2020). Angry by design: Toxic communication and technical architectures. Humanities and Social Sciences Communications, 7(53). Available: https://doi.org/10.1057/s41599-020-00550-7; Milli, S. et al. (2025). Engagement, user satisfaction, and the amplification of divisive content on social media. PNAS Nexus, 4(3) pgaf062. Available: https://doi.org/10.1093/pnasnexus/pgaf062; Piccardi, T. et al. (2024). Social Media Algorithms Can Shape Affective Polarization via Exposure to Antidemocratic Attitudes and Partisan Animosity. Available: 10.48550/arXiv.2411.14652; Harriger, J. A., Evans, J. L., Thompson, J. K., & Tylka, T. L. (2022). The dangers of the rabbit hole: Reflections on social media as a portal into a distorted world of edited bodies and eating disorder risk and the role of algorithms. Body Image, 41, 292-297. Available: https://doi.org/10.1016/j.bodyim.2022.03.007; Amnesty International. (2023). Driven into darkness: How TikTok’s ‘For You’ feed encourages self-harm and suicidal ideation. Available: https://www.amnesty.org/en/documents/pol40/7350/2023/en/; Hilbert, M., Ahmed, S., Cho, J., & Chen, Y. (2024). #BigTech @Minors: Social media algorithms quickly personalize minors’ content, lacking equally quick protection. Available: http://dx.doi.org/10.2139/ssrn.4674573. Sala, A., Porcaro, L., Gómez, E. (2024). Social Media Use and adolescents' mental health and well-being: An umbrella review, Computers in Human Behavior Reports, Volume 14, 100404, ISSN 2451-9588. Available: https://doi.org/10.1016/j.chbr.2024.100404.
(64) The Commission also recalls that other Union or national law may impact the design and functioning of recommender systems, with a view to ensuring the protection of legal interests within their remit, which contributes to a high level of privacy, safety and protection of fundamental rights online.
(65) For example, minors’ feedback about content, activities, individuals, accounts or groups that make them feel uncomfortable or that they want to see more or less of should be taken into account in the ranking of the recommender systems. This includes feedback such as ‘Show me less/more’, ‘I don’t want to see/I am not interested in’, ‘I don’t want to see content from this account’, ‘This makes me feel uncomfortable’, ‘Hide this’, ‘I don’t like this’, or ‘This is not for me’. See also Section 7.1 on user reporting, feedback and complaints of the present guidelines.
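By way of illustration only, the short Python sketch below shows one hypothetical way such explicit feedback signals could be folded into a recommender's ranking, by down-weighting items from accounts or topics a minor has flagged. All field names, signal names and weights are invented for this example and do not describe any existing system.

from dataclasses import dataclass, field

# Hypothetical penalty per explicit feedback signal (illustrative values).
FEEDBACK_PENALTY = {
    "show_me_less": 0.5,
    "not_interested": 0.6,
    "makes_me_uncomfortable": 0.95,
}

@dataclass
class Candidate:
    item_id: str
    source_account: str
    topic: str
    base_score: float  # relevance score from the ranking model

@dataclass
class MinorFeedback:
    muted_accounts: set = field(default_factory=set)    # "I don't want to see content from this account"
    topic_signals: dict = field(default_factory=dict)   # topic -> signal name

def adjusted_score(c: Candidate, fb: MinorFeedback) -> float:
    score = c.base_score
    if c.source_account in fb.muted_accounts:
        score = 0.0                                      # suppress muted accounts entirely
    signal = fb.topic_signals.get(c.topic)
    if signal in FEEDBACK_PENALTY:
        score *= 1.0 - FEEDBACK_PENALTY[signal]          # down-weight flagged topics
    return score

def rank(candidates: list, fb: MinorFeedback) -> list:
    return sorted(candidates, key=lambda c: adjusted_score(c, fb), reverse=True)

# Example: content the minor marked "makes me uncomfortable" drops in the ranking.
feed = rank(
    [Candidate("a", "acct1", "sports", 0.8),
     Candidate("b", "acct2", "dieting", 0.9)],
    MinorFeedback(topic_signals={"dieting": "makes_me_uncomfortable"}),
)
print([c.item_id for c in feed])  # ['a', 'b']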
(66) Examples of terms can be found in the Knowledge Package on Combating Drug Sales Online, which was developed as part of the EU Internet Forum and compiles more than 3 500 terms, emojis and slang expressions used by drug traffickers to sell drugs online - see reference in the EU Roadmap to fight drug trafficking and organised crime (COM(2023) 641 final).
(67) See Articles 27(1) and (3) of Regulation (EU) 2022/2065.
(68) UN Committee on the Rights of the Child General Comment No. 25, para 112; UNICEF. (2019). Discussion paper: Digital marketing and children’s rights. Available: https://www.unicef.org/childrightsandbusiness/media/256/file/Discussion-Paper-Digital-Marketing.pdf.
(69) This makes it difficult for them, for instance, to distinguish between commercial and non-commercial content, to resist peer pressure to buy in-game or in-app content that is attractive for minors or even necessary to progress in the game, or to understand the real currency value of in-app currencies, or that the most desirable content, such as upgrades, maps and avatars, may occur less frequently in randomised in-app or in-game purchases than less desirable content.
(70) M. Ganapini, E. Panai (2023) An Audit Framework for Adopting AI-Nudging on Children. Available: https://arxiv.org/pdf/2304.14338.
(71) The Commission recalls that, pursuant to Article 2(4) of Regulation (EU) 2022/2065, that Regulation is without prejudice to Directive 2010/13/EU, Union law on copyright and related rights, Regulation (EU) 2021/784, Regulation (EU) 2019/1148, Regulation (EU) 2019/1150, Union law on consumer protection and product safety (including Directive 2005/29/EC), Union law on the protection of personal data, Union law in the field of judicial cooperation in civil matters, Union law in the field of judicial cooperation in criminal matters and a Directive laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings. Further, it shall not affect the application of Directive 2000/31/EC. Under Article 91 of Regulation (EU) 2022/2065, the Commission is mandated to evaluate and report, by 17 November 2025, on the way that the Regulation interacts with other legal acts, in particular the acts referred to above.
(72) UNICEF provides resources and guidance for platforms related to the digital marketing ecosystem, including UNICEF. (2025). Discussion Paper on digital marketing and children’s rights. Available: https://www.unicef.org/childrightsandbusiness/workstreams/responsible-technology/digital-marketing.
(73) The Commission recalls that, for instance, traders are prohibited under Article 5(1) of Directive 2005/29/EC from committing unfair commercial practices, and that point 28 of Annex I of the Directive prohibits direct exhortation to children to buy advertised products or persuade their parents or other adults to do so. This commercial behaviour is in all circumstances considered unfair.
(74) Committee on the Rights of the Child’s General comment No. 25 (2021) on children’s rights in relation to the digital environment provides that the best interests of the child should be ‘a primary consideration when regulating advertising and marketing addressed to and accessible to children. Sponsorship, product placement and all other forms of commercially driven content should be clearly distinguished from all other content and should not perpetuate gender or racial stereotypes’.
(75) The Commission recalls that such AI systems could constitute prohibited practices under Article 5(1)(b) of Regulation (EU) 2024/1689, if they exploit vulnerabilities of children in a manner that causes or is reasonably likely to cause significant harm. Any measures taken according to this recommendation should go beyond measures taken to prevent the application of that prohibition. The supervision and enforcement of measures taken to comply with Article 50(1) of Regulation (EU) 2024/1689 remains the responsibility of the competent authorities under that Regulation.
(76) The Commission recalls that, according to Articles 6 and 7 of Directive 2005/29/EC, the disclosure of the commercial element must be clear and appropriate, taking into account the medium in which the marketing takes place, including the context, placement, timing, duration, language or target audience. See also the Guidance on the interpretation and application of Directive 2005/29/EC.
(77) The Commission recalls that Article 7(2) of, and point 22 of Annex I to, Directive 2005/29/EC prohibit falsely claiming or creating the impression that the trader is not acting for purposes relating to his trade, business, craft or profession, or falsely representing oneself as a consumer. It also recalls that Directive 2010/13/EU prohibits directly exhorting minors to buy or hire a product or service, encouraging them to persuade their parents or others to purchase the goods or services being advertised, and exploiting the special trust minors place in parents, teachers or other persons. According to recital 10 of Regulation (EU) 2022/2065, the Regulation should be without prejudice to Union law on consumer protection, including Directive 2005/29/EC concerning unfair business-to-consumer commercial practices in the internal market.
(78) The Commission also recalls that Directive 2010/13/EU provides that video-sharing platforms need to have a functionality allowing users to declare that uploaded content contains audiovisual commercial communications.
(79) The Commission recalls that the concept of virtual currency is defined in Directive (EU) 2018/843 on anti-money-laundering.
(80) The example used in this recommendation is immaterial with respect to any legal classification or definition of in-game currencies in existing Union law and/or any interpretation thereof related to the implications of the use of such instruments.
(81) The Commission recalls that Directive 2005/29/EC in its Annex I, point 20, prohibits describing a product as ‘gratis’, ‘free’, ‘without charge’ or similar if the consumer has to pay anything other than the unavoidable cost of responding to the commercial practice and collecting or paying for delivery of the item.
(82) As set out in Article 25 of Regulation (EU) 2022/2065. The Commission recalls that according to Article 25(2) the prohibition in Article 25(1) shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.
(83) The Commission recalls that Directive 2005/29/EC in its Annex I, point 7, prohibits falsely stating that a product will only be available for a very limited time, or that it will only be available on particular terms for a very limited time, in order to elicit an immediate decision and deprive consumers of sufficient opportunity or time to make an informed choice. Traders are thereby prohibited from using such false scarcity techniques.
(84) The Commission recalls that, in the case of games, under Articles 8 and 9 of Directive 2005/29/EC traders should not exploit behavioural biases or introduce manipulative elements relating to, e.g., the timing of offers within gameplay (offering micro-transactions during critical moments in the game) or the use of visual and acoustic effects to put undue pressure on the player.
(85) Trusted flaggers are entities with particular expertise and competence in detecting certain types of illegal content, and the notices they submit within their designated area of expertise must be given priority and processed by providers of online platforms without undue delay. The trusted flagger status is awarded by the Digital Services Coordinator of the Member State where the entity is established, provided that the entity has demonstrated its expertise, competence and independence from online platforms, as well as diligence, accuracy and objectivity in submitting notices.
(86) The Commission recalls that such AI systems could constitute prohibited practices under Article 5(1)(b) of Regulation (EU) 2024/1689, if they exploit vulnerabilities of children in a manner that causes or is reasonably likely to cause significant harm. Any measures taken according to this recommendation should go beyond measures taken to prevent the application of that prohibition. The supervision and enforcement of measures taken to comply with Article 50(1) of Regulation (EU) 2024/1689 remains the responsibility of the competent authorities under that Regulation.
(87) See Article 8 of Regulation (EU) 2022/2065.
(88) See section 6.5.2 of the present guidelines for information about how this information should affect the provider’s recommender systems.
(89) Any reference in the remainder of this Section to ‘complaint’ or ‘complaints’ includes any complaints that are brought against the provider’s assessment of the user’s age and any complaints that are brought against the decisions referred to in Article 20 of Regulation (EU) 2022/2065. Article 20 of Regulation (EU) 2022/2065 requires providers of online platforms to provide recipients of the service with access to an effective internal complaint-handling system against four types of decisions taken by the provider of the online platform. These are (a) decisions whether or not to remove or disable access to or restrict visibility of the information; (b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients; (c) decisions whether or not to suspend or terminate the recipients’ account; and (d) decisions whether or not to suspend, terminate or otherwise restrict the ability to monetise information provided by the recipients.
(90) Such as those that form part of the national Safer Internet Centres and INHOPE networks or other national child helplines such as https://childhelplineinternational.org/.
(91) UNICEF’s spotlight guidance on stakeholder engagement with children offers concrete steps on responsible child participation activities. UNICEF. (2025). Spotlight guidance on best practices for stakeholder engagement with children in D-CRIAs. Available: https://www.unicef.org/childrightsandbusiness/media/1541/file/D-CRIA-Spotlight-Guidance-Stakeholder-Engagement.pdf.
(92) This approach is in line with the Better Internet for Kids strategy (BIK+), which emphasises the importance of awareness and education in promoting online safety and supports the implementation of Regulation (EU) 2022/2065 in this respect. Furthermore, the Safer Internet Centres, established in each Member State, demonstrate the value of awareness-raising efforts in preventing and responding to online harms and risks.
(93) This training might cover, for example, children’s rights, risks and harms to minors’ privacy, safety and security online, as well as effective prevention, response and mitigation practices.
(94) An Coimisiún um Chosaint Sonraí. (2021). Fundamentals for a child-oriented approach to data processing. Available: https://www.dataprotection.ie/sites/default/files/uploads/2021-12/Fundamentals%20for%20a%20Child-Oriented%20Approach%20to%20Data%20Processing_FINAL_EN.pdf; Coimisiún na Meán. (2024). Online safety code. Available: https://www.cnam.ie/app/uploads/2024/11/Coimisiun-na-Mean-Online-Safety-Code.pdf; IMY (Swedish Authority for Privacy Protection). (2021). The rights of children and young people on digital platforms. Available: https://www.imy.se/en/publications/the-rights-of-children-and-young-people-on-digital-platforms/; Dutch Ministry of the Interior and Kingdom Relations. (2022). Code for children's rights. Available: https://codevoorkinderrechten.waag.org/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf; CNIL. (2021). CNIL publishes 8 recommendations to enhance protection of children online. Available: https://www.cnil.fr/en/cnil-publishes-8-recommendations-enhance-protection-children-online; Unabhängiger Beauftragter für Fragen des sexuellen Kindesmissbrauchs. (n.d.). Rechtsfragen Digitales. Available: https://beauftragte-missbrauch.de/themen/recht/rechtsfragen-digitales.
(95) CEN-CENELEC. (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework; OECD. (2021). Children in the digital environment – Revised typology of risks. https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html.
(96) The P2089.2™ Standard for Terms and Conditions for Children's Online Engagement provides processes and practices to develop terms and conditions that help protect the rights of children in digital spheres.
(97) The Commission also recalls the requirements for video-sharing platform providers to protect minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in Article 28b of Directive 2010/13/EU. These requirements are to be evaluated and, potentially, reviewed by 19 December 2026.
(98) As indicated in the Introduction of these guidelines, certain provisions of Regulation (EU) 2022/2065, including points (5) and (6) of Article 14, impose additional obligations on providers of very large online platforms (‘VLOPs’). To the extent that the obligations expressed therein also relate to the privacy, safety and security of minors within the meaning of Article 28(1), the present guidelines build on these provisions.
(99) For example, by publishing them in the Digital services terms and conditions database: https://platform-contracts.digital-strategy.ec.europa.eu/
(100) As indicated in the Scope of these guidelines (Section 2), certain provisions of Regulation (EU) 2022/2065 including Section 5 of Chapter III impose additional obligations on providers of very large online platforms (‘VLOPs’) and very large search engines (‘VLOSEs’). To the extent that the obligations expressed therein also relate to the privacy, safety and security of minors within the meaning of Article 28(1), the present guidelines build on these provisions, and VLOPs should not expect that adopting the measures described in the present guidelines, either partially or in full, suffices to ensure compliance with their obligations under Section 5 of Chapter III of Regulation (EU) 2022/2065.
ANNEX
5C Typology of online risks to children
1.
The OECD (1) and researchers (2) have classified the risks (3) that minors can encounter online, so that providers of online platforms accessible to minors, academia and policy makers can better understand and analyse them. This classification of risks is known as the 5Cs typology of online risks to children. It helps in identifying risks and comprises five categories of risk: content, conduct, contact, consumer and cross-cutting risks. These risks may manifest when appropriate and proportionate measures are not in place to ensure a high level of privacy, safety and security for minors on the service, potentially infringing a number of children’s rights.
2.
5C typology of online risks to children (4)

Risks for children in the digital environment

Risk categories:
| Content | Conduct | Contact | Consumer |

Risk manifestation:
| Hateful content | Hateful behaviour | Hateful encounters | Marketing risks |
| Harmful content | Harmful behaviour | Harmful encounters | Commercial profiling risks |
| Illegal content | Illegal behaviour | Illegal encounters | Financial risks |
| Disinformation | User-generated problematic behaviour | Other problematic encounters | Security risks |

Cross-cutting risks (applying across all categories): additional privacy, safety and security risks; advanced technology risks; risks on health and wellbeing; misuse risks.
3.
Content risks: Minors can be unexpectedly and unintentionally exposed to content that potentially harms them: a. hateful content; b. harmful content; c. illegal content; d. disinformation. These types of content are widely considered to have serious negative consequences for minors’ mental health and physical wellbeing, for example content promoting self-harm, suicide, eating disorders or extreme violence.
4.
Conduct risks: These refer to behaviours minors may actively adopt online and which can pose risks to both themselves and others, such as a. hateful behaviour (e.g. minors posting/sending hateful content/messages); b. harmful behaviour (e.g. minors posting/sending violent or pornographic content); c. illegal behaviour (e.g. minors posting/sending child sexual abuse material or terrorist content); and d. user-generated problematic behaviour (e.g. participation in dangerous challenges; sexting).
5.
Contact risks: These refer to situations in which minors are the victims of an interaction, rather than the actor: a. hateful encounters; b. harmful encounters (e.g. the encounter takes place with the intention to harm the minor); c. illegal encounters (e.g. encounters that can be prosecuted under criminal law); and d. other problematic encounters. Examples of contact risks include, but are not limited to, online grooming, online sexual coercion and extortion, sexual abuse via webcam, cyberbullying and trafficking in human beings for the purposes of sexual exploitation. These risks also extend to online fraud practices such as phishing, marketplace fraud and identity theft.
6.
Consumer risks: Minors can also face risks as consumers in the digital economy: a. marketing risks (e.g. loot boxes, advergames); b. commercial profiling risks (e.g. product placement or receiving advertisements intended for adults, such as for dating services); c. financial risks (e.g. fraud or spending large amounts of money without the knowledge or consent of their guardians); d. security risks; and e. risks related to the purchase and consumption of drugs, medicines, alcohol and other illegal or dangerous products. Consumer risks also include risks related to contracts, for example the sale of users’ data or unfair terms and conditions.
7.
Cross-cutting risks: These are risks that cut across all risk categories and are considered highly problematic as they may significantly affect minors’ lives in multiple ways. They are:
(a) Advanced technology risks involve minors encountering new dangers as technology develops, such as AI chatbots that might provide harmful information or be used for grooming by exploiting vulnerabilities, or the use of biometric technologies that can lead to abuse, identity fraud and exclusion.
(b) Health and wellbeing risks include potential harm to minors' mental, emotional or physical well-being, for example increased obesity/anorexia and mental health issues linked to the use or excessive use of online platforms, which may in some cases result in negative impacts on minors’ physical and mental health and wellbeing, such as addiction, depression, anxiety disorders, disrupted sleep patterns and social isolation.
(c) Additional privacy and data protection risks stem from access to information about minors and the danger of geolocation features that predators could exploit to locate and approach minors.
8.
Other cross-cutting risks (5) can also include:
(a) Additional safety and security risks, relating to minors’ safety, particularly physical safety, as well as all cybersecurity issues.
(b) Misuse risks, relating to risks or harms to minors stemming from the misuse of the online platform or its features.
(1) OECD. (2021). Children in the digital environment – Revised typology of risks. https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html.
(2) Livingstone, S., & Stoilova, M. (2021). The 4Cs: Classifying Online Risk to Children. (CO:RE Short Report Series on Key Topics). Hamburg: Leibniz-Institut für Medienforschung | Hans-Bredow-Institut (HBI); CO:RE – Children Online: Research and Evidence. https://doi.org/10.21241/ssoar.71817.
(3) See also the risk analysis provided by the Bundeszentrale für Kinder- und Jugendmedienschutz (BZKJ). (2022). Gefährdungsatlas. Digitales Aufwachsen. Vom Kind aus denken. Zukunftssicher handeln. Aktualisierte und erweiterte 2. Auflage. Bundeszentrale für Kinder- und Jugendmedienschutz. Available: https://www.bzkj.de/resource/blob/197826/5e88ec66e545bcb196b7bf81fc6dd9e3/2-auflage-gefaehrdungsatlas-data.pdf.
(4) OECD. (2021). Children in the digital environment - Revised typology of risks. p.7. https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html.
(5) Livingstone, S., & Stoilova, M. (2021). The 4Cs: Classifying Online Risk to Children. (CO:RE Short Report Series on Key Topics). Hamburg: Leibniz-Institut für Medienforschung | Hans-Bredow-Institut (HBI); CO:RE – Children Online: Research and Evidence. https://doi.org/10.21241/ssoar.71817.
ELI: http://data.europa.eu/eli/C/2025/5519/oj
ISSN 1977-091X (electronic edition)