CALL FOR EVIDENCE FOR AN INITIATIVE (without an impact assessment)
|
|
This document aims to inform the public and stakeholders about the Commission’s work, so they can provide feedback and participate effectively in consultation activities. We ask these groups to provide views on the Commission’s understanding of the problem and possible solutions, and to give us any relevant information they may have.
|
|
Title of the initiative |
Digital Services Act – guidelines to enforce the protection of minors online |
|
Lead DG – responsible unit |
DG CNECT – CNECT F2 |
|
Likely Type of initiative |
EU regulation guidelines |
|
Indicative Timing |
Q2-2025 |
|
Additional Information |
– |
|
A. Political context, problem definition and subsidiarity check |
|
Political context |
|
The Digital Services Act (DSA) is a groundbreaking rulebook that entered into full effect in February 2024. It aims to make the online world safer, fairer and easier to navigate. To ensure a high level of privacy, safety and security for children, the DSA weaves a tight net around online platforms, obliging them to take appropriate and proportionate measures. Platforms are required to set up mechanisms through which users can flag illegal content. The DSA obliges them to explain their terms and conditions clearly so that young users can understand them. It also bans dark patterns, which are interfaces that deceive or manipulate users, and prohibits all online platforms from presenting advertisements to children based on profiling. Every year, the DSA also obliges the largest platforms and search engines, those with more than 45 million monthly users in the EU, to identify potential risks their services may pose to minors and find ways to mitigate them proportionately and effectively. If platforms do not follow the rules, the Commission or the Member States (through the national Digital Services Coordinators) have strong supervision and enforcement powers. The Commission can, for example, fine the biggest platforms up to 6% of their annual turnover.
|
Problem the initiative aims to tackle |
|
Today’s digital world plays a huge role in children’s lives, and they enjoy spending time online to connect and share content with friends, watch videos, play games, be creative and learn new things. Although being online brings many new opportunities for entertainment, learning and connecting with friends, it also means that children are confronted with complex risks.

To assist online platforms in complying with the DSA’s requirement to provide a high level of privacy, safety and security for minors, and to ensure a harmonised implementation of the rules in all EU countries, the Commission intends to issue guidelines on the protection of minors online. These guidelines will provide a non-exhaustive list of good practices and recommendations for online platform providers to help them mitigate risks related to the protection of minors and ensure a high level of privacy, safety and security for children. The guidelines will also help the Commission and the Digital Services Coordinators to supervise platforms and to enforce the DSA.

The guidelines will apply to all online platforms, including those that are aimed at adults (such as adult entertainment platforms) but that still have underage users due to inadequate or non-existent age-verification tools.

The guiding principle of this work is the rights of the child, and the best interest of the child should be a central consideration when designing and deploying online platforms’ products, services and policies.

In line with the general approach of the DSA, the Commission proposes that the guidelines take a risk-based approach to online harm. This means that online platforms that are accessible to minors should regularly conduct a child-specific impact assessment structured around the ‘5C’ typology of risks, namely risks to minors from content, conduct, contact and consumers, as well as cross-cutting risks. Identified risks should be addressed in a reasonable, proportionate and effective manner.
Measures to keep minors safe will vary from one online platform to another due to their different nature. The guidelines will take into consideration the Commission’s ongoing work in developing a harmonised age-verification solution based on the EU digital identity wallet that will be used to verify whether a user is 18 or older.

The Commission invites stakeholders to provide their input, in particular on the points below.

1. Do you have comments or further considerations that should feed into the scope set out above?
2. Please describe any major risks and concerns related to ensuring a high level of privacy, safety and security for minors online.
3. The Commission proposes to apply the 5C typology of risks to develop good practices that platforms should adopt to ensure a high level of privacy, safety and security for minors online. Such good practices should build on assessing how every platform’s features, design, functioning and use might have an impact on the 5C risks. These could include the following factors:
· content moderation systems;
· the design of any algorithmic system;
· account set-up and age-appropriate default settings;
· systems for selecting and presenting advertising;
· commercial practices;
· data management practices;
· age assurance and verification processes;
· product features (e.g. parental controls, helpline support).
Do you agree with this approach? What additional factors should be assessed in a child impact assessment? Do you recommend any methodology, metrics, structural indicators and/or thresholds that should be applied and included in a child impact assessment?
4. Please provide good practices or recommendations addressing risks in the 5C typology for the factors listed above. Please refer to any existing documentation, research or resources that could help endorse or validate the good practices proposed.
Input to the call for evidence will feed into the Commission’s drafting of the guidelines on the protection of minors online, which will be open for public consultation throughout 2025. The guidelines will be used to enforce the DSA.
|
Basis for EU action (legal basis and subsidiarity check) |
|
Legal basis |
|
The legal basis for EU action is set out in Regulation (EU) 2022/2065 (Digital Services Act), in particular Article 28(4). Article 28(1) states that providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors. Article 28(4) states that the Commission, after consulting the European Board for Digital Services, may issue guidelines to assist online platform providers in applying Article 28(1).
|
Practical need for EU action |
|
Action at EU level is needed to ensure the required level of consistency and harmonisation in applying and enforcing Article 28. This is because the Commission, the national Digital Services Coordinators and other national authorities are collectively responsible for supervising, enforcing and monitoring the DSA.
|
B. What does the initiative aim to achieve and how |
|
This call for evidence aims to support the Commission in drafting guidelines under the DSA. It opens the scoping process to the wider community of stakeholders in order to gather their input on the scope, approach and content of the guidelines.
|
Likely impacts |
|
The planned guidelines will make the online lives of children and young people safer, and they will ensure a well-functioning, consistent and harmonised enforcement and application of Article 28(1) of the DSA across all EU countries, in full respect of the rights of minors. The initiative will result in a more consistent approach by online platforms in complying with the rules on the protection of minors online.
|
Future monitoring |
|
The Commission will continue to carry out its tasks in supervising and enforcing the compliance of very large online platforms and search engines with their obligations for protecting minors under Articles 28, 34 and 35, in line with its general powers under the DSA. The guidelines’ effectiveness will be assessed in 2027. The guidelines may also be subject to further regulatory work and improvement based on changes to online risks, online platform practices, technological developments and scientific research.
|
C. Better regulation |
|
Impact assessment |
|
No impact assessment is planned.
|
Consultation strategy |
|
Stakeholders are invited to give their views and input through this call for evidence, in particular by responding to the points listed in the section above, ‘Problem the initiative aims to tackle’. In addition, the Commission will organise outreach activities to gather stakeholders’ feedback during 2024. It will also consult the Special group on age-appropriate design and the Safer Internet for Kids expert group. The Commission will also continue coordinating with Member States through the European Board for Digital Services’ recently formed Protection of Minors Working Group, into which the Task Force on Age Verification has been integrated. It will also work closely with the European Centre for Algorithmic Transparency and the Safer Internet Centres.
|
Why we are consulting
|
This call for evidence will ensure that the Commission takes account of the perspectives of stakeholders, especially young people, by transparently gathering their views, arguments and underlying information and analysis on the risks, gaps and measures required at EU level to protect minors online.
|
Target audience |
|
The consultations are addressed to a wide range of public and private stakeholders, including:
· national authorities, particularly Digital Services Coordinators and other competent authorities (e.g. media regulators), and relevant ministries;
· research organisations and academia;
· international organisations;
· civil society organisations, including children’s rights organisations;
· professionals working in the field of minors’ health (psychologists, psychiatrists, sociologists, etc.);
· all online platforms, including providers of very large online platforms and very large online search engines;
· industry associations;
· children and young people, teachers and parents (or legal representatives).

Any consultation targeting children and young people will be accompanied by child-friendly descriptions and safeguards for their participation.