
EESC 2022/02804

OJ C 486, 21.12.2022, p. 133–138 (BG, ES, CS, DA, DE, ET, EL, EN, FR, GA, HR, IT, LV, LT, HU, MT, NL, PL, PT, RO, SK, SL, FI, SV)



Opinion of the European Economic and Social Committee on the Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse

(COM(2022) 209 final — 2022/0155 (COD))

and on the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions — A Digital Decade for children and youth: the new European strategy for a better internet for kids (BIK+)

(COM(2022) 212 final)

(2022/C 486/18)

Rapporteur: Veselin MITOV

Referral: Council of the European Union, 22.7.2022; European Parliament, 12.9.2022; European Commission, 28.6.2022

Legal basis: Article 114 of the Treaty on the Functioning of the European Union; Article 304 of the Treaty on the Functioning of the European Union

Section responsible: Section for the Single Market, Production and Consumption

Adopted in section: 8.9.2022

Adopted at plenary: 21.9.2022

Plenary session No: 572

Outcome of vote (for/against/abstentions): 233/0/1

1.   Conclusions and recommendations

1.1.

The European Economic and Social Committee (EESC) welcomes the proposal for a regulation laying down rules to prevent and combat child sexual abuse (1) and the strategy entitled A Digital Decade for children and youth: the new European strategy for a better internet for kids (2). These texts are timely: children are using the internet at an ever earlier age and almost daily, and Europol has noted increasing demand for child sexual abuse material.

1.2.

It supports the educational dimension of the strategy, as it is essential to strengthen skills, digital literacy and awareness of the use of personal data so that all children, whatever their circumstances, can make informed use of the internet and protect themselves from potential dangers.

1.3.

Training legal guardians and the responsible adults in settings such as schools, educational institutions and sports clubs is also crucial, as many adults lack the necessary skills. The EESC welcomes the Commission's intention to organise media literacy campaigns for children and their legal guardians through the aforementioned networks and multipliers. It strongly encourages extending these campaigns to other organised civil society organisations, which in some Member States have long-standing grassroots and front-line experience, in order to increase the campaigns' impact and develop creative solutions. These organisations should therefore also receive financial support for their activities.

1.4.

The Committee supports the principle of the proposed regulation, but has reservations about the proportionality of the measures envisaged and the risk of infringing the presumption of innocence. The proposal would oblige technology companies to scan messages, photos and videos posted online in order to detect possible child abuse and then, in the event of ‘certainty’ and after the fact, to involve a coordination authority appointed by the Member State. That authority would be empowered to ask a national court or an independent administrative authority to issue a detection order.

1.5.

Combating child pornography online is legitimate and necessary, but imposing a private prima facie detection system for a certain type of content, however illicit, illegal and dangerous that content may be, poses a risk of widespread monitoring of all virtual exchanges.

1.6.

The proposed regulation stipulates that companies are to detect language patterns associated with child sexual abuse, using artificial intelligence to analyse exchanges in which adults groom children. However, as our everyday digital lives show, algorithmic scans are not infallible. It is vitally important that they be used cautiously and in accordance with a set of rules.

1.7.

The EESC's aim is to safeguard the interests of all, including secrecy of correspondence and respect for privacy, which are constitutional requirements (3). However, a general sweep of hosting and communication services poses a specific risk to end-to-end encryption of online exchanges. The EESC therefore asks the Commission to tighten and clarify the text so as to safeguard secrecy of correspondence and respect for privacy.

1.8.

The EESC supports the creation of a new European agency, whose responsibilities include two key areas: an operational hub and a research and analysis hub; due to its international dimension, the fight against child pornography and paedophilia online calls for coordination of operational activities and analyses.

1.9.

The EESC would welcome Eurojust's involvement in the structure envisaged by the Commission, as coordinated investigations mean coordinated judicial inquiries.

2.   Overview of the Commission's planned measures

2.1.

On 11 May 2022, the Commission presented a proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse (4), and a strategy entitled A Digital Decade for children and youth: the new European strategy for a better internet for kids (BIK+) (5).

2.2.

The package is based on the European Parliament resolution of 26 November 2019 on children's rights on the occasion of the 30th anniversary of the UN Convention on the Rights of the Child (6), the Council conclusions on media literacy and the Council Recommendation of 14 June 2021 establishing a European Child Guarantee (7).

The strategy set out in the communication

2.3.

The 2012 European Strategy for a Better Internet for Children played a central role in the online protection and empowerment of children, partly thanks to the network of Safer Internet Centres and the betterinternetforkids.eu portal. However, it has since become obsolete as children now begin using smartphones and computers at an earlier age and use them more often. They are also increasingly dependent on them for school or leisure activities.

2.4.

The COVID-19 pandemic and the accompanying lockdowns have highlighted the challenge of training children, teachers and educators on the potential dangers of the internet. According to Europol, demand for child sexual abuse material increased by up to 25 % in some Member States. Reports of children targeted by sexual grooming increased by more than 16 % between 2020 and 2021.

2.5.

The strategy proposed by the Commission in May 2022 is based on three pillars:

protecting children from illegal and dangerous online content and improving their well-being online;

digital empowerment, so that children acquire the skills needed to navigate online and express themselves in a safe and responsible manner;

active participation of children, giving them the right to have their say and offering more innovative activities to improve their online experience.

2.6.

It builds on the wide-ranging #DigitalDecade4YOUth consultation (8), organised from March to August 2021 by European Schoolnet with the support of the Insafe network of European Safer Internet Centres and complemented by further broad consultations, and is based on the right of children to be heard in any decision-making process that affects them (9).

2.7.

It was complemented by a Better Internet for Kids MOOC for teachers, held in April and May 2021 on the topic ‘Digital literacy and online safety: how the pandemic tested our skills’.

2.8.

In addition, EU citizens (parents, teachers, educators, etc.) had the opportunity to respond to an online survey based on the same questions as the #DigitalDecade4YOUth consultation.

2.9.

The conclusions of the 2020 EU Kids Online survey (10) show that the majority of children use their digital devices almost daily, begin to do so at an earlier age and spend more and more time on them.

2.10.

The COVID-19 pandemic and the lockdown have highlighted the challenges of digital education for children and adult legal guardians (see the Digital Education Action Plan 2021-2027 (11)).

2.11.

Indeed, the information collected shows that children are frequently exposed to harmful, illicit or even illegal content, behaviour and contacts. The use of social media or interactive games entails risks such as exposure to unsuitable content, harassment, grooming and sexual abuse.

2.12.

According to Europol (12), in the first months of the COVID-19 crisis demand for child sexual abuse material increased by up to 25 % in some Member States. The National Centre for Missing and Exploited Children in the United States received almost 30 million reports of suspected sexual exploitation of children in 2021, and law enforcement officers were alerted to more than 4 000 new child victims. Reports of children subject to grooming increased by more than 16 % between 2020 and 2021. Children with disabilities are especially targeted: up to 68 % of girls and 30 % of boys with a mental or physical impairment will be victims of sexual violence before their 18th birthdays (13).

2.13.

However, the mechanisms for age verification and parental consent under existing EU legislation (the Audiovisual Media Services Directive (AVMSD) and the General Data Protection Regulation (GDPR)) often lack effectiveness, as users are usually only required to declare their date of birth when creating an online profile.
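
By way of illustration, the following minimal sketch (Python; the age threshold and names are purely hypothetical) shows the self-declaration model criticised here: the service merely compares a user-supplied date of birth against a minimum age, and nothing prevents a child from simply entering an earlier date.

```python
from datetime import date

MINIMUM_AGE = 13  # assumed platform threshold; actual values vary by service

def is_old_enough(claimed_birth_date: date, today: date) -> bool:
    """Return True if the *claimed* date of birth meets the minimum age."""
    age = today.year - claimed_birth_date.year - (
        (today.month, today.day)
        < (claimed_birth_date.month, claimed_birth_date.day)
    )
    return age >= MINIMUM_AGE

# A ten-year-old who truthfully enters their date of birth is refused,
# but the same child passes unchallenged by typing an earlier year:
print(is_old_enough(date(2012, 5, 1), today=date(2022, 9, 21)))  # False
print(is_old_enough(date(2002, 5, 1), today=date(2022, 9, 21)))  # True
```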

2.14.

The proposed regulation imposes an obligation on providers of online hosting or communication services to detect, report and remove any online material relating to child sexual abuse.

2.15.

It also provides for the creation of a European agency to prevent and combat child sexual abuse, to facilitate the detection, reporting and removal of child sexual abuse content online, to provide support to victims and to establish a centre of knowledge, expertise and research on preventing and combating child sexual abuse.

General comments on the proposed regulation

2.16.

The proposed regulation is based on risk assessment and mitigation obligations incumbent on internet hosting and interpersonal communication service providers, which apply before a detection order can be issued by a national court or an independent administrative authority appointed by the Member State.

2.17.

The EESC supports the principle of the initiative, which complements existing measures and makes them more effective by introducing penalties for internet hosting and interpersonal communication service providers, making them responsible for identifying prima facie child abuse photos and videos.

2.18.

However, it has reservations regarding the risks to privacy and to the encryption of conversations. Surveillance of online exchanges by private operators and the possibility of unfounded accusations could undermine the presumption of innocence.

3.   Specific comments

The educational component of the strategy

3.1.

Educating children and their legal guardians about the use of social media and other digital tools is fundamental. Children often use digital products and services designed for adults, where targeted marketing techniques and algorithms can encourage them to open content that exploits their naivety and limited knowledge of digital tools, or even lead them into contact with dangerous people hiding behind gaming applications or other tools used by children.

3.2.

Often, neither children nor parents realise the extent of the personal data they share on social media. Digital skills and literacy, and awareness of the use of personal data, are essential for children to make informed use of the virtual world.

3.3.

Parents, educators, teachers and the responsible adults at clubs, leisure facilities, etc. also need these skills in order to be able to guide children.

3.4.

The EESC considers this educational aspect to be important in order to protect children in their digital lives and empower them in the virtual world.

3.5.

After all, many teachers, parents and educators do not have the requisite skills, and it is difficult for them to keep abreast of technological developments.

These training courses should also include a module on children's rights online as children's rights are identical both on- and offline.

3.6.

This aspect of the strategy must be based on close cooperation at European and international level, on strengthening work with organised civil society, and above all with schools.

It is essential that national curricula include practical and compulsory courses on online navigation and its risks, while being inclusive and respectful of diversity in general and of accessibility in particular.

3.7.

The EESC welcomes the Commission's intention to organise media literacy campaigns for children and their legal guardians, through the aforementioned networks and multipliers. It strongly encourages extending them to include other organised civil society organisations, in order to increase their impact and develop creative solutions, as in some Member States they have long-standing grassroots and front-line experience. The EESC considers that education plays a key role here: it is the other side of the coin of prevention of child sexual abuse.

3.8.

The EESC agrees with the Commission that children must be encouraged to take part in strategic debates which concern them by granting them the right to meet and to associate on online social platforms and involving them in the process of shaping the digital strategy. It therefore welcomes the creation of the new European platform promoting the participation of children and calls for children to be listened to and not merely heard.

4.   The penalties laid down in the proposed regulation

4.1.

Until now, internet hosting and interpersonal communication service providers have detected illegal content on a voluntary basis. Implicitly regretting the failure of self-regulation, the Commission proposes to force them to act, failing which they will be penalised following an investigation by the national authorities. In practice, this means that providers and hosts will have to scan all traffic passing through their servers prima facie.

4.2.

The EESC understands that the Commission's intention is not to deprive people of the ability to keep their correspondence confidential, but is concerned that intrusive technologies, if not well designed and subject to suitable rules, could be used inappropriately and undermine the protection of privacy. The goal is twofold: to use technology to prevent child sexual abuse while avoiding widespread surveillance of correspondence by private operators using algorithms.

4.3.

The EESC considers that rules must be established regarding the development, testing and use of algorithms, encouraging or even requiring stakeholders to shape relevant and effective algorithm governance in order to ensure that their tools work properly. It also considers that explainability methods must provide a better understanding of these tools, with a view to highlighting bias and malfunctions and thus correcting problems before users pay the price for them.
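
As an illustration of the kind of pre-deployment testing the EESC has in mind, the sketch below (Python; the function names and toy data are hypothetical) measures how often a detection tool flags innocent content or misses illegal content on a labelled validation set before it is allowed to run against real users.

```python
def audit_detector(detector, validation_set):
    """Evaluate a detector on (item, is_abusive) pairs with ground-truth labels."""
    tp = fp = tn = fn = 0
    for item, is_abusive in validation_set:
        flagged = detector(item)
        if flagged and is_abusive:
            tp += 1
        elif flagged and not is_abusive:
            fp += 1  # an innocent user would have been reported
        elif not flagged and is_abusive:
            fn += 1  # abusive content would have slipped through
        else:
            tn += 1
    return {
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
        "false_positive_rate": fp / (fp + tn) if fp + tn else 0.0,
    }

# Toy example: a naive detector that flags any message containing "secret"
toy_detector = lambda msg: "secret" in msg
toy_set = [("keep this secret", True), ("our secret recipe", False), ("hello", False)]
print(audit_detector(toy_detector, toy_set))
# -> {'precision': 0.5, 'recall': 1.0, 'false_positive_rate': 0.5}
```

Base rates matter here: if genuinely abusive messages are rare (say one in 100 000), even a false-positive rate of 0.1 % produces roughly a hundred unfounded reports for every true one, which is precisely the kind of dysfunction such testing should expose before deployment.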

4.4.

The EESC notes that the proposal provides for coordinated investigations under the supervision of the national courts, but urges the Commission to improve the proposal to safeguard individual freedoms.

4.5.

The EESC stresses the importance of a balanced approach, since the system envisaged consists of analysing messages, photos or videos to detect possible child abuse and, in the event of ‘certainty’, alerting the competent authorities. It is necessary for the alert mechanism to take account of the necessity, effectiveness, proportionality and balance of the envisaged measures.

4.6.

The Committee recalls that in the Schrems I case (14), the Court of Justice ruled that legislation allowing public authorities generalised access to the content of communications undermines the essence of the right to privacy guaranteed by the Charter of Fundamental Rights. Scanning content stored on a server, which entails analysing all communications passing through that server, therefore raises questions.

4.7.

The responsibility for tracing will lie with platforms and/or social media, which may be required to track child abuse material, failing which they will face fines of up to 6 % of their worldwide turnover. They are expected to use artificial intelligence to detect language patterns so that exchanges in which adults groom children can be blocked. Proper use of algorithmic scanners is therefore essential to avoid errors giving rise to unfounded accusations, as such scanners are not infallible.
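
The proposal prescribes no particular technology. Purely as a hedged sketch (Python, all names hypothetical), detection of grooming language in practice reduces to a model score compared against a threshold, so the ‘certainty’ referred to in point 4.5 is in reality a probability cut-off:

```python
from typing import Callable

def make_flagger(score_model: Callable[[str], float], threshold: float = 0.98):
    """score_model returns an estimated probability that a message is grooming."""
    def flag(message: str) -> bool:
        # Above the cut-off the message would be reported; below it, ignored.
        return score_model(message) >= threshold
    return flag

# Toy scoring model standing in for a trained classifier:
toy_model = lambda msg: 0.99 if "meet me" in msg.lower() else 0.05
flag = make_flagger(toy_model)
print(flag("Meet me after school, don't tell your parents"))  # True
print(flag("See you at practice tomorrow"))                   # False
```

Raising the threshold means fewer unfounded reports but more missed cases; lowering it does the opposite. That trade-off is a policy choice, not a technical constant, which is why rules on how it is set matter.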

4.8.

The Commission also plans to oblige platforms to assess the risk of their services being used to disseminate child sexual abuse images or for grooming, and to promote a ‘comprehensive EU code of conduct on age-appropriate design’ (15), whose general terms leave the EESC perplexed.

4.9.

It specifies that Member States will have to designate an independent authority to monitor platforms' fulfilment of their obligations. Where appropriate, that authority will be empowered to request that a national court or independent administrative authority issue a detection order, limited in time and targeting a specific type of content on a given service, requiring the company concerned to search for any content relating to child sexual abuse or grooming.

4.10.

If these independent authorities considered that a service was too risky for children, they could ask internet hosting and interpersonal communication service providers to scan their content and exchanges for a specific period of time. The EESC would prefer the application of this mechanism to be placed under the effective prior control of a national court, as the guardian of individual freedoms, and considers that the compatibility of this type of order with the Charter of Fundamental Rights is open to question.

4.11.

If the proposal were adopted as it stands, the regulation would force technology companies to monitor their platforms using algorithms to detect child abuse.

4.12.

While the aim is commendable, the EESC believes there is a risk of undermining respect for online private correspondence, the right to privacy and the protection of personal data, and that every effort must be made to avoid this.

4.13.

The aim of the EESC is to safeguard the interests of all people, some of which are constitutional requirements, such as secrecy of correspondence and respect for privacy (16).

4.14.

It stresses that the prospect of an obligation to carry out a general sweep of hosting and communication services poses a risk to all technologies that preserve the secrecy of correspondence, starting with end-to-end encryption.
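
A simplified illustration of why such a sweep and end-to-end encryption are at odds (Python, using the third-party cryptography package's symmetric Fernet scheme as a stand-in for real end-to-end protocols such as Signal's): once content is encrypted on the sender's device, the relaying provider holds only ciphertext, so any server-side scan either sees nothing or requires the encryption to be weakened.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in true E2EE, known only to the two endpoints
endpoint = Fernet(key)

ciphertext = endpoint.encrypt(b"private message")  # leaves the sender's device

# The provider relays `ciphertext` but has no key: a content scan at the
# server is impossible without breaking or bypassing the encryption.
assert b"private message" not in ciphertext

plaintext = endpoint.decrypt(ciphertext)  # only an endpoint holding the key
assert plaintext == b"private message"
```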

4.15.

Indeed, the Commission acknowledges that finding exchanges where adults engage in grooming is ‘generally speaking the most intrusive one for users’ since it requires ‘automatically scanning through texts in interpersonal communications’.

5.   The creation of a new European agency

5.1.

The proposed regulation provides for the creation of an independent European agency, with a budget of EUR 26 million and based alongside Europol in The Hague, which will be responsible for analysing reports of illegal material, coordinating databases of digital fingerprints of illegal material and helping companies to identify reliable technologies.

It would also act as an intermediary between technology companies, law enforcement authorities and victims.
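
A minimal sketch of how fingerprint-database matching works (Python; the hash list is hypothetical, and plain SHA-256 is used only for brevity, whereas deployed systems rely on perceptual hashes such as Microsoft's PhotoDNA that also match re-encoded or resized copies):

```python
import hashlib

# Hypothetical coordinated hash list of known illegal material:
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Compute the digital fingerprint (here, a SHA-256 digest) of an upload."""
    return hashlib.sha256(content).hexdigest()

def matches_known_material(content: bytes) -> bool:
    """True if the upload's fingerprint appears in the coordinated database."""
    return fingerprint(content) in KNOWN_FINGERPRINTS

print(matches_known_material(b"test"))           # True: this digest is listed
print(matches_known_material(b"anything else"))  # False: unknown content
```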

5.2.

The EESC welcomes the fact that the agency's competences would be organised around two key areas, an operational hub and a research and analysis hub, as combating child pornography and paedophilia online calls for coordinated operations and analysis.

5.3.

The operational aspect is fundamental and justifies close cooperation with Europol, whose effectiveness is no longer in question. The European and international scale of online sexual crime against children truly warrants its creation.

5.4.

The EESC would welcome Eurojust's operational involvement in the structure envisaged by the Commission, as coordinated investigations mean coordinated judicial inquiries.

Brussels, 21 September 2022.

The President of the European Economic and Social Committee

Christa SCHWENG


(1)  COM(2022) 209 final.

(2)  COM(2022) 212 final.

(3)  La constitutionnalisation du droit au respect de la vie privée (available in French only).

(4)  COM(2022) 209 final — 2022/0155 (COD).

(5)  COM(2022) 212 final.

(6)  OJ C 232, 16.6.2021, p. 2.

(7)  OJ L 223, 22.6.2021, p. 14.

(8)  https://europa.eu/!XXv6kx

(9)  Article 12 of the United Nations Convention on the Rights of the Child.

(10)  EU Kids Online.

(11)  COM(2020) 624 final.

(12)  https://europa.eu/!Jh78ux

(13)  Children with disabilities.

(14)  Case C-362/14, paragraph 94.

(15)  COM(2022) 212 final.

(16)  La constitutionnalisation du droit au respect de la vie privée (available in French only).

