EUROPEAN COMMISSION
Brussels, 11.5.2022
SWD(2022) 209 final
COMMISSION STAFF WORKING DOCUMENT
IMPACT ASSESSMENT REPORT
Accompanying the document
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
laying down rules to prevent and combat child sexual abuse
{COM(2022) 209 final} - {SEC(2022) 209 final} - {SWD(2022) 210 final}
Contents
1.Introduction: political and legal context
2.Problem definition
2.1What is the problem?
2.2What are the problem drivers?
2.3How likely is the problem to persist?
3.Why should the EU act?
3.1Legal basis
3.2Subsidiarity: necessity of EU action
3.3Subsidiarity: added value of EU action
4.Objectives: What is to be achieved?
4.1General objective
4.2Specific objectives
5.What are the available policy options?
5.1What is the baseline from which options are assessed?
5.2Description of the policy options
5.3Measures discarded at an early stage
6.What are the impacts of the policy options?
6.1Qualitative assessment
6.2Quantitative assessment
7.How do the options compare?
7.1Qualitative comparison
7.2Quantitative comparison
8.Preferred option
8.1Main advantages
8.2Main disadvantages
8.3Trade-Offs
8.4Application of the ‘one in, one out’ approach
9.How will actual impacts be monitored and evaluated?
ANNEXES
Term/Acronym | Definition
AI | Artificial Intelligence
API | Application Programming Interface
Classifiers | A form of artificial intelligence: an algorithm that sorts data into labelled classes or categories
CSA | Child Sexual Abuse
CSA online | Child sexual abuse online: text-based exchanges, photos, videos and other material illegal under EU law (CSA Directive). In this document it refers to the three main types of abuse: known CSAM, new CSAM and grooming
CSA Directive | Directive 2011/93/EU of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography
CSAM | Child Sexual Abuse Material, e.g. images and videos
CSEA | Child Sexual Exploitation and Abuse
Darkweb | Websites not indexed by conventional search engines, making use of masked IP addresses, which are only accessible with a special web browser
DSA | Digital Services Act: Proposal for a Regulation on a Single Market for Digital Services and amending Directive 2000/31/EC, COM(2020) 825 final
E2EE | End-to-end Encryption
EECC | Directive (EU) 2018/1972 of 11 December 2018 establishing the European Electronic Communications Code
E-evidence | Electronic evidence: electronically stored data such as subscriber information, metadata or content data
Encryption | Process of changing electronic information or signals into a secret code or cipher
Grooming | Offenders building trust and a relationship with a child in an effort to gain access to the minor for sexual exploitation or abuse. Also known as solicitation
Hash | A unique digital code created by a mathematical algorithm (“hashing”) that becomes a file’s signature, or hash value
Hotline | Child sexual abuse hotlines deal with questions about or reports of child sexual abuse. They can report content to law enforcement, take action for CSAM to be removed from the internet and act as interest groups
IP address | Internet Protocol address: a unique identifier allowing a device to send and receive packets of information; a basis for connecting to the Internet
ISCO | International Standard Classification of Occupations
Malware | Any type of software designed to disrupt the normal functioning of a computer, server, or computer network
NCMEC | National Centre for Missing and Exploited Children (US private, non-profit organisation), to which online service providers are required under US law to report instances of potential child sexual abuse that they find in their networks
OTTs | Over-the-Top communications services: services enabling direct interpersonal and interactive exchange of information via electronic communications (i.e. the Internet), without connecting to the public telephone network
P2P | Peer-to-peer sharing: networks in which each computer can act as a server, allowing files to be shared directly without the need for a central server
PhotoDNA | The most widely used tool based on hashing technology, available free of charge under a licensing agreement tailored to avoid abuse and use for any purpose other than the detection of CSA
Safety-by-design | The embedding of the rights and safety of users into the design and functionality of online products and services from the outset
SDGs | Sustainable Development Goals: a set of 17 interlinked goals established by the UN in 2015 as "a blueprint to achieve a better and more sustainable future for all people and the world by 2030"
SMEs | Small and medium-sized enterprises: enterprises with a staff headcount below 250 and either an annual turnover not exceeding EUR 50 million or an annual balance sheet total not exceeding EUR 43 million
Trusted flagger program | A program under which an organisation designates certain persons or organisations whose reports of online CSA are trusted to meet sufficiently high standards, and may be treated differently, for example by being given higher priority for review
URL | Uniform Resource Locator, i.e. the address of an internet object (e.g. an image, a video, or an entire website)
1.Introduction: political and legal context
Children face a number of risks in their daily lives, both online and offline, from which they cannot fully protect themselves. One of these risks is that of being sexually abused during childhood. The initiative assessed here aims to complement the existing EU framework by defining the responsibilities of certain online service providers to protect children against sexual abuse. In the absence of harmonised rules at EU level, providers of social media platforms, gaming services, and other hosting and online communications services find themselves faced with divergent rules across the internal market. National rules continue to proliferate, as recent legislative changes in the Netherlands and Germany show, while at the same time there is evidence that current efforts at national level are insufficient to address the underlying problem successfully.
Children have the fundamental right to such protection and care as is necessary for their well-being, and their best interests must be a primary consideration in all actions relating to them. Consequently, the fight against child sexual abuse (CSA) is a priority for the EU. In the July 2020 EU strategy for a more effective fight against child sexual abuse, the Commission set out eight concrete actions aimed at implementing and developing the right legal framework and at catalysing multi-stakeholder efforts in relation to the prevention and investigation of these crimes and assistance to victims and survivors.
The legislative proposal that this impact assessment accompanies responds to the commitment undertaken in the strategy to propose the necessary legislation to tackle child sexual abuse effectively, online and offline. In particular, this initiative:
1.sets out obligations to detect, report and remove child sexual abuse online to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse; and
2.establishes an EU Centre to prevent and counter child sexual abuse to provide comprehensive support for the implementation of the proposed Regulation by service providers and to Member States, in the fight against child sexual abuse.
The commitment and this initiative respond to calls for action from the Council, the European Parliament and the European Economic and Social Committee, as well as to calls made globally in multiple forums, including by online service providers and in the media, as it has become evident that current measures fall short of effectively protecting the right of children to live free from sexual violence. This initiative has therefore been anticipated: the need to better prevent and combat child sexual abuse through additional legislation was already clear during the preparation of the 2020 strategy, and also during the inter-institutional negotiations of the Interim Regulation (see below).
The initiative aims to build on and complement the existing policy instruments in the fight against CSA, which can be grouped into legislation, coordination and funding.
1.Legislation
The existing legal framework consists of measures in the areas of criminal law, protection of privacy and personal data, and the internal market, regulating online and telecommunications services and content moderation. It includes:
·horizontal instruments in the area of data protection and online privacy (e.g. the GDPR and the ePrivacy Directive and its proposed revision), and of the single market for digital services (e.g. the eCommerce Directive and the proposed Digital Services Act),
·sector-specific legislation, such as the Child Sexual Abuse Directive, the Europol Regulation and its proposed revision, the Interim Regulation derogating from the application of certain rights and obligations under the ePrivacy Directive, and the Victims’ Rights Directive.
Horizontal instruments
The General Data Protection Regulation (GDPR)
·What it does: the GDPR sets out rules on the processing of personal data relating to individuals, specifying the fundamental right to protection of personal data.
·How CSA-related responsibilities are distributed between EU and Member States: as a horizontal instrument, the GDPR does not contain CSA-specific provisions, but it applies to all activities of processing personal data, including those related to CSA, except for those carried out by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, which are covered by Directive 2016/680/EU. Member States are notably responsible for enforcement through their data protection authorities, and the European Data Protection Board (EDPB) is tasked with the consistent application of the GDPR.
·How the proposed legislation builds on and interacts with the GDPR: the proposed legislation builds on the GDPR, including its Article 6, which allows, e.g., the processing of personal data where necessary to comply with a legal obligation (Art. 6(1)(c)) or for the purposes of a legitimate interest (Art. 6(1)(f)).
The ePrivacy Directive and its proposed revision
·What it does: the ePrivacy Directive and the proposed Regulation for its revision harmonise national rules to ensure an equivalent level of protection of fundamental rights and freedoms, and in particular the right to privacy and confidentiality of communications, with respect to the processing of personal data in electronic communications services. These ePrivacy rules particularise and complement the GDPR.
·How CSA-related responsibilities are distributed between EU and Member States: as horizontal instruments, the ePrivacy Directive and the proposed successor Regulation do not contain CSA-specific provisions; they apply to any processing of specified data categories in electronic communications. Member States are responsible for enforcement through their competent national authorities.
·How the proposed legislation builds on and interacts with the ePrivacy Directive and its proposed revision: the proposed legislation would limit the scope of certain rights and obligations which are currently in the ePrivacy Directive, notably those on the confidentiality of communications and related data in order to enable companies to identify child sexual abuse taking place on their systems after the issuance of a detection order, subject to strict safeguards.
The eCommerce Directive
·What it does: the eCommerce Directive sets out a framework for the provision of information society services in the internal market. One of its key principles is a conditional liability exemption framework for providers of specific categories of information society services. In principle, providers may not be held liable for information (including illegal content) that they host (store), cache (temporarily store) or transmit during the provision of their services, subject to the conditions laid down in the Directive. For example, this means that providers of hosting services may not be held liable for information they host, unless they gain actual knowledge or awareness of the illegality and fail to act expeditiously. The Directive also prohibits Member States from imposing general obligations to monitor their services or to actively seek facts or circumstances indicating illegal activity. The eCommerce Directive does not establish a legal basis for any processing of personal data.
·How CSA-related responsibilities are distributed between EU and Member States: as a horizontal instrument, the eCommerce Directive does not contain CSA-specific provisions. It governs activities of relevant service providers. Member States are responsible for enforcement through their national authorities.
·How the proposed legislation builds on and interacts with the eCommerce Directive: the proposed legislation imposes narrowly targeted obligations to detect, report and remove child sexual abuse online, based on specific indicators and requirements to ensure compatibility with the eCommerce Directive (see box 9).
The Digital Services Act proposal
·What it does: the Digital Services Act (DSA) proposal, if adopted as proposed, and building upon the eCommerce Directive’s framework, would provide a horizontal standard for content moderation by providers of intermediary services. It would remove a number of disincentives for providers’ voluntary efforts to detect, remove or disable access to illegal content (including child sexual abuse material, CSAM) and would create obligations for them to provide information on their content moderation efforts when requested by national authorities. The DSA would also create additional due diligence obligations tailored to specific categories of providers of intermediary services (e.g. hosting services, online platforms, very large online platforms) as well as transparency reporting obligations. For instance, it would require hosting services to put in place notice and action mechanisms enabling any user or entity to notify them of the presence of suspected illegal content. Furthermore, the DSA would oblige very large online platforms to implement risk mitigation measures on their services. The DSA would also establish rules on its implementation and enforcement, including as regards the cooperation of and coordination between the competent authorities. The DSA would not establish a legal basis for any processing of personal data.
·How CSA-related responsibilities are distributed between EU and Member States: as a horizontal instrument covering all types of illegal content, the DSA does not contain CSA-specific provisions. The DSA would create a framework at EU level for users to notify companies of material they come across, with obligations for companies to respond to orders issued by public authorities in Member States, as well as additional due diligence requirements for very large platforms. For the very large platforms, a stronger role for the Commission in the enforcement process is also being considered during the ongoing inter-institutional negotiations at the time of writing.
·How the proposed legislation builds on and interacts with the DSA as proposed: the proposed legislation complements the DSA notably by specifying mandatory removal of CSAM when ordered and a comprehensive reporting obligation tailored to the specificities of CSA online, which often takes place hidden from public view and demands specific follow-up where identified. These specificities require a different approach from the horizontal one of the DSA. Finally, as the DSA aims to maintain some of the main principles of the eCommerce Directive, including the prohibition of general monitoring obligations and the unavailability of the liability exemption for hosting services that fail to act after obtaining actual knowledge or awareness of the illegality of the content, the considerations made above for the eCommerce Directive also apply to the DSA.
The Victims’ Rights Directive
·What it does: the Victims’ Rights Directive establishes minimum standards on the rights of, support for and protection of victims of crime and ensures that they are recognised and treated with respect. They must also be granted access to justice.
·How CSA-related responsibilities are distributed between the EU and Member States: as a horizontal instrument, the Victims’ Rights Directive, applicable to all victims of crime, does not contain CSA-specific provisions. The EU adopted specific rules for victims of child sexual abuse and sexual exploitation under the Child Sexual Abuse Directive (see below), to respond more directly to the specific needs of those victims.
·How the proposed legislation builds on and interacts with the Victims’ Rights Directive: whilst the proposed legislation focuses on strengthening the functioning of the internal market by setting common rules aimed at preventing and combating the misuse of online services for CSA-related purposes, it could also help support and facilitate the work of Member States on assistance to victims of CSA, notably through the creation of the EU Centre to prevent and counter CSA, which would facilitate research and the exchange of best practices among Member States. The proposed legislation does not create new obligations for Member States in this respect.
Sector-specific legislation
The Child Sexual Abuse Directive
·What it does: the Child Sexual Abuse (CSA) Directive’s main objective is to harmonise minimum criminal law rules at EU level concerning the definitions of child sexual abuse and exploitation offences and corresponding sanctions and to require the establishment of prevention measures in this area. It also requires Member States to ensure the provision of assistance and support to victims before, during and after the conclusion of criminal proceedings. In terms of websites disseminating CSAM, the Directive requires Member States to take necessary measures to ensure the prompt removal of webpages hosted in their territory and to endeavour to obtain the removal of such pages hosted outside their territory. It also enables Member States to take voluntary measures to block access to web pages containing or disseminating CSAM within their territory, while providing safeguards (restriction is limited to what is necessary and proportionate; users are informed of the reason for the restriction and of the possibility of judicial redress). The Child Sexual Abuse Directive does not establish a legal basis for any processing of personal data.
·How CSA-related responsibilities are distributed between EU and Member States: the Directive defines a minimum set of standards at EU level to define and sanction these crimes, prevent them and assist victims. Member States are required to comply with these minimum rules and may go beyond them if they consider it necessary. Similarly, the Directive defines the responsibilities of Member States but leaves it to national authorities to comply with those responsibilities in the way that best suits national specificities (e.g. on prevention programmes).
·How the proposed legislation builds on and interacts with the Child Sexual Abuse Directive: the former is intended to reinforce and complement the latter without creating unnecessary overlaps. Whereas the Directive focuses on defining the roles and responsibilities of Member States’ authorities in the fight against CSA using the tools of criminal law, the proposed legislation focuses, from an internal market angle, on defining the roles and responsibilities of private companies offering their services in the Single Market, notably concerning the detection, reporting and removal of CSA online. Nonetheless, the proposed legislation could help support and facilitate the efforts by Member States to meet the obligations defined in the CSA Directive relating to prevention and assistance to victims, notably through the creation of the EU Centre to prevent and combat CSA.
The proposed initiative cannot address remaining implementation issues with the Directive. A study has been launched to prepare the evaluation of the CSA Directive and at the moment there are ongoing infringement procedures against 21 Member States. The majority of the challenges Member States face in the implementation concern offline prevention measures (in particular prevention programmes for offenders and for people who fear that they might offend) and criminal law definitions. Exchanges between the Commission and Member States are ongoing to ensure that they swiftly address these remaining issues. The Commission has also organised dedicated expert workshops with Member States to facilitate the exchange of lessons learned and of best practices in national experiences in the implementation of the CSA Directive. That said, the present legislative initiative could indirectly have a positive effect on the implementation of the Directive, in particular through the EU Centre as an expert hub and facilitator of exchanges of knowledge and best practices.
The “Interim Regulation”
·What it does: as of 21 December 2020, voluntary detection of CSAM and grooming in certain online communication services such as instant messengers and email became subject to the ePrivacy Directive’s rules on confidentiality of communications, because changes to the definitions in the European Electronic Communications Code took effect and those services consequently fell within the scope of the ePrivacy Directive. To address this issue, the Commission proposed a temporary derogation from the application of certain rights and obligations under the ePrivacy Directive, for the sole purpose of detecting and reporting CSA and removing CSAM. The Interim Regulation, which entered into force on 2 August 2021, enables those services to continue such practices on a voluntary basis, provided those practices are lawful and, in particular, meet a range of conditions. The Regulation ceases to apply three years after its entry into force. The Interim Regulation does not establish a legal basis for any processing of personal data.
·How CSA-related responsibilities are distributed between EU and Member States: the Commission is responsible for drawing up a list of the names of organisations acting in the public interest against CSA to which providers report CSA online, for requesting the European Data Protection Board (EDPB) to issue guidelines for the purpose of assisting the supervisory authorities in assessing whether processing falling within the scope of the Regulation complies with the GDPR, and for preparing a report on the implementation of the Regulation. Member States are notably responsible for enforcing the Regulation and for statistics related to the detection, reporting and follow-up of CSA reports.
·How the proposed legislation builds on and interacts with the Interim Regulation: the proposed legislation replaces the Interim Regulation, and uses it as a reference to present a long-term framework that maintains some of its elements and covers a wider range of services, including private communications.
The Europol Regulation and its proposed revision
·What it does: the Europol Regulation sets out the mandate of the European Union’s law enforcement agency, which is to support and strengthen action by competent authorities of the Member States and their mutual cooperation including in preventing and combating serious forms of crime, such as sexual abuse and sexual exploitation. Among other tasks, Europol’s current mandate allows the agency to collect, store, process, analyse and exchange information, including criminal intelligence; to notify the Member States of any information and connections between criminal offences concerning them and to coordinate, organise and implement investigative and operational actions to support and strengthen actions by the competent authorities of the Member States. The proposed revision of Europol’s mandate would notably allow it to receive data from private parties directly, subject to certain conditions.
·How CSA-related responsibilities are distributed between EU and Member States. Europol can support Member States' actions in preventing and combating CSA crimes. In particular, Europol receives reports from online service providers via the US National Centre for Missing and Exploited Children (NCMEC) for 19 Member States, completes these reports with its own information (if any) and forwards them to the Member States’ authorities.
·How the proposed legislation builds on and interacts with the Europol Regulation and its proposed revision. The proposed legislation creates an EU Centre to prevent and counter CSA, which will work closely with Europol. The Centre will receive the reports from online service providers, check that they are likely to be actionable, i.e. they are not manifestly unfounded and can thus in principle be acted upon, and forward them to Europol so that it can enrich the reports with additional criminal intelligence, as well as to national law enforcement agencies. This would ensure that Europol and national law enforcement resources are focused on key investigative tasks such as swiftly rescuing victims from ongoing abuse, rather than on e.g. filtering out the reports that are not relevant. The revised Europol mandate would complement the proposed legislation in particular on the ability for Europol to receive and process reports from the EU Centre originating from online service providers.
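For illustration only, the following Python sketch models the report-handling flow described in the bullet above: the EU Centre receives a report from an online service provider, screens out reports that are manifestly unfounded, and forwards actionable reports to Europol and to the competent national law enforcement authority. All names, fields and functions in the sketch (Report, is_manifestly_unfounded, triage) are hypothetical; they do not correspond to any actual system, data format or API of the envisaged EU Centre.

from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified model of a provider report (illustration only).
@dataclass
class Report:
    provider: str       # service provider submitting the report
    member_state: str   # Member State the report relates to
    content_type: str   # "known_csam", "new_csam" or "grooming"

def is_manifestly_unfounded(report: Report) -> bool:
    """Minimal plausibility screen; a real assessment would be far richer."""
    return (report.content_type not in {"known_csam", "new_csam", "grooming"}
            or not report.provider)

def triage(report: Report) -> Optional[str]:
    """Screen a report and decide where it should be forwarded."""
    if is_manifestly_unfounded(report):
        return None  # not forwarded; investigative resources are spared
    # Actionable reports go to Europol (for enrichment with criminal
    # intelligence) and to the competent national law enforcement authority.
    return f"forward to Europol and to the authorities of {report.member_state}"

if __name__ == "__main__":
    print(triage(Report("example-provider", "NL", "known_csam")))

The point of the sketch is the division of labour it illustrates: filtering takes place before a report reaches Europol or national authorities, so that law enforcement resources remain focused on actionable cases.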
2.Coordination
The existing legal framework is complemented by practical efforts at EU level to step up the fight against CSA in all areas: investigations, prevention, and assistance to victims.
EU level cooperation in investigations
·What it does: Europol provides EU level coordination for investigation of cross-border cases. In addition, the EU policy cycle (EMPACT) serves to coordinate the operational priorities of Member States’ law enforcement authorities in the area of combating CSA, to organise joint operations and strategic approaches to specific phenomena from a law enforcement perspective. Europol also helps coordinate investigations involving law enforcement agencies in third countries and in the Member States.
·How CSA-related responsibilities are distributed between EU and Member States: Europol supports operational action by law enforcement agencies in Member States at their request. Europol does not have executive powers (i.e. it is not a “European FBI”).
·How the proposed legislation builds on and interacts with existing EU level cooperation in investigations: the proposed legislation aims to support the existing cooperation in investigations by ensuring that the reports from online service providers that reach Europol and national law enforcement agencies are actionable and relevant. The EU Centre would not have any operational capability on investigations, but would support them indirectly by facilitating the process of detection, reporting and removal of CSA online by service providers.
EU level cooperation in prevention
·What it does: at the moment, EU level cooperation in prevention of CSA is fragmented and limited to ad hoc expert meetings organised by the Commission to support Member States in the implementation of the CSA Directive, initiatives on awareness raising under EMPACT and Europol. The 2020 CSA Strategy aimed to boost EU level efforts on prevention by making it one of its pillars. Specifically, the Strategy included the EU Centre to prevent and counter CSA, which will also carry out certain tasks relating to prevention. The Strategy also announced the launch of a prevention network of practitioners and researchers to support the EU Member States in putting in place usable, rigorously evaluated and effective prevention measures to decrease prevalence of child sexual abuse in the EU. The network will aim to give structure and regularity to exchanges of knowledge and best practices between Member States.
·How CSA-related responsibilities are distributed between EU and Member States. The CSA Directive requires Member States to implement provisions while leaving it to them to determine exactly what these measures or programmes are. The degree to which the requirements of the Directive are fulfilled varies among the Member States (see section 2.2.3.).
·How the proposed legislation builds on and interacts with existing EU level cooperation in prevention. The proposed legislation will establish the EU Centre, which will be the driving force of the work relating to preventing and combating CSA at EU level. Whilst the Centre would principally focus on its tasks set out in the envisaged legislation connected to the common rules for online service providers to combat CSA online, the Centre could also contribute to and facilitate Member States’ work relating to prevention, for instance through the involvement of multiple stakeholders and the sharing of best practices and lessons learned across Member States. The proposed legislation will not create new obligations for Member States on prevention.
EU level cooperation in assistance to victims
·What it does: EU level cooperation in assistance to victims takes place currently through the Victims’ Rights Platform, which deals with horizontal issues relevant for victims’ rights. The platform brings together representatives of EU level networks, agencies, bodies and civil society organisations relevant for the implementation of the EU Strategy on victims’ rights.
·How CSA-related responsibilities are distributed between EU and Member States: the platform facilitates the implementation of the EU strategy on victims’ rights, which details key actions for the European Commission and for Member States. Also, the CSA Directive requires Member States to implement provisions related to assistance to victims, while leaving it to them to determine exactly what these measures are. The degree to which the requirements of the Directive are fulfilled varies among the Member States (see section 2.2.3.).
·How the proposed legislation builds on and interacts with existing EU level cooperation in assistance to victims: apart from its main tasks in the process of combating CSA online, the EU Centre could also facilitate and support Member States’ action in assistance to victims of CSA, specifically by serving as a hub of expertise to support evidence-based policy development and by helping to develop research on assistance to victims, including on victims’ needs and the effectiveness of short-term and long-term assistance programmes. The Centre will also support victims, at their request, in having their images and videos taken down by assisting them in exchanges with the relevant online service providers. The EU Centre could participate in the Victims’ Rights Platform to contribute to the discussion of horizontal issues concerning victims and to the implementation of the EU strategy on victims’ rights. The proposed legislation will not create new obligations for Member States on assistance to victims.
Multi-stakeholder cooperation at EU and global level
·What it does: at EU level, the Commission facilitates multi-stakeholder cooperation between service providers and national authorities in the fight against CSA online through the EU Internet Forum, which brings together online service providers and ministers of interior of all Member States.
At global level, the Commission continues to contribute to increasing voluntary standards for the protection of children against sexual abuse by promoting multi-stakeholder cooperation through the WeProtect Global Alliance to End Child Sexual Exploitation Online (WPGA).
·How CSA-related responsibilities are distributed between EU and Member States: at EU level, the Commission organises the EU Internet Forum, in which Member States participate at ministerial level (once a year), and at various levels in the technical discussions. Depending on the initiative, Member States and/or the Commission may be responsible for the execution.
At global level, the Commission participates in the policy board of the WPGA, as one of its founding members. Member States are WPGA members and notably participate in its biannual global summit (the next one will take place in Brussels in June 2022 and will be co-hosted by the Commission and the French Presidency of the Council of the EU).
·How the proposed legislation builds on and interacts with existing multi-stakeholder cooperation at EU and global level: the proposed legislation builds on the experience of the EU Internet Forum and the WPGA and aims to boost multi-stakeholder cooperation in the EU and globally in the fight against CSA, through the EU Centre. The Centre will be an independent facilitator that will bring together all the relevant actors in the EU and beyond in any aspect of the fight against CSA, including investigations, prevention and assistance to victims, to ultimately facilitate and support Member States’ action in those areas. The Centre will have a more operational focus than the EU Internet Forum and the WPGA, which are centred on policy and are not designed to play a role in facilitating day-to-day efforts on the ground.
3.Funding
·What it does: the 2020 strategy includes a commitment to continue providing funding for fighting child sexual abuse, e.g. to support the development of national capacities to keep up with technological developments. The Commission has organised regular calls for project proposals to fight the online and offline aspects of child sexual abuse, with a total value of EUR 61 million over the last 10 years (funded under Horizon 2020 and the Internal Security Fund). Notable examples of EU-funded projects include:
oThe INHOPE network of hotlines, where users can report child sexual abuse materials they encounter online (formerly funded through the Connecting Europe Facility programme, and currently under the DIGITAL Europe programme). The content is analysed, and if assessed as illegal, hotlines notify the relevant online service providers requesting the swift removal of the content, and report the case to the relevant law enforcement agency for victim identification purposes. National hotlines are an important element of implementation of Article 25 of the CSA Directive, as a majority of Member States has chosen to implement most of this article through the hotlines. As of January 2022, the INHOPE network consists of 46 hotlines in 42 countries (including all Member States except Slovakia);
oThe International Child Sexual Exploitation (ICSE) database at Interpol, which is an important tool enabling law enforcement to identify victims globally. The database has helped identify 23,564 victims worldwide at the time of writing.
The Commission has also financially supported the adoption, as a standard in the EU, of the Barnahus model of child-friendly, multidisciplinary protection of child victims during criminal proceedings, which includes limiting the number of interviews of child victims and having them conducted by trained experts.
·How CSA-related responsibilities are distributed between EU and Member States: the Commission manages the funding instruments mentioned above. That said, part of the Internal Security Fund is managed by Member States under the supervision of the Commission, and Member States also contribute own funding to the efforts, to a varying extent.
·How the proposed legislation builds on and interacts with existing funding mechanisms: the creation of the EU Centre requires dedicated EU funding, and no changes will be made to existing funding mechanisms. However, increased coordination and cooperation in prevention efforts facilitated by the EU Centre may also result in more targeted and higher-quality proposals during future funding rounds.
Relevant Sustainable Development Goals (SDGs)
The most relevant SDGs for this initiative are SDG 5.2 (eliminate all forms of violence against women and girls) and SDG 16.2 (end abuse, exploitation, trafficking and all forms of violence against children).
Other SDGs of particular relevance are those that address risk factors of CSA, such as SDG 1 on poverty (e.g. children forced by their parents to be sexually abused online), SDG 3 on health (e.g. given the short- and long-term negative health consequences of CSA on children), SDG 4 on education (e.g. prevention campaigns to raise awareness of CSA online risks), and SDG 9 on industry, innovation and infrastructure (e.g. as the initiative aims to support service providers’ efforts to fight against CSA online, including through the EU Centre).
2.Problem definition
Table 1 shows the intervention logic (problem, drivers, objectives and options) that will be described and analysed in the impact assessment:
Table 1: problem, problem drivers, objectives and options (intervention logic)
Problem | Some child sexual abuse crimes are not adequately addressed in the EU due to challenges in their detection, reporting and action by relevant service providers, as well as insufficient prevention and assistance to victims. Diverging national responses negatively affect the Internal Market.
Problem drivers | 1. Voluntary action by online service providers to detect online child sexual abuse has proven insufficient. 2. Inefficiencies in public-private cooperation between online service providers, civil society organisations and public authorities hamper an effective fight against child sexual abuse. 3. Member States’ efforts to prevent child sexual abuse and to assist victims are limited, divergent, lack coordination and are of unclear effectiveness.
General objective | Improve the functioning of the Internal Market by introducing clear, uniform and balanced EU rules to prevent and combat child sexual abuse.
Specific objectives | 1. Ensure the effective detection, removal and reporting of online child sexual abuse where they are currently missing. 2. Improve legal certainty, transparency and accountability and ensure protection of fundamental rights. 3. Reduce the proliferation and effects of child sexual abuse through harmonisation of rules and increased coordination of efforts.
Option A (non-legislative) | Practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims.
Option B (legislative) | Option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal.
Option C (legislative) | Option B + mandatory detection of known child sexual abuse material.
Option D (legislative) | Option C + mandatory detection of new child sexual abuse material.
Option E (legislative) | Option D + mandatory detection of ‘grooming’ (solicitation of children).
2.1. What is the problem?
2.1.1. Definition and magnitude
The problem that this initiative tackles is that providers of certain online services offered in the EU face divergent rules at national level when it comes to their responsibility for preventing and combating child sexual abuse on their services. At the same time, the existing responses at national level to some child sexual abuse crimes are proving insufficient. Challenges persist in detection, reporting and action by relevant service providers, as well as insufficient prevention, assistance to victims and cooperation. The divergence of national responses to the problem creates legal fragmentation which negatively affects the Internal Market.
Prevalence
At least one in five children falls victim to sexual violence during childhood. A global study of childhood experiences in 2021 found that more than one in three respondents (34%) had been asked to do something sexually explicit online during their childhood, and more than half (54%) had experienced a form of child sexual abuse online. A recent survey in Spain concluded that two out of five Spanish adults had suffered sexual abuse as children.
The majority of victims are girls, who are more than twice as likely to be abused as boys.
Vulnerable children are more likely to fall victims of CSA online. In a recent survey about childhood experiences:
·59% of respondents who identified as transgender and non-binary experienced online sexual harm, compared to 47% of cisgender respondents;
·65% of respondents who identified as LGBQ+ experienced online sexual harm, compared to 46% of non-LGBQ+ respondents;
·57% of disabled respondents experienced online sexual harm, compared to 48% of non-disabled respondents.
“Offline” and online CSA
The sexual abuse of children can take multiple forms, both offline (e.g. engaging in sexual activities with a child or exploiting a child for prostitution) and online (e.g. forcing a child to engage in sexual activities via live streaming, or viewing or distributing online child sexual abuse images and videos).
The offline and online aspects of the crimes have become increasingly intertwined, and most CSA cases today contain an online component. For example, an offender may abuse a child offline, record the abuse, and share it online. Or the offender may establish a first contact with children online and then lure them to meet offline and sexually abuse them. It is therefore not possible to draw a categorical distinction between online and offline abuse.
That said, this initiative focuses on the online aspects of the crime in relation to detection, reporting and removal efforts, in particular by the providers of the services used. This is because the internet has become the main medium for sharing CSAM, as well as for contacting children with the aim of abusing them. The internet facilitates the creation of communities in which offenders share materials and experiences. The volume of CSAM shared online has grown exponentially in recent years, while sharing of such material offline, e.g. via mail services, remains at a very low level and was not signalled as a common issue encountered by law enforcement in CSA investigations during stakeholder consultations.
The Member States have sought to address this growing phenomenon through rules at national level, reinforcing existing legislation or adopting new rules to improve the detection of and follow-up on online child sexual abuse. This has inadvertently created a fragmentation of the internal market which negatively impacts the provision of certain online services, while at the same time failing to stem the proliferation of this particularly harmful content. This initiative therefore addresses detection, reporting and removal in the online sphere, which enables and fuels both offline and online abuse, as well as prevention and assistance to victims, where the online and offline aspects are also closely related.
Interlinkages between detection, reporting and action, prevention, and assistance to victims
In addition to the online-offline interlinkages, all the different areas of the problem are also closely related: detection, reporting and action (i.e. follow-up to the reports, including removal by service providers and action by law enforcement), prevention, and assistance to victims. In general, for public authorities to be able to act and assist the victim, the crime has to be detected and reported, which in turn may prevent future crimes from happening (e.g. if the offender is arrested and the victim is rescued). This also applies to detecting grooming and to stopping the circulation of CSAM (known and new), which are both criminal behaviours. In addition, the continued circulation of CSAM has a particularly harmful societal impact: the distribution of CSAM is a form of re-victimisation that occurs every time the images and videos are seen. The knowledge that the images and videos are being distributed is a continuous source of distress for victims. In addition, viewing of CSAM can lead to hands-on abuse as it supports potential offenders in normalising and rationalising their behaviour; recent surveys even indicate that this may often be the case. When CSAM is detected by service providers and investigated by law enforcement, it frequently leads to stopping ongoing or future abuse of child victims by the offenders caught distributing CSAM and/or grooming the child (see box 1 below).
Box 1: importance of detection, reporting and action in prevention and assistance to victims
The distribution of CSAM is closely linked to its production, and therefore to the physical sexual abuse of children. The detection and reporting of CSAM is therefore a key prevention tool and an important way to assist victims, also by preventing re-victimisation.
The detection of CSA online frequently leads to stopping ongoing or future physical sexual abuse. This is clearly the case for new CSAM and grooming, which often reveal ongoing and/or imminent physical sexual abuse. But it is also the case for known CSAM, as viewing it often leads to hands-on abuse. In an anonymous online survey on the Darkweb, 37% of individuals who viewed CSAM had sought direct contact with a child after viewing the material. Also, half of the offenders sentenced in the US in 2019 for CSAM-related offences (non-production) had engaged in aggravating sexual conduct prior to, or concurrently with, the CSAM charge. The detection of CSAM also stops its distribution, which fuels demand for more and new material and therefore new abuses. Offenders not only exchange CSAM bilaterally but are typically required to contribute new material to join online communities trading it. 44% of offenders convicted in the US for CSAM-related offences (non-production) participated in an online community, and 77% required sentencing enhancements for possession of 600 or more images. The material demanded has become more and more extreme. In the same 2019 US data, 52% of cases included images or videos of infants or toddlers and 84% of cases required sentencing enhancements for images depicting sadistic or masochistic conduct or abuse of an infant or toddler.
|
Detection, reporting and action
The proportion of cases where CSA is discovered in a timely manner and prevented or stopped is very limited. Oftentimes, children do not manage to seek help themselves, and those in their ‘circle of trust’ (i.e. family and other close contacts), who are supposed to provide protection and care, are often the abusers. One in three victims will never tell anyone and at least four in five CSA cases are not reported to public authorities. There are indications that the COVID-19 crisis has exacerbated the problem, especially for children who live with their abusers.
In this context, online service providers and in particular ‘online intermediaries’ such as messaging services, online forums, and online platforms (such as video-sharing and media-sharing platforms, social networks, etc.) have acquired an important role.
First, online intermediaries are often the only ones with any possibility to detect the ongoing abuse. Frequently, the abuse is only discovered thanks to the efforts of online service providers to detect CSAM on their services and to protect children from being approached by predators online. The key role of these reports is evidenced by the fact that in some Member States, up to 80% of investigations are launched due to reports from service providers. This is particularly the case in electronic (private individual or group) communications, which offenders frequently use to exchange CSAM and approach children, and where the service provider is the only one that can detect the abuse. This is reflected in recent statistics showing that the vast majority of reports (more than 80% in 2020, up from 69% in 2019) originate in interpersonal communication services (e.g. messenger applications and email), and in surveys: in a recent one, two-thirds of respondents who as children received sexually explicit material online from an adult they knew or from someone they did not know received it through a private messaging service (68%), most commonly on their own personal mobile device (62%).
Second, the internet has also given offenders a new way of approaching children. They contact children on social media, gaming platforms and chats and lure them into producing compromising images of themselves or into offline meetings. In addition, children are spending more time online than ever before, increasing the risk of coming into contact with online predators.
Third, offenders frequently record the sexual abuse for repeat viewing and sharing. Where CSAM is shared online, the harm is perpetuated. The exponential development of the digital world has facilitated the global sharing of materials and the creation of networks of offenders via online intermediaries. The images and videos of CSA continue to circulate long after the abuse itself, and survivors often find themselves powerless to ensure removal of online content depicting their abuse. In some cases, offenders continue to traumatise victims long after the abuse has taken place by creating fake accounts with the actual names of the victims. These accounts typically do not contain illegal content, but they attract offenders familiar with the CSAM depicting those victims, who discuss the past abuse and the victims’ current personal information (e.g. where they live or work, or their family situation).
It is estimated that, at any given moment, across the world there are more than 750 000 individuals online exchanging CSAM, streaming live abuse of children, extorting children to produce sexual material or grooming children for future sexual abuse.
The problem and problem drivers considered in the impact assessment apply to the three main types of abuse: known CSAM, new CSAM and grooming, referred to collectively as CSA online.
Box 2: current system to detect and report CSA online in the EU
The CSA detection efforts of online service providers fall into three categories: first, the detection of ‘known’ CSAM, that is, images and videos that have been reported or detected before and that have already been verified as constituting CSAM; second, the detection of ‘new’ CSAM, i.e. images and videos that have not previously been detected and verified; and third, the detection of ‘grooming’ (also referred to as solicitation of children), where offenders trick or threaten children into sharing compromising images or meeting them offline for the purposes of sexual abuse. An illustrative sketch of how hash-based detection of known material works in principle follows this box.
Currently, EU legislation allows certain online communication services like instant messenger and email to continue voluntary measures to detect and report child sexual abuse online, provided that their activities are lawful and, in particular, meet a set of specific conditions. In general, the measures that providers take vary widely and proactive detection of CSA online is still a rarity among service providers active in the EU.
The vast majority of CSA reports from service providers reaches law enforcement authorities in the EU through the US National Centre for Missing and Exploited Children (NCMEC), which is therefore of key importance for the fight against CSA in the EU. While US law does not oblige providers to detect CSA online in their services, it does oblige service providers to report it to NCMEC where they become aware of the abuse. NCMEC determines the relevant jurisdiction(s) from which the materials were uploaded. Where the report relates to an EU Member State, the report is forwarded to the US Department of Homeland Security Investigations (HSI) for onward transfer to Europol, or directly to the relevant EU Member State law enforcement authorities. HSI plays an intermediary role as currently Europol cannot receive information directly from private parties, including NCMEC or service providers. Reports which are received by Europol are cross-checked and forwarded to the relevant Member State authorities. For reports relating to the US, NCMEC is able to provide a number of additional services, such as verifying that the reported content constitutes CSA according to the definitions under US law, and providing information on where the same content has been detected previously. This service cannot be provided for non-US reports due to the much higher volumes (in 2020, 98% of the reports were non-US related).
NCMEC also has a hotline function to receive reports from the public (independent from the reporting by online service providers described above). It is part of the INHOPE network of national hotlines, which includes hotlines in most EU Member States where users can report CSAM that they may encounter accidentally; the hotlines then forward these reports to law enforcement and contact relevant providers to ensure removal. However, such reports from the public make up less than 2% of content found, as it is rare for people to come across CSAM and report it. The INHOPE hotlines facilitate the takedown of CSAM hosted outside the territory of the country where it is reported by identifying the country where the material is hosted and forwarding the information to the relevant hotline in that country for further notification to public authorities, or to the service provider if no hotline exists.
|
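To make the detection categories described in Box 2 more concrete, the following Python sketch illustrates the general principle behind the detection of ‘known’ CSAM: an uploaded file is hashed and the hash is compared against a list of hashes of previously verified material. It is a deliberately simplified illustration under stated assumptions: it uses a standard cryptographic hash, whereas tools used in practice (such as PhotoDNA) rely on perceptual hashing so that re-encoded or slightly edited copies are also recognised, and the indicator list and sample inputs are invented for the example. The detection of ‘new’ CSAM and of grooming relies on classifiers (see the glossary) rather than hash matching and is not shown.

import hashlib

# Placeholder indicator list: hashes of previously verified material.
# Real indicator lists are maintained by dedicated organisations and are far larger.
KNOWN_HASHES = {hashlib.sha256(b"example known file").hexdigest()}

def sha256_of_bytes(data: bytes) -> str:
    """Compute the SHA-256 digest of an in-memory file."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """Return True if the content's hash appears in the indicator list.

    A cryptographic hash only matches bit-identical copies; perceptual
    hashes are used in practice so that altered copies are still found.
    """
    return sha256_of_bytes(data) in KNOWN_HASHES

if __name__ == "__main__":
    print(matches_known_material(b"example known file"))     # True
    print(matches_known_material(b"some unrelated upload"))  # False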
While still only very few companies engage in voluntary detection of child sexual abuse, the past few years have nonetheless seen a strong increase in reports of CSA online submitted by online service providers globally through NCMEC: from 1 million reports in 2010 to over 21 million in 2020. The number of reports concerning the EU (e.g. images exchanged in the EU, victims in the EU, etc.) has also dramatically increased: from 17 500 in 2010 to more than 1 million in 2020.
Figure 1: EU-related reports submitted by online service providers, 2010-2020
Box 3: new CSAM and self-generated content
Part of the increase in new CSAM is driven by self-generated child sexual abuse material. IWF reported a 77% increase from 2019 to 2020 globally. Whereas the first time the material is shared may be consensual, further resharing is typically not. In a 2020 survey conducted by Thorn, 1 in 6 children aged 9 to 12 admitted that they had seen non-consensually reshared nudes of other children, up from 1 in 9 in 2019. A separate survey by Economist Impact of 18-20 year olds on their childhood experiences found similar data: 18% reported that a sexually explicit image of themselves had been shared by a peer without consent.
First time sharing of self-generated material may be consensual but it may also be the result of online grooming. In the same survey conducted by Thorn, 50% of the children aged 9 to 17 said that they had sent the nudes to someone they had never met in real life, up from 37% in 2019.
The number of grooming cases reported globally increased by 98% in 2020 compared to the previous year (37 872 in 2020 vs 19 147 in 2019), presumably due to the pandemic, when both children and offenders spent more time online and at home.
The reports that service providers submitted in 2020 in relation to cases in the EU included 3.7 million images and videos of known CSAM, 528 000 images and videos of new CSAM, and more than 1 400 grooming cases.
Reports indicate that some companies active in the EU, with servers located there, have now become the largest hosts of CSAM globally (hosting more than half of all CSAM detected in 2016 and 85% in 2020, with 77% hosted in the Netherlands).
Given the worsening situation, Member States have started to take action unilaterally, adopting sectoral rules to deal with the challenge, which are necessarily national in scope and risk further fragmenting the Internal Market (see problem driver section 2.2.2.).
Stakeholders’ views
EU citizens are concerned about these developments. 93% consider the principle that children should be protected in the online environment to be important, and 73% of respondents consider this principle very important for inclusion in a potential future list of EU digital principles.
Prevention
Prevention is an essential component for tackling the problem at its roots.
There are two main types of prevention efforts:
1.Prevention efforts focused on children and their environment and on decreasing the likelihood that a child becomes a victim. Examples include awareness raising campaigns to help inform children, parents, carers and educators about risks and preventive mechanisms and procedures, as well as training, and efforts to detect and stop online grooming.
2.Prevention efforts focused on potential offenders and on decreasing the likelihood that a person offends. Examples include prevention programmes for persons who fear that they might offend, and for persons who have already offended, to prevent recidivism.
Setting up effective prevention programmes remains challenging. Resources are limited and poorly coordinated, and efforts, where they exist, are rarely evaluated to assess their effectiveness (see section 2.2.3. on problem drivers).
Assistance to victims
Assistance to victims is essential to mitigate the harm and severe consequences for children’s physical and mental health caused by child sexual abuse (see section 2.1.3).
Victims require both immediate and long-term assistance, before, during and after criminal proceedings and taking into account the best interests of the child. This assistance must be specific, i.e. following an individual assessment of the special circumstances of each particular child victim, taking due account of the child’s views, needs and concerns.
However, immediate and long-term assistance remains limited, not sufficiently coordinated between relevant actors within and between Member States and of unclear effectiveness (see section 2.2.3.). This leads to information gaps, hampers the sharing of best practices and lessons learnt and decreases the efficacy of efforts.
2.1.2. Why is it a problem?
The fact that some child sexual abuse crimes are not adequately addressed in the EU is a problem because it results in victims not being rescued and effectively assisted as soon as possible, children being less protected from crimes, and offenders enjoying impunity. It affects public security in the EU and infringes children’s fundamental rights under the Charter of Fundamental Rights of the EU (Charter), including the right to such protection and care as is necessary for their well-being, the right to human dignity and the right to privacy. The continued presence and dissemination of manifestly illegal images and videos online, and the very heterogeneous approach of service providers, affects private and public interests, hampering trust, innovation and growth in the single market for digital services, in particular due to the fragmentation created by divergent national approaches trying to address the problem of CSA online (see problem driver section 2.2.2.).
Additionally, CSA has societal and economic costs. In particular, it contributes to an increased risk of serious mental and physical health problems across the lifespan, and exerts a substantial economic burden on individuals, families, and societies. There are negative consequences at all stages:
·Before the crime is committed: in the absence of proper preventative interventions, individuals who could have been stopped from abusing children may become first-time offenders, offenders are more likely to re-offend, and children are more likely to become victims if they and their carers lack awareness of the threat when using online services.
·While the crime is being committed: the consequences of not detecting and addressing the crimes swiftly include prolonged suffering and harm for victims. In addition, it reinforces the perception of impunity, reducing deterrence and facilitating further offending.
·After the crime has been committed: the consequences of not acting effectively after the crime include the inability to provide proper immediate and long-term assistance to victims, with negative effects for victims and society as described above. In addition, it may not be possible to prosecute offenders, which reduces opportunities for rehabilitation before, during and after criminal proceedings to prevent reoffending.
2.1.3. Who is affected and how?
First, children in the EU and elsewhere, who may fall victim to sexual abuse and suffer its negative effects, both in the immediate and long-term. Immediate effects include physical injuries and psychological consequences (e.g. shock, fear, anxiety, guilt, post-traumatic stress disorder, denial, withdrawal, isolation, and grief), sexual behaviour problems and over-sexualised behaviour, academic problems, substance abuse problems, increased likelihood of involvement in delinquency and crime, and increased likelihood of teen pregnancy. Long-term effects include psychological and social adjustment problems that can carry over into adulthood and affect married life and parenthood. They include negative effects on sexual and overall physical health; mental health problems including depression, personality and psychotic disorders, post-traumatic stress disorder, self-mutilation, attempted or completed suicide; and relational and marital problems including fear of intimacy and spousal violence.
Second, online service providers. Member States’ efforts to tackle the challenge at national level create distortions in the single market for digital services (see problem driver section 2.2.2.), as providers have to comply with sector-specific rules under national laws in at least some of the jurisdictions where they are active, resulting in a more challenging business environment, in particular for smaller companies that already face difficulties competing with their larger counterparts.
Third, users of online services. The detection, reporting and removal of CSA online currently lacks clarity, legal certainty and transparency. As a consequence, the rights and interests of users can be negatively affected. This can occur, for instance, in relation to unjustified reporting or removals, which may affect not only the users initiating the communications in question but also those at the receiving end. The existing uncertainty may also have a ‘chilling effect’ on legitimate forms of communications or hamper the full participation of children in online services as their parents and carers become more and more aware of the risks but do not have access to transparent information about the levels of risk and about what measures services take to protect children.
Fourth, governments and public authorities. The competent public authorities (e.g. law enforcement or governments at national, regional and local levels) dedicate significant resources to act against CSA. In particular, they put in place prevention programmes and measures to assist victims, and conduct investigations after they become aware of possible CSA. Inefficiencies in the current system lead them to seek local solutions to incentivise and obtain more information from providers.
Finally, society in general, given that CSA has consequences not only for the victims, but also for society as a whole. Social costs correspond to the non-monetary consequences of the criminal acts, and include diminished quality of life for society and increased feelings of insecurity among individuals. Economic costs include those of police and judicial services (e.g. criminal prosecution, correctional system), social services, victim support service and victim compensation programmes, education, health, and employment costs.
Box 4: estimated costs of child sexual abuse
Victims of child sexual abuse require immediate and long-term assistance. The costs of providing such assistance can be significant. For example, the total lifetime costs of assistance to victims arising from new substantiated cases of child sexual abuse in the United States in 2015 were estimated at USD 1.5 billion per year.
The long-term effects of child sexual abuse on victims also include lifelong loss of potential earnings and productivity. The total lifetime cost of such losses arising from new substantiated cases of CSA in the US in 2015 was estimated at USD 6.8 billion per year.
Overall, the total costs of child sexual abuse in the US in 2015 were estimated at USD 11 billion per year.
2.2. What are the problem drivers?
2.2.1. Voluntary action by online service providers to detect online child sexual abuse has proven insufficient
Voluntary action varies significantly among companies
Online service providers are often the only entities capable of detecting that abuse involving their services is taking place. Because detection is voluntary, some online service providers take comprehensive action, others take some action, and there are providers that do not take any action against CSA at all. In addition, service providers often do not have access to reliable information on what content and behaviour is illegal in the EU to facilitate accurate detection, proactively and voluntarily, resulting in a risk of both over- and underreporting.
There are currently 1 630 companies registered to report to NCMEC, which is the main entity receiving reports of the proactive searches that companies perform on their systems, and the de facto global clearinghouse of reports of CSA online. This is a fraction of the online services used to commit these crimes. In 2020, of these 1 630 companies, a single one, Facebook, submitted 95% of all reports, the top five together submitted 99%, and only 10% submitted at least one report. There is no evidence that 95% of all current cases of CSA online (including sharing of known and new CSAM, and grooming) occur on the services of that single company. In fact, experts agree that comparable levels of abuse occur on similar services from other companies, and that the difference in detection levels is due rather to the different intensity of detection efforts. For example, some providers may make efforts to detect abuse only in certain services they provide, or may make efforts to detect only certain types of abuse. This suggests that a substantial amount of CSA online remains undetected.
Figure 2: breakdown of reports submitted by online service providers globally in 2020
In addition, a number of service providers take action against users for suspected sharing of CSAM, e.g. by banning user accounts, but do not report it. For example, WhatsApp indicates that it bans around 300 000 accounts per month for this reason alone. However, it has been reported that WhatsApp reports only about 10% of these cases to NCMEC, as the evidence recovered is only circumstantial and, under US legislation, insufficient for a criminal investigation. Where that is so, there is on the one hand a risk that users are banned on the basis of unclear and potentially insufficient evidence, while on the other hand actual abuse may go unreported and uninvestigated. This can have a significant negative effect on the fundamental rights of users and on the affected children.
These different approaches and the related risks also create asymmetries in the single market for digital services, as they have prompted a number of Member States to adopt or consider national legislation to create a stronger and more effective approach (see problem driver section 2.2.2).
Voluntary action is susceptible to changes in companies’ policies
Because detection is voluntary, companies may decide to change their policies at will. One example is Facebook’s decision to implement end-to-end encryption (E2EE) on its private messaging service by default.
Existing detection efforts risk being severely hampered by the introduction of encryption in online services. In spite of its benefits for cybersecurity and the protection of users’ fundamental rights, such as freedom of expression, privacy and data protection, encryption also makes the detection of CSA online, and hence the protection of the fundamental rights of the victimised children, more difficult, if not impossible.
Box 5: end-to-end encryption, a policy change impacting child sexual abuse detection
In March 2019, Facebook announced plans to implement end-to-end encryption (E2EE) by default in its instant messaging service. These plans have since been reiterated, with implementation expected to take place “sometime in 2023”. In the absence of accompanying measures, it is conservatively estimated that this could reduce the number of total reports of CSA in the EU (and globally) by more than half, and by as much as two-thirds. These estimates were confirmed after Facebook announced that it had stopped the detection of CSA in its instant messaging service in December 2020, given the legal uncertainty it considered to be caused by the entry into force of the European Electronic Communications Code (see the information on the Interim Regulation in section 1). From 1 January to 30 October 2021 the number of reports received by law enforcement in the EU dropped by around two-thirds compared to the same period in 2020 (972 581 reports in 2020 vs 341 326 in 2021), a loss of around 2 100 reports per day. In total in 2021, while there was a 35% increase in global reports, the number of reports relevant for the EU dropped by 47%. Whereas in this case the tools to detect CSA were not used due to legal concerns, the practical effects are likely the same as those that an implementation of E2EE without mitigating measures would cause: the impossibility of detecting CSA, since the detection tools as currently used do not work on E2EE systems.
Google announced in November 2020 that it had started to roll out E2EE on Google Messages. Other similar services with E2EE already incorporated (with presumably similar if not higher levels of CSA) include WhatsApp, Apple’s iMessage, Signal and Telegram.
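As a purely illustrative check of how the figures cited in Box 5 relate to each other, the short calculation below reproduces the approximate per-day loss from the reported totals. The 303-day length of the 1 January to 30 October period is an assumption made here for the purposes of the sketch; the report totals are those quoted above.

```python
# Illustrative arithmetic only, based on the report totals quoted in Box 5.
reports_2020 = 972_581    # EU-related reports, 1 January - 30 October 2020
reports_2021 = 341_326    # EU-related reports, 1 January - 30 October 2021
days_in_period = 303      # assumed length of the 1 January - 30 October period

drop = reports_2020 - reports_2021       # 631 255 fewer reports
share_lost = drop / reports_2020         # ~0.65, i.e. around two-thirds
loss_per_day = drop / days_in_period     # ~2 083, i.e. roughly 2 100 reports per day

print(f"{share_lost:.0%} of reports lost, about {loss_per_day:.0f} per day")
```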
In addition to affecting the detection of CSA online and the protection of the fundamental rights of the victimised children, the use of E2EE without mitigating measures reduces the means to prevent and combat CSA overall by “turning off the light” on a significant part of the problem, i.e. decreasing the evidence base, including data on the scale of detectable CSA online, which is essential to fight overall CSA effectively through assistance to victims, investigations and prevention. In the absence of mitigating measures (e.g. tools that can detect CSA online in E2EE systems, see Annex 9), the ways currently available to detect CSA online in E2EE systems are:
1)user reports, i.e. either the child or the offender reports the abuse; and
2)metadata, i.e. the time of the online exchange, the user names, and data related to the online exchange other than its content. This also includes suspicious patterns of activity (e.g. if someone repeatedly sets up new profiles or messages a large number of people they do not know).
Relying on user reports implies that the responsibility for reporting will be borne solely by child victims of sexual abuse in grooming cases, who in many cases are shamed or threatened into silence (see section 2.1.1. on underreporting), as the offender will obviously not report the abuse. This is already evident from the low number of user reports today.
Service providers do not consider metadata an effective tool for detecting CSAM. In addition, the use of metadata is usually insufficient to initiate investigations. Moreover, it is likely to generate a much lower number of reports than the detection of content, despite the level of abuse being the same (if not higher). As an example, consider WhatsApp (E2EE, and therefore using metadata as the basis of detection) and Facebook Messenger (not E2EE, and therefore using content as the basis of detection). Whereas WhatsApp has around 50% more users than Facebook Messenger (2 billion vs 1.3 billion), and therefore presumably a level of abuse at least proportional to its number of users, around 35 times fewer reports were submitted to NCMEC in 2020 from WhatsApp than from Facebook Messenger (400 000 vs 14 million).
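A short worked comparison, using only the user and report figures quoted above, makes the gap explicit. The per-user normalisation is an additional computation performed here for illustration only and is not taken from the underlying reports.

```python
# Illustrative arithmetic based on the figures quoted in the text above.
whatsapp_reports, whatsapp_users = 400_000, 2_000_000_000        # E2EE, metadata-based detection
messenger_reports, messenger_users = 14_000_000, 1_300_000_000   # not E2EE, content-based detection

absolute_ratio = messenger_reports / whatsapp_reports            # 35: the figure cited above

# Normalising by user base widens the gap further, since WhatsApp has more users:
whatsapp_rate = whatsapp_reports / whatsapp_users                # ~0.0002 reports per user
messenger_rate = messenger_reports / messenger_users             # ~0.0108 reports per user
per_user_ratio = messenger_rate / whatsapp_rate                  # ~54 times fewer reports per user

print(f"{absolute_ratio:.0f}x fewer reports overall, about {per_user_ratio:.0f}x fewer per user")
```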
Europol reports that the widespread use of encryption tools, including E2EE apps, has lowered the risk of detection for those who offend against children. Offenders are well aware of the possibilities that E2EE presents to hide their abuse. An analysis of offender forums on the Darkweb found that a majority of discussions focused on topics such as technical tools for direct messaging or how to securely acquire and store content.
Voluntary action leaves decisions affecting fundamental rights to service providers and lacks harmonised safeguards
A voluntary system leaves private companies to make fundamental decisions with significant impact on users and their rights. The challenges of this system are particularly evident when dealing with CSA, where fundamental rights and interests are at stake on all sides, including the child’s rights to protection of their well-being and to privacy, and the rights to privacy and to freedom of expression and information of all users. As a result, if the rights of the child are deemed important enough to justify interfering with the rights of all users and of service providers, then it may not be appropriate to leave the decision on whether, and if so how, to do so to the service providers.
In addition, the current voluntary action by online service providers to detect CSA online lacks a long-term perspective and harmonised safeguards applicable to all relevant service providers, including on transparency. This is especially important as some of the voluntary measures that companies decide to take may interfere with users’ rights, including the rights to privacy and data protection. It is unclear which tools are in use and how they are used, or which procedures are in place to improve the tools and limit the number of false positives. While there is an obvious need not to warn off perpetrators or inadvertently provide guidance on how to avoid detection, there may be room for more information. As a result, users at present may have no effective redress in case of erroneous removals; the possibilities for scrutiny are limited; and there is no effective oversight by regulators. In addition, the existence and effectiveness of procedural safeguards differs widely across providers.
The Interim Regulation introduced a number of safeguards, such as annual transparency reports, consultation with data protection authorities on their processing to detect CSA online, and complaint mechanisms, so that content that has been removed erroneously can be reinstated (see section 1).
A number of important safeguards are contained in the DSA proposal, which lays down harmonised transparency requirements for content moderation carried out on providers’ own initiative, as well as in relation to removal mechanisms and related user complaints.
Given the gravity of impact on both sides – for the child victims, materials depicting their abuse, and the risk of (further) abuse, and for the suspected user, an accusation of having circulated CSAM – the above safeguards form an important baseline but do not go far enough in the present context. In particular, the stakeholder consultations have shown the importance of a universal reporting obligation for CSA online for the providers, using dedicated secure and fast channels, as well as of additional requirements on the technologies employed for automatic detection to ensure that they are both effective in detecting abuse and also limit the number of false positives to the maximum extent technically possible.
Voluntary action has failed to remove victims’ images effectively
Victims are left on their own when images and videos of their abuse end up online. Under national criminal laws, hotlines in the EU are in principle not allowed to proactively search for images and videos of a given victim, on the victim’s behalf, to effect removal. For the same reason, victims themselves are also prohibited from searching for their own images and videos, as the possession of CSAM is illegal per se. Absent a requirement for relevant service providers to take proportionate measures to detect, report and remove specified content, an effective removal system has not developed.
Box 6: Voluntary principles to counter online child sexual abuse
The US, UK, Canada, Australia and New Zealand (the ‘Five Eyes’), together with leading online service providers, civil society and academia, announced in 2020 a set of voluntary principles for companies to tackle child sexual abuse online. These address notably the detection, reporting and removal of CSAM, as well as detection and reporting of grooming.
Although multiple companies have committed to implementing the voluntary principles, including Facebook, Google, Microsoft, Roblox, Snap and Twitter, there is a lack of transparency on the actions that companies are taking to implement those principles. As a consequence, there is a lack of evidence of tangible results of that commitment.
2.2.2. Inefficiencies in public-private cooperation between online service providers, civil society organisations and public authorities hamper an effective fight against CSA
This section describes the inefficiencies in public-private cooperation between the main actors in the fight against CSA, online and offline. In a majority of cases, the inefficiencies relate to regulatory issues.
Cooperation between public authorities and service providers
Cooperation between public authorities and service providers is of critical importance in the fight against CSA, particularly in relation to service providers’ efforts to detect and report CSA online and remove CSAM.
·Legal fragmentation affecting the Internal Market
Currently, although obligations under national law are increasingly introduced, companies offering online services in the EU still detect, report and remove CSA online from their services on a voluntary basis. There are at present no effective procedures under EU law for service providers to report to public authorities or to exchange information in a timely manner or swiftly react to requests and complaints. This hampers investigations and creates obstacles to addressing CSA and to protecting victims.
This has led a number of Member States to prepare and adopt individual legislative proposals at national level to create stricter rules for providers who fail to cooperate with public authorities or do not put sufficient effort into detecting and reporting CSAM. Some Member States adopted new legislation as recently as 2021 (e.g. Germany, Austria) and others are currently preparing legislative proposals (e.g. Germany, France, the Netherlands) (see Annex 5). These efforts often involve establishing dedicated public authorities or designating existing authorities to enforce the new rules, as well as strict time limits for service providers to remove CSAM upon becoming aware of it, subject to fines if they fail to do so. At the same time, the reach of these efforts varies and they are constrained by the national laws of the Member States. The scope of the relevant national laws and their obligations differs in terms of the services covered: some focus on social networks in general, others on hosting providers managing websites containing illegal content, and yet others on online platforms above a certain threshold (e.g. number of registered users and annual revenue). These approaches are by nature limited to national jurisdictions. Given the cross-border nature of the Internet, and by implication of many service providers operating online as well as of online CSA, such a fragmented approach hampers the proper functioning of the internal market. Moreover, a fragmented approach cannot ensure the effective detection, reporting and removal of CSAM and the fight against grooming across the EU, beyond the borders of the individual Member States having the above-mentioned national legislation in place. Compared to one horizontal framework established at EU level, such a Member State-based approach increases the costs of doing business in the EU, as service providers have to adapt to various different sets of rules; this creates uncertainties and challenges, in particular for smaller providers seeking to expand to new markets in the EU, and can stifle innovation and competition.
Box 7: the CSAM issue in the Netherlands
As highlighted above, reports indicate that some service providers active and with servers in the EU have now become the largest hosts of CSAM globally, with more than half of all CSAM hosted in the Netherlands, given its strong internet infrastructure. The Dutch government has made several commitments to address this issue, including investing in partnerships between the Dutch Government and the private sector. This included a new free service called ‘Hash Check Service’ (operated by the EU co-funded Dutch INHOPE hotline EOKM), made available to companies to scan their servers for known CSAM. Given that a small group of Dutch companies cooperate only to a limited extent, and some not at all, the Netherlands is also preparing a new law to deal with companies that fail to cooperate. In the near future, such companies will fall under the supervision of a governing body with the authority to impose administrative sanctions on them. This procedure, which complements criminal law, specifically aims to eradicate CSAM in a fast and efficient manner.
The national approaches create fragmentation on the Internal Market, hindering effective cooperation between public authorities and service providers in the fight against CSA. The continued presence and dissemination of CSAM, and the very heterogeneous approaches of service providers, affect both private and public interests, hampering trust, innovation and growth on the Internal Market (i.e. single market for digital services). Such fragmentation increases compliance and operational costs of the actions in the fight against CSA for stakeholders such as online service providers that operate in several Member States and may lead to legal uncertainty. Non-compliant service providers may move to and continue operating from Member States where national rules are less strict. Given the cross-border and international dimension of online service provision as well as child sexual abuse online, a patchwork of national measures does not effectively protect children, and creates distortions in the functioning of the single market for digital services.
The proposed Digital Services Act will not be able to reduce this fragmentation to the extent necessary, given its horizontal nature and the specific challenges posed by CSA (see section 5.1.). For example, the DSA would not create removal obligations. Some Member States have already gone further. Germany, for example, has imposed removal obligations by law on certain providers such as social networks, as well as reporting obligations in case of detection of CSAM, specifying the data to be reported to federal law enforcement, mandatory notification of the user, and other aspects.
·Varying quality of reports
While reports from service providers via NCMEC have led to many cases of children being rescued from ongoing abuse and of offenders being arrested, law enforcement authorities estimate that only around 75% of the reports they receive from service providers are actionable. The most common reason is that the report contains material that does not constitute child sexual abuse under the Member State’s law. This is largely due to a simple fact: US-based service providers report to NCMEC material that may constitute CSA under US law, which may include content that is not illegal in the EU and omit content that is illegal in the EU. For example, the CSA Directive leaves it up to Member States whether to criminalise sexual abuse material involving individuals appearing to be a child but in fact older than 18, whereas US legislation requires that the material involve an “identifiable minor” to be illegal. On the other hand, the CSA Directive criminalises grooming only when the child is below the age of sexual consent, whereas in the US grooming is always illegal for any person under 18.
Further challenges arise as a result of the lack of unified reporting requirements clearly setting out the information to be included in reports. While US service providers are obliged to make reports to NCMEC, much of the information to be included in a report is left to the discretion of the provider. The service that NCMEC provides for US-related reports (i.e. human review of the reports to ensure that they are actionable) is typically not available for EU-related reports, due to resource constraints. A lack of sufficient information is also one of the most common reasons cited by the law enforcement authorities of the Member States for a report not being actionable.
·Lack of resources in law enforcement agencies
Absent the support provided by NCMEC to US authorities, each national law enforcement authority is left to its own devices when analysing CSAM, despite the support provided by Europol to help coordinate cases. This requires a significant investment of resources, makes it very difficult to deal effectively with the large number of reports these authorities receive, and prevents effective public-private cooperation against CSA.
·Lack of feedback from public authorities to service providers.
Currently, there is no mechanism for systematic feedback from law enforcement to companies on their reports. Where providers report content that is not illegal under the law of the relevant Member State, the provider is not made aware of that fact. This increases the likelihood of the provider reporting the same or similar content again in the future.
·Challenges due to the international and cross-border nature of CSA
There are several international and cross-border aspects to the fight against CSA online. In many cases, these are inherent in the cross-border nature of the Internet. As a result, a single incident of online abuse may involve perpetrators and victims located in multiple jurisdictions. While certain minimum standards relating to CSA crimes have been widely adopted in criminal law in many countries, and within the EU the CSA Directive contains specific requirements providing for a degree of harmonisation, specific national definitions and offences differ from one country to another.
In addition, long-standing difficulties with regard to cross-border access to electronic evidence pose a particular problem for the investigation of CSA online. Law enforcement frequently needs additional information during investigations from service providers, which are often located in another Member State, or in a third country. Existing judicial cooperation is too slow and direct cooperation between service providers and public authorities is unreliable, inconsistent and lacks transparency and accountability. Several legislative proposals and other ongoing initiatives aim to address these issues (see box 2 in Annex 6).
Furthermore, due to the existing legal framework and the often important or even dominant market position of US service providers, Member States are heavily dependent in their fight against CSA on reports received from a third country, the US, through NCMEC.
Cooperation between civil society organisations and service providers
·Cooperation challenges in notice and action procedures.
When they receive a notice from civil society organisations requesting them to remove content, service providers in more than 25% of cases refuse to take action to remove the notified content or take considerable time to do so. Whilst there can be justified reasons for not taking action or for some delays in individual cases (for instance, because of uncertainty as to whether the notified content actually constitutes CSAM under the applicable laws), there is a particularly problematic group of providers known as ‘bulletproof hosting providers’, which refuse to assume any responsibility for content stored on their servers. It should be recalled that, at present, EU law does not provide for an obligation for providers to report or act upon notified content, not even where it manifestly constitutes CSAM. Under the eCommerce Directive (Art. 14) and the proposed DSA (Art. 5, see section 5.1.), hosting service providers’ failure to act expeditiously to remove or disable access to illegal content (including CSAM) would lead to loss of the benefit of the liability exemption. In such cases, the service providers may, but will not necessarily, be liable under the applicable national laws of the Member States, depending on whether those laws provide for liability of service providers.
Cooperation between public authorities and civil society organisations
·Limited impact of hotlines’ action in the EU due to regulatory gaps.
Inability to search proactively. As noted, hotlines operating in Member States are in principle not allowed under national criminal law to search for CSAM proactively. They therefore tend to rely exclusively on reports from the public, which are limited in number and of fluctuating quality. The number of user reports is significantly lower than the number resulting from proactive efforts, as the situations in which someone comes across CSAM unintentionally and reports it are limited. User reports are also often inaccurate, in particular compared with reports from proactive searches. For example, the only hotline in Europe that conducts proactive searches, the IWF in the UK, reported that whereas about half of the reports it manages come from the public and half from proactive searches, only 10% of the total CSAM that it finds traces back to public reports, vs 90% from proactive searches.
·Inefficiencies in cooperation on assistance to victims.
For long-term assistance to victims, there is room for improvement in the cooperation between public authorities and NGOs to ensure that victims are aware of the resources available to them. In addition, currently there is no cooperation between public authorities and hotlines or other NGOs to support victims at their request in searching and taking down the material depicting them.
·Inefficiencies in cooperation on prevention.
Inefficiencies in cooperation exist notably on prevention programmes for offenders and for persons who fear that they might offend. In some Member States, NGOs carry out these programmes with limited support from public authorities. In addition, the coordination between public authorities and NGOs on the programmes they respectively offer at different stages is also limited (e.g. between the programmes that public authorities offer in prisons and the reintegration programmes that NGOs offer after the offender leaves prison).
Cooperation between public authorities, service providers and civil society organisations
·Lack of legal certainty:
-For service providers. The Interim Regulation did not create an explicit legal basis for service providers to proactively detect CSA; it only provided a temporary and strictly limited derogation from certain articles of the e-Privacy Directive to allow the continuation of voluntary measures to detect CSA, provided that these are lawful. Whereas some service providers invoke legal bases provided for in the GDPR for the processing of personal data involved in their voluntary actions to tackle CSA, others do not consider the GDPR legal bases explicit enough. This uncertainty deters some service providers from taking such voluntary action.
-For hotlines. The operation of hotlines is not explicitly provided for in EU law, and only five Member States explicitly regulate it, with others relying on memoranda of understanding. As a result, in some Member States hotlines are unable to assess the content of reports from the public or to notify the service provider directly, leading to fragmentation and ineffectiveness across the EU.
·Lack of operational standards:
Law enforcement agencies, online service providers and civil society organisations have separate systems and standards used in the detection, reporting and removal of CSA online. They vary not only between the different types of stakeholders (e.g. between law enforcement and service providers) but also between the same type of stakeholder (e.g. between law enforcement agencies in different Member States). This includes the use of multiple, differing databases of hashes used in the detection of known CSAM. This hampers the collective ability to efficiently and effectively detect, report and remove CSAM, to identify and rescue victims, and to arrest offenders.
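To illustrate why divergent hash databases hamper the detection of known CSAM, the sketch below uses hypothetical hash values and simplified exact matching; it is not a description of any operational system, which would typically rely on perceptual hashing of verified material rather than the string comparison shown here.

```python
# Illustrative sketch only: hypothetical hash sets standing in for the separate
# databases of known-CSAM hashes maintained by different organisations.
database_a = {"hash_001", "hash_002", "hash_003"}   # e.g. one hotline's list (hypothetical values)
database_b = {"hash_003", "hash_004"}               # e.g. one law enforcement list (hypothetical values)

def is_known(content_hash: str, database: set[str]) -> bool:
    """A provider consulting a single database only detects what that list contains."""
    return content_hash in database

# An item listed only in database_b goes undetected by a provider using database_a:
print(is_known("hash_004", database_a))   # False
print(is_known("hash_004", database_b))   # True

# Pooling the lists, as a common standard or shared clearinghouse would, widens coverage:
combined = database_a | database_b
print(is_known("hash_004", combined))     # True
```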
Stakeholders’ views
Public authorities identified the following among the main challenges in investigating CSA cases: a) inefficiencies in public-private cooperation between service providers and public authorities, and b) inefficiencies/difficulties with access to evidence due to technical challenges. Over 80% referred to the increased volume of CSAM detected online in the last decade and further flagged that there are insufficient human and technical resources to deal with it. These same stakeholders state that a common baseline (including a common classification system and terminology) is required to support better law enforcement and judicial cooperation and information sharing, consistent with the cross-border nature of CSAM offending.
Civil society organisations stressed the need to improve cooperation between them and law enforcement authorities (74%) in the fight against CSA online, including by providing funding to enable cooperation, organising joint trainings/meetings and ensuring better information sharing, as well as the need for legal recognition and a clear legal basis for the national hotlines. In addition, 73% of the respondents from civil society organisations pointed out that improved cooperation with service providers is needed.
Service providers highlighted the need for coordinated actions on a global level, and the importance of exchange of best practices.
2.2.3. Member States’ efforts to prevent child sexual abuse and to assist victims are limited and divergent, lack coordination, and are of unclear effectiveness
Prevention efforts
·Limited. In relation to the two main types of prevention efforts described in section 2.1.:
oPrevention efforts to decrease the likelihood that a child becomes a victim. Awareness raising and training are limited in availability, particularly for organisations and persons that come into regular and direct contact with children as part of their jobs or vocational activities, in addition to carers and parents. A vast majority of the abuse occurs in the circle of trust of the child. At the same time, those in regular and direct contact with children should have the knowledge and tools to ensure that children do not become victims, given their proximity to the child.
oPrevention efforts to decrease the likelihood that a person offends. Research into what motivates individuals to become offenders is scarce and fragmented. This lack of research makes it difficult to put in place effective programmes before a person offends for the first time, in the course of or after criminal proceedings, both inside and outside prison. As a result, there are currently very few programmes in place.
·Uncoordinated. Multiple types of stakeholders need to take action to enact a preventive approach that delivers results. This includes public authorities, the research community, NGOs, and providers of online services used by children. The various types of practitioners in this field do not communicate sufficiently with each other and with researchers on the effectiveness of the programmes, lessons learned and best practices; language can be a further barrier. Expertise and resources to establish and implement such initiatives are not evenly distributed in the EU, and successful programmes are mostly local endeavours. There are overlapping efforts in some areas, e.g. Member States designing similar programmes and campaigns in parallel, whereas other areas, such as reaching out to potential offenders, are not sufficiently addressed.
·Unclear effectiveness. The few programmes that exist are rarely evaluated to assess their effectiveness and usability. A recent systematic review of the published empirical literature on interventions to prevent child sexual abuse perpetration found only five published evaluation studies, and these were methodologically limited (e.g. four examined the same intervention, only on adults in Germany, and the fifth focused only on children aged 5 to 12).
Efforts to assist victims
·Limited. Victims of CSA do not always receive the tailored and comprehensive assistance they require, such as support in trying to stop the online sharing and distribution of the images and videos depicting their abuse, which perpetuates the harm.
·Uncoordinated. Victims of CSA require comprehensive support that brings together all relevant sectors, including health, legal, child protection, education and employment. Such coordination between relevant actors within and between Member States is lacking. The existing initiatives do not systematically make use of existing best practices and lessons learned in other Member States or globally. This translates into information gaps on help resources, gaps in specialised support, and overall inefficiency of efforts.
·Unclear effectiveness. There is little data on whether survivors have access to appropriate support, and existing research suggests that the level of satisfaction with the support received is low.
Box 8: main sources of evidence on current efforts on prevention and assistance to victims
The CSA Directive requires Member States to put in place prevention measures or programmes of the two main types described in section 2.1.1. (i.e. programmes focused on children or on possible offenders), as well as measures to assist victims. The Commission has been monitoring the transposition of the CSA Directive since 2013, when the deadline for Member States to transpose it expired. One of the main challenges for Member States concerns the transposition of the articles on prevention and assistance to victims.
Member States have generally struggled to put in place the required prevention programmes or measures, in particular those for offenders and for people who fear that they might offend, as well as assistance to victims programmes. In some cases, these programmes have not been put in place yet and in others they are in place but they do not fully comply with the requirements of the Directive. The Commission organised six dedicated workshops in 2018 and 2019 to support Member States in the transposition of these and other provisions and better understand the challenges.
These workshops, together with additional bilateral exchanges between the Commission and Member States, revealed a need for more structured and continuous support, as some aspects of prevention and assistance to victims have traditionally not been an area of focus for Member States’ action in the fight against CSA. The shortcomings typically originate in a lack of expertise in relevant areas, as well as in difficulties in communication and coordination between key actors, e.g. different ministries. In particular when it comes to measures targeting (potential) offenders, there remains significant room for improvement.
In addition to the evidence gathered through monitoring the transposition of the Directive and supporting its implementation, the feedback from stakeholders during the consultation activities, in particular from NGOs focused on children’s rights, shows the need to improve awareness and education of children, parents and caregivers. This feedback also pointed to the need to improve the availability of effective prevention programmes for offenders and for persons who fear that they might offend, as well as of programmes to assist victims.
2.3. How likely is the problem to persist?
The problem of CSA is likely to continue worsening, driven by the issues identified in the problem drivers section.
Children will continue to spend more time online and thus be more exposed to predators operating online. Similarly, predators will most likely also be spending more time online than before, as teleworking arrangements expand and become part of the post-pandemic new normal, and in response to the increase in opportunities to encounter children online.
Relevant services will continue to be misused for the purpose of CSA, in particular those that do not adopt meaningful voluntary measures. It is unrealistic to expect that, in the absence of incentives or obligations, the relevant service providers would implement sufficient voluntary measures, given that many have failed to do so to date despite the evident proliferation of CSA online. Images and videos will continue to stay online. Smaller players in particular will continue to be dissuaded by the lack of legal certainty. The fragmented legal framework can also lead to high compliance and operational costs for all service providers offering their services in the EU, since their obligations might differ and be more burdensome in one Member State than in another.
In the absence of EU action, Member States will see a need to step up and fill the gap, as some have already done or are in the process of doing. The increasing legal fragmentation concerning obligations on service providers to detect and report CSA online (known and new material and grooming) and to remove that material, as well as the uneven application of voluntary measures, would continue, in particular after the Interim Regulation expires. There are already inefficiencies in public-private cooperation between online service providers and public authorities (such as law enforcement authorities) in exchanging information in a timely manner or swiftly reacting to requests and complaints. This hampers investigations and creates obstacles to addressing child sexual abuse online and to protecting victims. Such inefficiencies would continue and potentially escalate as the overall volume of illegal activity and content grows.
The current technical solutions used to detect CSA online do not function in E2EE electronic communications. It is likely that more service providers will incorporate end-to-end encryption without effective measures to protect children. Encryption is an essential tool for ensuring cybersecurity and the protection of users’ fundamental rights such as freedom of expression, privacy and personal data, but it also makes the detection of CSA online (and therefore the protection of the fundamental rights of the child) much more difficult, if not impossible. This could result in more online ‘safe havens’ where offenders can freely exchange CSAM without fear of discovery and reprisal, normalise these crimes and actively encourage others to abuse children to generate new material, and where children may be groomed and abused online.
It is unlikely that, across the board, companies will unilaterally divert investment into developing technical solutions that allow reliable detection of CSA in encrypted systems while ensuring a high level of privacy and protection of other fundamental rights, security against unauthorised access, and transparency (see Annex 9 for a possible set of assessment criteria for these technical solutions). Deployment of these technical solutions would require financial resources to develop the solution for feasible deployment at scale and to align it with companies’ current infrastructures. Smaller companies with limited resources are especially likely to encounter difficulties, since work in this area is relatively novel and the technical tools, although available, must be tailored to the specific service.
An example of the development of these tools is the announcement of new ‘Child Safety’ initiatives by Apple. Apple is working towards deploying technical tools to detect known CSAM on users’ devices prior to encryption and storage in the cloud. The solution uses well-developed hashing technology to generate a hash of the image the user is uploading and match it against a database of hashes of verified CSAM (see Annex 8). This takes place on the user’s device prior to the image being encrypted, and does not interfere with the encryption safeguarding the transfer of data, preserving in this respect the privacy and security of data, and allowing detection of known CSAM.
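As a purely illustrative sketch of the on-device matching step described above, the snippet below hashes an image and checks it against a local list of known hashes before the image would be encrypted and uploaded. The function names, the empty placeholder database and the use of a plain SHA-256 digest are simplifying assumptions made here; deployed systems use perceptual or neural hashing of verified material and additional safeguards.

```python
# Minimal, hypothetical sketch of on-device matching against a list of known hashes
# before an image is encrypted and uploaded. Not a description of any vendor's system.
import hashlib

KNOWN_CSAM_HASHES: set[str] = set()  # hypothetical local copy of a verified hash database

def hash_image(image_bytes: bytes) -> str:
    # Real deployments derive a perceptual hash that survives re-compression or resizing;
    # a cryptographic digest is used here only to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def check_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches a known hash. Matching happens on the device,
    before encryption, so the transport encryption itself is left untouched."""
    return hash_image(image_bytes) in KNOWN_CSAM_HASHES
```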
However, a number of companies and privacy NGOs state that there is no way to deploy such tools to detect CSA in the context of encrypted electronic communications while ensuring the protection of privacy and the security of communications. While such tools do not interfere with the encryption as such, they are seen as violating the spirit of end-to-end encryption, to the extent that end-to-end encryption suggests a wholly private exchange in which even illegal content is shielded, for the benefit of ensuring everyone’s privacy. It is therefore likely that spontaneous developments in encrypted communications that take into consideration children’s safety and privacy and all fundamental rights at stake will remain limited, given in particular the legal uncertainty and vocal opposition from some stakeholders.
As children will be increasingly exposed to predators online, prevention will play a particularly important role. Parents and children will need the knowledge and tools to protect themselves. Without a solid and structured approach to awareness raising and education for the benefit of children, parents and caregivers, children will continue to fall victim to sexual abuse in greater numbers. This concerns online abuse, which may be followed by crimes committed offline, but it also applies to purely offline abuse. While awareness of the problem is currently on the rise in a number of Member States when it comes to abuse in organised sports or other activities targeting children, an effective and systematic prevention response is still lacking. Whether sexual abuse takes place offline or online, children will therefore often continue to lack information on where to seek help, and the adults around them will not be in a position to notice or remedy the problem.
On the other side of the problem, people who are attracted to children will continue using the online space to find victims. Those who may want to seek support to overcome this attraction will often not dare to come forward for fear of legal consequences and social stigma. Instead, they will likely continue to seek information online, and will often be drawn by other predators into committing crimes rather than finding professional help. Therefore, initiatives addressing the more apparent aspects of prevention, such as awareness raising, will not be enough to address the entire problem, and the CSA issue is likely to continue growing. While there are some initiatives that reach out to persons who fear they may offend, without EU-level support and coordination they will likely remain limited, unevenly distributed and of varying effectiveness.
Increased online activity and the consequent exposure of children to predators will unavoidably result in more victims. Victims will continue to have difficulties accessing long-term assistance. Without more developed support systems in all EU Member States, the situation of victims will continue to vary. However, even in Member States with more advanced support systems, many victims will be left to face the psychological, physical and economic consequences of CSA without proper assistance once the immediate proceedings around the crime are closed. In cases where the crime is never reported, victims and their families may not know where to seek help, or that they should be entitled to it.
Another problem that victims will likely continue to face on their own is the effort to have their images and videos taken down swiftly and effectively. As this is a matter of practical action against illegal content rather than of harmonised criminal law, it could not adequately be addressed in a revision of the CSA Directive or the Victims’ Rights Directive, and it is too specific a problem to have been included in the DSA proposal. As long as there is no proactive search for these images and videos, they will often stay online.
3.Why should the EU act?
3.1. Legal basis
In accordance with settled case law by the Court of Justice of the EU, the legal basis of a legislative initiative has to be determined in light of the content and aim of the envisaged measures. Given that these measures are in part still under assessment, at this stage, no definitive conclusions can yet be drawn in this respect.
That said, given the problems that this impact assessment is addressing and the solutions proposed, Article 114 TFEU was identified as the most likely legal basis for an EU intervention. Article 114 TFEU is the basis for measures which have as their object the establishment and functioning of the internal market. In particular, Article 114 is the appropriate legal basis to address differences between provisions of Member States’ laws which are such as to obstruct the fundamental freedoms and thus have a direct effect on the functioning of the internal market, and to prevent the emergence of future obstacles to trade resulting from differences in the way national laws have developed.
This initiative aims to ensure the proper functioning of the internal market, including through the harmonisation of rules and obligations concerning certain online service providers in relation to providing services which are at high risk of being used for child sexual abuse and exploitation online. As highlighted above under Section 2.2.2, Member States have started taking action unilaterally, adopting or considering rules to deal with the challenge posed by child sexual abuse online, which are necessarily national in scope and risk fragmenting the Digital Single Market. This initiative aims to ensure common rules creating the best conditions for maintaining a safe online environment with responsible and accountable behaviour of service providers. At the same time, the intervention provides for the appropriate supervision of relevant service providers and cooperation between authorities at EU level, with the involvement and support of the EU Centre where appropriate. As such, the initiative should increase legal certainty, trust, innovation and growth in the single market for digital services.
Articles 82 and 83 TFEU, which constitute the legal basis for the CSA Directive, provide a basis for criminal law rules concerning, inter alia, the rights of victims of crime and the definition of criminal offences and sanctions in the areas of particularly serious crime with a cross-border dimension such as sexual exploitation of children. As the present initiative would not seek to harmonise criminal law, Articles 82 and 83 TFEU are not the appropriate legal basis.
3.2. Subsidiarity: necessity of EU action
A satisfactory improvement as regards the rules applicable to relevant online service providers active on the internal market, aimed at stepping up the fight against CSA, cannot be sufficiently achieved by Member States acting alone or in an uncoordinated way. In particular, a single Member State cannot effectively prevent or stop the circulation online of a CSA image or video, or the online grooming of a child, without the ability to cooperate and coordinate with the private entities who provide services in several (if not all) Member States. As presented above under Section 2.1., several Member States have taken, or are in the process of taking, the initiative to adopt national laws in order to step up the fight against the proliferation of CSA online. Although these approaches share the same objective, the way they pursue it differs, targeting for instance different types of services and introducing varying requirements and different enforcement measures.
In the absence of EU action, Member States would have to keep adopting individual national laws to respond to current and emerging challenges with the likely consequence of fragmentation and diverging laws likely to negatively affect the internal market, particularly with regard to online service providers active in more than one Member State (see problem driver section 2.2.2.). Individual action at Member State level would also fail to provide a unified system for cooperation in the fight against these crimes between public authorities and service providers, leaving them to deal with different legal systems and diverging rules instead of one harmonised approach.
This initiative would build on the DSA proposal, which creates a harmonised baseline for addressing all illegal content, to create a coherent system throughout the EU for the specific case of CSA content, which is characterised in particular by its non-public nature and the gravity of the crimes. Such a coherent system cannot be achieved at Member State level, as also set out in detail in the Impact Assessment accompanying the DSA proposal.
3.3. Subsidiarity: added value of EU action
Reduce fragmentation and compliance/operational costs, improving the functioning of the internal market
Legal fragmentation (divergence in national legislation to address these issues) increases compliance and operational costs of the actions in the fight against CSA for stakeholders such as online service providers that operate in several Member States and may lead to legal uncertainty in particular when the fragmentation also causes conflicts of laws. EU action would provide legal certainty and a coherent approach applicable to entities operating in several Member States, facilitating the scaling up and streamlining of their efforts in the fight against CSA and improving the functioning of the Digital Single Market.
Given the cross-border aspects of the problem, having regard to the inherent cross-border nature of the Internet and to the many services provided online, the number of policy areas concerned (single market for digital services policy, criminal law, economic issues, and fundamental rights including the rights of the child, freedom of expression, privacy and data protection), and the large range of stakeholders, the EU seems the most appropriate level to address the identified problems and limit legal fragmentation. As previously described, CSA, in particular in its online aspects, frequently involves situations where the victim, the abuser, and the online service provider are all under different national legal frameworks, within the EU and beyond. As a result, it can be very challenging for single countries to effectively define the role of and cooperation with online service providers without common rules and without fragmenting the Single Market (see problem driver section 2.2.2.).
Facilitate and support Member States’ action on prevention and assistance to victims to increase efficiency and effectiveness
While Member States are best placed to assess the gaps and needs, and implement action in their local context, they often lack information on what prevention and assistance to victims programmes are available, how effective they are, and how to approach their implementation in practice – who needs to be involved, and what the technical and legal prerequisites and estimated costs are. EU level action can provide a forum for exchange of necessary information and expertise to avoid duplication of efforts and blind spots. EU action can also help identify best practices and lessons learned at national level (from Member States or third countries) and incorporate them into EU-level initiatives, so that other Member States can benefit from them. This may also prevent a “whack-a-mole” effect in which a Member State successfully addresses a problem in its territory but the problem just moves to another Member State (e.g. hosting of CSAM online).
While some exchange in this area exists, the feedback from experts in the field indicates there is a need for a structured framework for such exchanges. EU level action promoting and disseminating research would help to enrich the evidence base in both areas and could possibly even link initiatives across Member States, boosting efforts. EU action could also include practical support to local interventions, e.g. translations of existing materials from another Member State, possibly leading to significant cost savings at national level.
The EU level action on prevention and assistance to victims at issue here would not impose any additional obligations beyond those included in the CSA Directive. Indeed, the main focus of the present initiative is on strengthening the functioning of the internal market by setting common rules aimed at combating the misuse of online services for CSA-related purposes. Nonetheless, the action could also contribute to facilitating and supporting Member States’ work to comply with the existing obligations, notably through the sharing of expertise and best practices, benefitting from the central position the EU Centre would occupy in connection with its principal tasks regarding the detection and reporting of online CSA.
Reduce dependence on and facilitate cooperation with third countries
Currently, in practice, law enforcement authorities of the Member States depend almost entirely on NCMEC, a private organisation located in the US, as the main source of reports of CSA online. EU action could ensure, among other things, that such dependence is reduced and that the detection, reporting and removal of CSA online is done through EU mechanisms that operate according to EU rules, including the necessary safeguards. In addition, EU mechanisms could be more closely linked to what is illegal in the EU and its Member States, rather than relying on definitions from third-country jurisdictions. This would enhance the precision of efforts, reduce the impact on third parties, and better target measures.
4.Objectives: What is to be achieved?
4.1. General objective
The general objective is to improve the functioning of the internal market by introducing clear, uniform and balanced EU rules to prevent and combat CSA, notably through imposing detection, reporting and removal obligations on certain online service providers.
4.2. Specific objectives
There are three specific objectives that address the problem drivers identified in section 2.2.:
1.Ensure the effective detection, reporting and removal of online CSA where they are currently missing. This specific objective is of particular relevance to problem driver 1, as the current voluntary action by online service providers and under diverging national laws is insufficient to effectively detect, report and remove CSA online across the EU, i.e. by not detecting some crimes or by not being effective in dealing with those detected. It is also of relevance to problem driver 2, since part of the current inefficiencies in the detection, reporting and removal process are due to inefficiencies in public-private cooperation.
2.Improve legal certainty, transparency and accountability and ensure protection of fundamental rights. This specific objective is of particular relevance to problem driver 1, as the current voluntary action by online service providers and the action taken under diverging national laws is not sustained on a clear, uniform and balanced EU-level framework that provides long-term legal certainty, transparency and accountability and ensures protection of fundamental rights. This objective therefore reflects the need to create a clear framework, with the appropriate safeguards to ensure respect for children’s rights and all users’ rights, including the right to freedom of expression, right to private life and communications as well as data protection, and to provide regular information about its functioning, including e.g. transparency reports on technologies used for the identification of CSA content.
3.Reduce the proliferation and effects of CSA through harmonisation of rules and increased coordination of efforts. This specific objective is of particular relevance to problem drivers 2 and 3. Coordination issues are at the core of the inefficiencies in public-private cooperation in problem driver 2, and improved coordination could boost Member States’ efforts on prevention and assistance to victims.
Contribution to relevant SDGs
The three specific objectives directly contribute to achieving the SDG targets most relevant for this initiative: target 5.2, to eliminate all forms of violence against women and girls, and target 16.2, to end abuse, exploitation, trafficking and all forms of violence against children.
Specific objectives 1 and 3 also directly contribute to achieving other SDGs of relevance, such as SDG 1 on poverty and SDG 3 on health, by reducing the proliferation and effects of CSA and ensuring the detection, reporting and removal of CSA online where they are currently missing. Contributing to preventing and/or stopping the abuse can reduce the negative consequences on health, including mental health, which may have a negative impact on the economic future of the child (e.g. through substance abuse or decreased productivity). Specific objective 3 helps achieve SDG 4 on education (e.g. through the awareness raising campaigns or the exchange of related best practices facilitated by the EU Centre). Finally, specific objective 2 helps achieve SDG 9 on industry, innovation and infrastructure (e.g. as the initiative aims to support service providers’ efforts to fight CSA online, including through increased legal certainty and safeguards that do not hamper innovation in the technologies to detect, report and remove CSA online).
5.What are the available policy options?
5.1. What is the baseline from which options are assessed?
In the baseline scenario no further EU policy action is taken. The following section assesses the most likely scenario in the absence of the initiative, i.e. how the existing and already planned policy instruments would address the problems and objectives for EU action identified:
1.Legislation
Existing and upcoming EU legislation is not likely to effectively address challenges in detection, reporting and removal of CSA online and prevention of CSA, and assistance to victims. The proliferation of CSA online would be expected to continue in line with current developments. Specifically, the added value (i.e. what it can achieve in preventing and combatting CSA) and the limitations of the existing and upcoming EU legal instruments are the following:
Horizontal instruments
The GDPR:
·What it can achieve in the fight against CSA: online service providers have relied on legal bases in the GDPR for the processing of personal data required in relation to their voluntary activities to combat CSA online, e.g. under legitimate interest (Art. 6(1)(f)) or vital interest (Art. 6(1)(d)) considerations.
·Limitations: the GDPR as a horizontal instrument does not contain CSA-specific provisions, i.e. provisions that explicitly allow or mandate the processing of personal data for the purpose of combatting CSA online.
The ePrivacy Directive and its proposed revision
·What it can achieve in the fight against CSA: the ePrivacy Directive and its proposed revision allow restrictions of certain rights and obligations under their scope, inter alia to prevent or prosecute CSA. Such restrictions require a proportionate legislative measure, under national or EU law. With the entry into force of the Interim Regulation, subject to compliance with a set of conditions, certain rights and obligations are temporarily limited (Articles 5(1) and 6(1) of the ePrivacy Directive for certain providers of online communications services), for the sole purpose of detecting and reporting CSA online and removing CSAM.
·Limitations: As horizontal instruments, the ePrivacy Directive and its proposed revision do not contain CSA-specific provisions. Member States are notably responsible for enforcement through their competent national authorities (see also Interim Regulation below).
The eCommerce Directive
·What it can achieve in the fight against CSA: with regard to hosting services, the eCommerce Directive is notably the basis for the notice and action mechanism in which parties such as users or hotlines notify online service providers of the presence of CSAM available in their services, so that it can be removed.
·Limitations: the eCommerce Directive does not contain CSA-specific provisions, i.e. provisions that explicitly enable or oblige online service providers to detect, report or remove CSA online. Furthermore, as noted, while failure to act expeditiously can lead to the hosting service providers not being able to invoke the liability exemption (and could thus be held liable under national law), there is no legal obligation upon the service providers to act, even when notified of manifestly illegal CSA.
The Digital Services Act
·What it can achieve in the fight against CSA: the DSA proposal, once adopted, will:
oprovide a horizontal standard of obligations for content moderation by providers of intermediary services; eliminate disincentives for these providers’ voluntary efforts to detect, identify and remove, or disable access to illegal content; and create obligations for them to provide information on their content moderation activities and on their users when requested by national authorities. These provisions are likely to encourage providers to implement voluntary measures and will also create more transparency and accountability for providers’ content moderation efforts in general;
ocreate due diligence obligations tailored to certain specific categories of providers (notice and action mechanism, statement of reasons, internal complaint-handling system, reacting swiftly to notices issued by trusted flaggers, notification of suspicions of criminal offences etc.) and transparency reporting obligations. In particular, it will oblige very large platforms to assess risks and implement the necessary risk mitigation measures on their services. These measures will encourage users and trusted flaggers to report suspected illegal content and providers to follow up on these reports more swiftly. The obligations on very large platforms are also likely to contribute to lessening the prevalence of illegal content online and users’ exposure to such content;
oestablish rules on its own implementation and enforcement, including as regards the cooperation of and coordination between the competent authorities. This can lead to faster and more efficient content moderation efforts across the EU, including with regard to CSAM.
·Limitations. Due to its general and horizontal nature and focus on public-facing content, the DSA only addresses the issue of CSA partially. Its approach is appropriate for the wide range of heterogeneous illegal content for which the DSA sets the overall baseline, but it does not fully address the particular issues concerning the detection, reporting and removal of CSA online. Specifically:
oVoluntary detection: the DSA does not specify the conditions for the processing of personal data for the purpose of voluntarily detecting CSA online;
oMandatory detection: the DSA does not include any obligation to detect CSA online. Obligations to carry out risk assessments and take effective risk mitigating measures, as applicable, apply only to the largest online platforms, consistent with their general nature;
oReporting: although it contains some provisions in this respect, the DSA does not provide for a comprehensive CSA reporting obligation, since it focuses on cases where an offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place. Also, given the diverse nature of content that could be concerned, the DSA does not determine specific reporting requirements (i.e. what minimum information the report should contain) and does not provide for the involvement of a body like the EU Centre in the reporting process.
oRemoval: like the eCommerce Directive (see above), the DSA sets out liability exemptions that encourage removal, but it does not include any removal obligations.
In particular, while the DSA, once adopted, should show significant impact especially when it comes to publicly accessible content, its effect is likely to be less pronounced on content exchanged secretly and in non-public channels (e.g. in interpersonal communications), as is typical for the majority of CSA online. Considering this and the above limitations, the DSA will not eliminate the risks of legal fragmentation introduced by the national initiatives on combatting CSA online. These are likely to provide a more specific and targeted approach than the DSA, partially targeting different services, in order to ensure an effective and targeted response to CSA online.
The Victims’ Rights Directive
·What it can achieve in the fight against CSA: as a horizontal instrument, the Victims’ Rights Directive covers the assistance, support and protection to all victims of crime. The CSA Directive contains additional specific rules that respond more directly to the specific needs of CSA victims.
·Limitations: the Victims’ Rights Directive refers to the need to cooperate with other Member States to improve the access of victims to the rights set out in the Directive, but it does not contain specific mechanisms to do so. Moreover, as mentioned above, this Directive is not limited to CSA victims, who may require dedicated mechanisms to facilitate the exchange of best practices that take into account their specific needs.
Sector-specific legislation
The Child Sexual Abuse Directive
·What it can achieve in the fight against CSA: the CSA Directive focuses on defining the role of Member States and their public authorities in preventing and combating these crimes and in assisting victims. Specifically, the Directive defines criminal behaviour online and offline, sets the minimum level of maximum sanctions, and requires Member States to ensure adequate assistance and support to victims, as well as to put in place prevention measures.
·Limitations: as a criminal law instrument, the CSA Directive does not aim to regulate online service providers and so it does not provide sufficient specification of the role of service providers and the procedures to apply. In addition, the scope of the actual obligations (as a criminal law instrument) has to be limited to each Member State’s own territory, which makes it a less effective tool given the global nature of the Internet.
The Interim Regulation
·What it can achieve in the fight against CSA: it makes it possible for providers of number-independent interpersonal communications services to continue or resume their voluntary measures to detect and report CSA online and remove CSAM, provided they are lawful and, in particular, meet the conditions set.
·Limitations: as a temporary measure with the aim of bridging the period until long-term legislation (that is, the present initiative) is put in place, it applies only for three years (until 3 August 2024) and does not establish a legal basis for any processing of personal data. The service providers within the scope of the Interim Regulation would therefore not be able to continue their voluntary activities when the Regulation ceases to apply. In addition, the Interim Regulation is not suitable to offer a long-term solution, since it only addresses one specific part of the problem, for a limited subset of services (number-independent interpersonal communications services), and relies fully on voluntary approaches.
The Europol Regulation and its proposed revision
·What it can achieve in the fight against CSA: the revised mandate of Europol should enable Europol, in cases where private parties hold information relevant for preventing and combatting crime, to directly receive, and in specific circumstances, exchange personal data with private parties. Europol would analyse this data to identify all Member States concerned and provide them with the information necessary to establish their jurisdiction. To this end, Europol should be able to receive personal data from private parties, inform such private parties of missing information, and ask Member States to request other private parties to share further additional information. These rules would also introduce the possibility for Europol to act as a technical channel for exchanges between Member States and private parties. Such a development would contribute to increasing the level of cooperation between the three aforementioned stakeholders, potentially improving the effectiveness of CSA investigations.
·Limitations: in and of itself, the revised mandate of Europol will not contribute to a comprehensive solution to address CSA online, which requires a multi-faceted approach. Enabling a more efficient exchange of personal data between Europol and private parties is a necessary but not a sufficient condition for achieving this objective.
2.Coordination
EU level cooperation in investigations
·What it can achieve in the fight against CSA: the existing EU level cooperation in investigations has produced significant successes in the fight against CSA and will likely continue to do so.
·Limitations: the ability of Europol and law enforcement agencies in the EU to cooperate in investigations is limited by the resources that they can allocate to this crime area. For example, Europol has only been able to examine 20% of the 50 million unique CSAM images and videos in its database. The EU Centre could play an important role in supporting Europol in these tasks.
EU level cooperation in prevention
·What it can achieve in the fight against CSA: the network of experts on prevention will continue developing and adding more members, both researchers and practitioners, mostly from the EU but also globally, so that it can ultimately support Member States in implementing the prevention articles of the CSA Directive.
·Limitations: currently, the Commission services themselves are supporting the work of the network by coordinating its work and providing a secretariat. However, there are limits to the level of support these services can provide to the network, in particular as the network expands. The activities of the network could therefore be constrained to a level that would not allow it to reach its full potential of support to Member States.
EU level cooperation in assistance to victims
·What it can achieve in the fight against CSA: the Victims’ Rights platform would facilitate the exchange of best practices mostly on horizontal issues related to victims’ rights, and mostly on policy-related issues.
·Limitations: the focus on horizontal issues could limit the effectiveness of the platform for CSA victims, given the specificities of these crimes and their short- and long-term effects on victims.
Multi-stakeholder cooperation at EU and global level
·What it can achieve in the fight against CSA: at EU level, the EU Internet Forum (EUIF) has facilitated discussion between public authorities and online service providers in the EU in the fight against CSA at all levels, from ministerial to technical (see annex 8 for an example of output of technical discussions under the EUIF). It is expected that similar discussions continue in the future.
At global level, the WPGA has advanced countries’ commitment towards a more coordinated response to the global fight against CSA, based on global threat assessments, and a model national response. These have helped to clarify the challenges and assist member countries in setting achievable practical goals, and it is expected that they will continue to do so in the future.
·Limitations: at EU level, the focus of the EUIF is to facilitate targeted exchanges between public authorities and online service providers. The forum is not designed for discussions with a wider variety of stakeholders, including practitioners. Moreover, participation is voluntary and there are no legally binding obligations.
At global level, the EU will continue supporting global efforts through the WPGA. In the absence of a single European information hub, exchanges of expertise and best practices with leading centres worldwide (e.g. Australian Centre to Counter Child Exploitation, NCMEC, Canadian Centre for Child Protection) will be limited. This will in particular concern initiatives on prevention and assistance to victims, leaving EU Member States to their own devices.
3.Funding
·What it can achieve in the fight against CSA: action using EU funding is likely to continue in the current project-based form, both as calls for proposals as well as research projects. EU-funded projects will continue to facilitate development of e.g. relevant IT tools for law enforcement and interventions aimed at preventing CSA and helping victims.
·Limitations: the current project-based efforts would be extended from grant to grant without long-term sustainability. Such a long-term perspective may be supported by individual Member States with a national focus, but a comprehensive EU-wide approach and reinforced framework will continue to be lacking. The risk of projects duplicating existing efforts will still be high; moreover, the uptake of successful projects will likely remain limited to participating countries.
***
In summary, the existence and magnitude of the problem suggests that the existing policy instruments in the fight against CSA (legislation, coordination and funding) are not sufficient to ensure an effective response:
·Legislation: the horizontal instruments (such as the eCommerce Directive, the ePrivacy Directive and its proposed revision or the DSA proposal) address some of the problems and challenges but, given the specific challenges of CSA, can only provide limited and partial solutions. The sectoral instruments (the CSA Directive, the Europol Regulation or the Interim Regulation) focus on particular aspects of the problem such as harmonisation of criminal laws or improving police investigations, which by themselves are not able to provide a comprehensive EU-level solution. Also, none of these instruments define the role of service providers in combating child sexual abuse specifically enough to provide them with legal certainty, nor do they include effective obligations for the providers relevant to the fight against child sexual abuse.
·Coordination: inefficiencies persist despite the existing mechanisms, particularly in some areas of prevention and assistance to victims. The sharing of best practices and expertise between Member States is minimal and unsystematic. The current level of ambition and of collaboration between the various public and private stakeholders results in ad-hoc and temporary solutions and is rarely effective in addressing CSA. As a result, Member States have been facing difficulties in fulfilling some of their obligations under the CSA Directive, which ultimately means that prevention measures are not sufficient to protect children and stop offenders from committing crimes, and victims do not receive appropriate support.
·Funding: action using EU funding is mostly project-based, and the uptake of EU funding is not optimal. For example, some Member States do not always make use of the funds available to them to tackle CSA (e.g. through the Internal Security Fund national programmes), possibly due to lack of knowledge on what funding is available and where it could be applied. Projects that take place, either national or cross-border, run the risk of replicating what has already been done due to lack of coordination.
Considering the above, the most likely scenario in the absence of the initiative (long-term solution) would include the following:
·following the end of the period of application of the Interim Regulation (three years after its entry into force), and in the absence of other legislation of this kind at EU or Member State level, providers of number-independent interpersonal communications services would no longer be permitted to detect and report CSA, and would not be able to continue deploying their voluntary measures with the adequate safeguards protecting users’ fundamental rights, while the proliferation of CSA online would continue. As such service providers are currently the source of the majority of reports made by service providers, the number of such reports (and therefore overall reports) could eventually decrease significantly;
·a similar drop in reports could be expected with the broader deployment of E2EE by default in these services;
·Member States’ law enforcement authorities would continue to receive the (fewer) reports through NCMEC, submitted by a small number of service providers and assessed in accordance with US law, which has different definitions of illegal content than EU law. The quality of the reports would remain at today’s levels;
·victims’ images and videos will continue to circulate online. Law enforcement authorities will be unaware of the undetected crimes and unable to identify and rescue victims and investigate and prosecute these cases;
·the full potential of the hotlines would remain underutilised as they would continue to lack a legal basis to search for CSAM proactively, despite proactive searching being more effective than relying solely on users’ reports;
·without harmonised standards on the responsibilities and actions expected from service providers in the fight against CSA, their different approaches will fail to offer a reliable standard for the protection of users’ rights;
·the worsening situation would increase pressure on Member States to take action at national level once the Interim Regulation expires in order to address the legal vacuum, creating a risk of further fragmentation of the Single Market. A patchwork of national measures would not effectively protect children, given the cross-border and international dimension of the issues, and would create distortions in the functioning of the single market for digital services. While these will be partially addressed by the DSA, once adopted, a significant degree of fragmentation is expected to persist and possibly grow, given the manifestly illegal nature of CSAM and the specific channels for its dissemination and proliferation (see problem driver section 2.2.2.);
·without further EU facilitation of efforts, Member States’ action on prevention and assistance to CSA victims is not likely to significantly improve. The sharing of best practices between Member States will continue to be sporadic and unstructured, and the current limitations in the effectiveness of existing programmes are likely to persist, as well as the duplication of efforts.
Baseline costs
In the baseline scenario, no costs would be incurred by the creation and running of the Centre or any new organisation. However, the inefficiencies in the prevention, investigation and assistance to victims of child sexual abuse are expected to have a negative economic impact on society. A higher number of victims will experience a diminished quality of life, likely resulting also in productivity loss, and will require significant support, putting a strain on public services.
The economic impact on public authorities will depend upon the level of action taken by service providers, which will dictate the number of reports received by those authorities. The economic impact on service providers will depend on their level of engagement against these crimes. The existing legal fragmentation and legal uncertainty would remain and could act as a barrier to growth and innovation within the single market for digital services and hamper the fight against CSA. In the absence of a central hub, fragmented efforts would continue, driving up the economic costs for individual entities.
As seen in box 4, the impact of CSA on its victims generates significant costs. Assuming similar costs and prevalence of CSA in the US as in the EU, adjusting for the larger population in the EU, the estimated annual CSA costs in the EU (and therefore the cost of no action) is EUR 13.8 billion.
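For illustration only, the short Python sketch below shows the population-scaling arithmetic described above. The figures used are hypothetical placeholders and are not those of the assessment (the actual US cost estimate and population data are the ones referred to in box 4); the sketch assumes, as stated above, similar prevalence and per-victim costs in the US and the EU.
# Illustrative sketch of the population-scaling approach (hypothetical figures).
# Assumes equal prevalence and per-capita CSA costs in the US and the EU.
us_annual_cost_eur = 10.0e9      # placeholder: annual CSA cost estimate for the US, in EUR
us_population = 330_000_000      # approximate US population (placeholder)
eu_population = 447_000_000      # approximate EU population (placeholder)
eu_annual_cost_eur = us_annual_cost_eur * (eu_population / us_population)
print(f"Estimated annual CSA cost in the EU: EUR {eu_annual_cost_eur / 1e9:.1f} billion")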
5.2. Description of the policy options
In the determination of available policy options, three main considerations played a decisive role.
First, there are important rights at stake: on the one side, the rights of the child to be protected and the interest in preventing the circulation of CSAM as illegal content violating the intimacy and right to privacy of the victim; on the other side, the rights of all users especially to freedom of expression, privacy of communications and data protection. Naturally, the rights and interests of the providers, such as freedom to conduct business, are to be taken into account as well.
Second, offenders have proven savvy at moving to services that are less effective in detecting CSA online. Consequently, the policy options need to ensure an even application of the rules, in order to avoid simply pushing the problem from one platform onto another.
Third, more effective measures must not amount to imposing a general obligation on providers of intermediary services to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal activity. The Commission has recently confirmed its commitment to this principle, as reflected at present in Article 15(1) of the e-Commerce Directive and in Article 7 of the DSA proposal.
Box 9: prohibition of general monitoring obligations
·according to the case law of the Court of Justice of the EU on this prohibition, a service provider can in principle be ordered to take measures to detect and remove an item of defamatory content, even if it means monitoring the content provided by other users than the one who had initially posted the content;
·such an obligation can also be extended to content equivalent to the defamatory content, subject however to a number of conditions (only minor differences as compared to the defamatory content, sufficient specifications by the court issuing the order, no need for an independent assessment by the service provider).
All policy options that can be considered therefore need to meet a number of specific requirements in order to limit any interference with fundamental rights to what is strictly necessary and to ensure proportionality and compliance with the prohibition of general monitoring obligation:
·Obligations have to be targeted to those services which are at risk of being used for sharing CSAM or for grooming children.
·They have to strike an appropriate balance between the interests and (fundamental) rights associated with ensuring an effective approach to combating CSA and protecting children and their rights, on the one hand, and on the other hand the interests and rights of all users, including freedom of expression, privacy of communications and data protection, as well as avoiding an excessive burden on the service provider.
·To ensure that balance, they have to contain appropriate conditions and safeguards to ensure proportionality, transparency and accountability. Given the significant impact on fundamental rights, the effectiveness of the measures and of these conditions and safeguards should be subject to dedicated monitoring and enforcement mechanisms.
In line with the above requirements, the policy options assessed take a graduated approach, addressing the problem drivers from different angles and in various degrees, with an increasing level of obligations and intrusiveness. This cumulative logic was chosen because the measures that form the options not only are not mutually exclusive, but are also complementary, presenting synergies that the combined options can benefit from.
As a result, in addition to the baseline, five options are retained for assessment, as first presented in the intervention logic in table 1. The building blocks of these options are the retained policy measures that resulted from scoping and analysing the full spectrum of possible EU intervention, from non-legislative action to legislative action.
Figure 3 below shows how the measures combine to form the retained policy options:
Figure 3: overview of policy options and corresponding measures
The retained policy options were selected for their potential to contribute to creating a level playing field across the EU, lessening legal fragmentation, increasing efficiency in tackling the problem (e.g. by facilitating Member States action through sharing of expertise), and creating more balanced circumstances for all the affected providers, while also contributing to reducing their compliance and operational costs.
5.2.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
This option is non-legislative and includes practical measures to stimulate cross-sectorial cooperation among relevant stakeholders in prevention and assistance to victims, and enhance voluntary detection, reporting and removal of CSA online by relevant online service providers, within the boundaries of the existing legal framework (measure 1). This option also includes an EU Centre to support and facilitate information sharing on prevention and assistance to victims (measure 2).
1.Practical (i.e. non legislative) measures to enhance and support voluntary efforts of relevant information society service providers to detect, report and remove CSA online, and to enhance prevention and assistance to victims. Examples of practical measures to enhance detection, reporting and removal include developing codes of conduct and standardised reporting forms for service providers, improving feedback mechanisms and communication channels between public authorities and service providers, and facilitating the sharing of hashes and detection technologies between service providers. Examples of practical measures to enhance prevention and assistance to victims include facilitating research and the exchange of best practices, facilitating coordination, and serving as a hub of expertise to support evidence-based policy in prevention and assistance to victims.
2.EU Centre on prevention and assistance to victims.
This measure would create an EU-funded expertise hub, managed by the Commission with support from a contractor (similar to the Radicalisation Awareness Network, RAN). Among others, it would support Member States in implementing the relevant provisions of the CSA Directive (e.g. through expert workshops), and serve as a hub of expertise to support evidence-based policy and avoid duplication of efforts. It would also help develop and disseminate research and expertise, and facilitate dialogue among stakeholders. This would allow Member States to benefit from best practices and lessons learned in the EU and globally. Having both prevention and assistance to victims in the same hub would increase the possibilities for coherence and cross-fertilisation between both strands of work.
The purpose of prevention efforts led by the EU Centre would be to support Member States in putting in place tested and effective prevention measures that would decrease the prevalence of CSA in the EU and globally. The scope of these efforts would cover the two main types of prevention initiatives, i.e. 1) those that reduce the likelihood that a child becomes a victim (e.g. awareness raising and educational campaigns and materials for schools), and 2) those that reduce the likelihood that a person (re)offends. The Centre would facilitate Member States’ action on prevention by serving as a hub of expertise at the service of Member States, notably to help avoid duplication of efforts and to foster an evidence-based approach to prevention policies.
Under the lead of the EU Centre, a network of experts on prevention would facilitate the development of these efforts, the involvement of multiple stakeholders and the sharing of best practices and lessons learned across Member States. The network would enable a virtuous cycle of practice to research and research to practice, while enabling the cascading down of best practices and new developments from EU and global level to national and regional levels. The Centre would support the work of the network by e.g. hosting relevant repositories of best practices, providing statistics and other data relating to the prevalence of offending, offender profiles and pathways, and new crime trends particularly those relating to perpetrators’ use of technology to groom and abuse children.
The EU Centre will not have any power to impose any initiative on prevention on Member States, i.e. it will not coordinate in the sense of determining “which Member State is obliged to do what”. Its tasks in this respect will be ancillary to its principal tasks, which relate to the implementation of the detection and reporting processes.
With regard to assistance to victims, the Centre would play a similar role: facilitating the implementation of the practical measures on assistance to victims by serving as a hub of expertise to support the development of evidence-based policy and research, including on victims’ needs and the effectiveness of short- and long-term assistance programmes. In addition, the Centre could provide resources to help victims find information on support that is available to them locally or online. The Centre would not provide assistance to victims directly where those services are already provided or would be best provided at national level, to avoid duplication of efforts. Also, the Centre would serve as a facilitator at the service of Member States, including by sharing best practices and existing initiatives across the Union. In that sense, it would facilitate the coordination of Member States’ efforts to increase effectiveness and efficiency. Similarly to prevention, the Centre will not have any power to impose any initiative on assistance to victims on Member States, including on issues concerning health, legal, child protection, education and employment matters.
The possibility to create an EU Centre on prevention and assistance to victims is further explored in Annex 10, as implementation choice A. As existing entities or networks cannot be expected to fulfil this role, a central entity is the most viable solution. The Centre could also help to improve the cooperation between service providers and civil society organisations focusing on prevention efforts.
5.2.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal
This option combines the non-legislative option A with legislation to improve the detection, reporting and removal of CSA online, applicable to service providers offering their services in the EU. It would 1) provide a long-term regulatory framework for voluntary detection (measure 4); 2) put in place mandatory reporting in case CSA online is found (measure 5); and 3) set up an EU Centre to facilitate detection, reporting and removal of CSA online, as well as prevention and assistance to victims (measure 3).
1)Legal framework for voluntary detection of CSA online. This measure would build on and complement the DSA proposal, to address the specific challenges inherent in CSA that cannot be addressed with general systems building on notification by users and trusted flaggers as envisaged by the DSA, and provide a framework for relevant service providers to voluntarily detect CSA online, including known and new CSAM and grooming. It would replace the Interim Regulation, building on its safeguards in a more comprehensive framework covering all relevant services, i.e. also those defined in the DSA and not only the electronic communications services within the scope of the Interim Regulation (i.e. providers of instant messaging and email). The legal framework would provide increased legal certainty also when it comes to the basis and conditions for processing of personal data for the sole purpose of detection of CSA online.
Given in particular the impact on fundamental rights of users, such as personal data protection and confidentiality of communications, it would include a number of mandatory limits and safeguards for voluntary detection. These would notably include requiring service providers to use technologies and procedures that ensure accuracy, transparency and accountability, including supervision by designated national authorities. The legislation could set out the information rights of users and the mechanisms for complaints and legal redress.
Stakeholders’ views from the open public consultation on voluntary measures
The percentage of responses to the open public consultation from each of the main stakeholder groups that indicated that the upcoming legislation should include voluntary measures to detect, report and remove CSA online was the following: public authorities 25%, service providers 13%, NGOs 9%, and general public 10%. The support for voluntary measures was highest for known material and lowest for grooming (e.g. 11.3% for known material, 9.7% for new material and 6.5% for grooming in the NGO group).
2)Legal obligation to report CSA online. Relevant service providers would be required to report to the EU Centre any instance of suspected CSA that they become aware of, based on voluntary detection measures or other means, e.g. user reporting. This obligation would build on and complement the reporting obligation set out in Article 21 of the DSA proposal, covering the reporting of criminal offences beyond those involving a threat to the life or safety of persons (e.g. possession of CSAM). In order to enforce the reporting obligations, competent national authorities in the Member States would be designated. The legislation would also include a number of conditions (e.g. to ensure that the reports contain actionable information) and safeguards (e.g. to ensure transparency and protection of personal data, see section 5.2.3.).
Legal obligation to remove CSA online. As mentioned earlier, under the eCommerce Directive and the DSA proposal, hosting service providers are required to expeditiously remove (or disable access to) CSAM that they obtain actual knowledge or awareness of, or risk being held liable due to the resulting unavailability of the liability exemptions contained in those acts. Given that this system encourages but does not legally ensure removal, it would be complemented by rules ensuring a removal obligation in cases of confirmed CSA online; where necessary, national authorities would be empowered to issue a removal order to the concerned providers requiring them to remove the specific CSAM on their services. The rules would be accompanied by the necessary conditions (e.g. to ensure that the removal does not interfere with ongoing investigations) and safeguards (e.g. to ensure transparency and protection of personal data and freedom of expression), including rules on redress. Member States’ national authorities would be competent for enforcement, relying where relevant also on the expertise of the Centre.
SMEs would also be required to report and remove in accordance with the above rules, benefiting however from additional support by the Commission and the Centre through:
·tools to facilitate the reporting and removal, made available by the EU Centre at no cost, for SMEs to use in their services if they wish, reducing their financial and operative burdens;
·guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations; and
·specific training, delivered in collaboration with Europol and the national authorities.
3)EU Centre to prevent and counter CSA. The Centre would incorporate the supporting functions relating to prevention and assistance to victims of measure 2 and add the ability to support the detection, reporting and removal efforts, including by helping ensure transparency and accountability. Specifically, it would:
·facilitate detection by providing online service providers with clear information on what constitutes CSA in the EU, through access to a database of CSA indicators (e.g. hashes, AI patterns/classifiers) to detect CSA in their services (an illustrative sketch of the basic hash-matching principle follows this list). The Centre would help create and maintain this database of indicators that would reliably enable the detection of what is defined as CSA according to EU rules (notably the CSA Directive), as determined by courts or other independent public authorities. The material would come from multiple sources including previous reports from service providers, concluded investigations by law enforcement, hotlines or direct reports from the public to the EU Centre (e.g. from survivors requesting the Centre’s support to have materials depicting their abuse taken down). The Centre would also facilitate access (in particular for SMEs) to free-of-charge technology that meets the highest standards for the reliable, automatic detection of such content;
·facilitate reporting, by becoming the recipient of the reports of CSA concerning the EU that providers detect in their online services. The Centre would serve as an intermediary between service providers and other public authorities (notably law enforcement authorities), supporting the reporting process by 1) reviewing the reports to ensure that those other public authorities do not need to spend time filtering out reports that are not actionable and can make the most effective use of their resources; and 2) facilitating the communication between those other public authorities and service providers in case of requests for additional information from public authorities or requests for feedback from service providers (if needed);
·facilitate removal, by notifying service providers, in certain cases, of materials considered to be known CSAM and requesting their removal, as well as following up on these requests. This would entail supporting victims who request to have material that features them taken down; no such service exists to date. The Centre could also be given a mandate to conduct, in certain cases, searches for CSAM using the databases of indicators. The Centre could track whether the removal has taken place. Where removal is not effected in a timely manner, the Centre could refer the matter to national authorities for action (e.g. issuing of removal orders).
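The Python sketch below is a minimal, hypothetical illustration of the hash-matching principle underlying detection of known CSAM against a database of indicators such as the one the EU Centre would maintain. It assumes simple exact cryptographic hashing; the technologies actually referred to in this initiative typically rely on perceptual hashing (robust to re-encoding and resizing) and AI classifiers for new material and grooming, which are not shown here.
import hashlib

def file_hash(data: bytes) -> str:
    # Exact cryptographic digest of the file's bytes; real detection systems
    # generally use perceptual hashes that survive re-encoding or cropping.
    return hashlib.sha256(data).hexdigest()

def matches_known_indicator(data: bytes, known_indicators: set[str]) -> bool:
    # Returns True if the file's hash appears in the set of verified indicators
    # (in this initiative, indicators would be supplied by the EU Centre).
    return file_hash(data) in known_indicators

# Hypothetical usage: 'indicators' stands in for the Centre's verified hash list.
indicators: set[str] = set()
flagged = matches_known_indicator(b"uploaded file bytes", indicators)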
Box 10: distribution of tasks between the EU Centre and Member States
Prevention and assistance to victims: although this would not constitute its principal task, the Centre could, through the functions described in section 5.2.1., help facilitate Member States’ efforts in these two areas, notably to comply with their obligations under the CSA Directive. This initiative would not introduce new obligations on Member States on prevention and assistance to victims, including in relation to the cooperation with the Centre, which would remain an optional resource at the service of Member States that wish to benefit from it.
Detection, reporting and removal of CSA online: the Centre, through the functions described above, will also serve as a facilitator of Member States’ efforts on investigations, as well as a facilitator of service providers’ efforts to comply with the obligations under this initiative, particularly in relation to detection and reporting. The Centre would not have the capacity to initiate or conduct investigations, as these will remain under the responsibility of national law enforcement, or coordinate them, as this will remain under the responsibility of Europol. It will not be empowered to order service providers to remove CSAM, either.
Given the key functions above, the Centre would become a fundamental component of the legislation, as it would serve as a key safeguard, by acting both as the source of reliable information about what constitutes CSA online and as a control mechanism to help ensure the effective implementation of the legislation. The Centre would ensure transparency and accountability, by serving as a European hub for the detection, reporting and removal of CSA online. In receiving reports, the Centre would notably have visibility on the effectiveness of detection (including rates of false positives), reporting and removal measures, and on the spreading of CSAM and grooming across different platforms and jurisdictions.
Box 11: independence of the EU Centre
To be able to play its main role as a facilitator of the work of service providers in detecting, reporting and removing the abuse, and of the work of law enforcement in receiving and investigating the reports from service providers, it is essential that the Centre be independent:
·from service providers, to be able to serve both as the source of reliable information about what constitutes CSA online, providing companies with the sets of indicators on the basis of which they should conduct the mandatory detection, and as a control mechanism to help ensure transparency and accountability of service providers; and
·from law enforcement authorities, as the Centre must be neutral to be an effective facilitator and must ensure that it maintains an objective, fair and balanced view.
To ensure this, the Centre will be subject to periodic reporting to the Commission and to the public.
The Centre should also be independent from national public entities of the Member State that would host it, to avoid the risk of prioritising and favouring efforts in this particular Member State.
The Centre would also reduce the dependence on private organisations in third countries, such as NCMEC in the US, for the fight against CSA in the EU. The Centre would operate within the EU and under EU rules and would reduce the need for international transfers of personal data of EU residents to third countries, notably the US.
To be able to carry out its functions, specifically to support the process of detection, reporting and removal, the Centre would, in accordance with the EU’s personal data acquis, be provided with the appropriate legal basis to allow it to process personal data where needed. The Centre would be able to cooperate with service providers, law enforcement, EU institutions, but also with similar entities worldwide, such as NCMEC, given the global nature of CSA.
Discussion of the implementation choices for the Centre
This section summarises the process to determine the preferred implementation choice for the Centre, explained in detail in Annex 10.
The process had three stages: 1) mapping of possible implementation choices; 2) analysis of the choices and selection of the most promising ones for further analysis; 3) qualitative and quantitative analysis of the retained choices and determination of the preferred choice.
1) Mapping of possible implementation choices
Currently there is no entity in the EU or in Member States that could perform the intended functions for the Centre without significant legislative and operational changes, and therefore no obvious/immediate choice for the implementation of the Centre.
The process to determine the implementation choices started with a mapping of existing entities and their present functions and forms in order to identify possibilities to build on existing structures and make use of existing entities, or simply use them as possible references or benchmarks for setting up a new entity of the same type. For the mapping purposes, the examples were divided into two main types, depending on whether they required specific legislation to be set up:
1) entities that do not require specific legislation to be set up:
a)Centre embedded in a unit in the European Commission (DG HOME, e.g. Radicalisation Awareness Network, RAN).
b)Entity similar to the EU centre of expertise for victims of terrorism.
2) entities that require specific legislation to be set up:
a)Centre fully embedded in an existing entity:
oEU body:
§Europol;
§Fundamental Rights Agency (FRA).
oOther:
§national entity (public or private such as an NGO);
§international entity (e.g. INHOPE network of hotlines).
b)Centre set up as a new entity:
oEU body:
§executive agency (e.g. European Research Executive Agency (REA), European Education and Culture Executive Agency (EACEA));
§decentralised agency (e.g. European Monitoring Centre for Drugs and Drug Addiction (EMCDDA), European Institute for Gender Equality (EIGE), European Union Intellectual Property Office (EUIPO)).
oOther:
§national entity:
·foundation set up under national law (e.g. Academy of European Law (ERA), set up under German law);
·Member State authority (e.g. new Dutch administrative authority to combat CSA and terrorist content online, under preparation).
§international entity:
·inter-governmental organisation (e.g. European Space Agency (ESA), European Organisation for the Safety of Air Navigation (EUROCONTROL));
·joint undertaking (public-private partnership, e.g. Innovative Medicines Initiative, Clean Sky Joint Undertaking);
·non-governmental organisation (e.g. CEN/CENELEC, EuroChild).
The mapping also included three relevant entities outside of the EU, which carry out similar functions to those intended for the EU centre, and which could provide useful references in some areas (e.g. costs, organisational issues, etc).
·US National Centre for Missing and Exploited Children (NCMEC);
·Canadian Centre for Child Protection (C3P); and
·Australian Centre to Counter Child Exploitation (ACCCE).
Finally, the mapping also included possible combinations of the above choices (i.e. functions distributed between several entities), in particular with Europol:
·Europol + a unit in the Commission;
·Europol + an NGO (e.g. a hotline);
·Europol + new national entity.
2) Analysis of the choices and selection of the most promising ones for further analysis
The analysis of the possible choices took into account the following criteria:
·Functions, i.e. the ability to effectively carry out the intended functions to contribute to achieving the specific objectives of the initiative. Specifically:
oFacilitate prevention efforts.
oFacilitate support to victims.
oFacilitate the detection, reporting and removal of CSA online, including by ensuring accountability and transparency.
·Forms, i.e. the form in which the Centre is set up, and the extent to which that form supports carrying out the intended functions. Specifically:
oLegal status: both the legal basis to set up the centre (if any) and the legislation to allow it to perform its functions (e.g. processing of personal data).
oFunding: the sources that would ensure the centre’s long-term sustainability and independence, while avoiding conflicts of interest.
oGovernance: it should ensure 1) proper oversight by the Commission, other relevant EU institutions and the Member States; 2) participation of relevant stakeholders from civil society organisations, industry, academia and other public bodies (in particular considering that the Centre would need to work very closely with Europol, the Fundamental Rights Agency and national authorities); and 3) independence and neutrality of the Centre from overriding private and political interests, so that it can maintain a fair and balanced view of all the rights at stake and play its main role as facilitator.
Each of the possible implementation choices mapped earlier was analysed according to the above criteria. This detailed analysis led to discarding a number of possible choices, in particular having the Centre fully embedded in Europol, notably due to:
·Challenges in carrying out certain tasks in connection with assistance to victims and prevention, particularly acting as a hub for information and expertise, some of which are significantly different from the core law enforcement mandate of Europol. Adding these tasks would require a revision of the mandate and significant capacity-building efforts, with the risk that these tasks would eventually be deprioritised compared to the core tasks of supporting investigations. While Europol has an explicit empowerment to set up centres under Art. 4 of the Europol Regulation, these centres are of a different nature and refer to internal departments focusing on implementing Europol’s existing mandate in relation to specific types of crime. This empowerment therefore cannot be used to expand Europol’s mandate to cover the new tasks.
·Constraints of being part of a larger entity. Being part of a larger entity could limit the ability of the Centre to dispose of its own resources and dedicate them exclusively to the fight against CSA, as it could be constrained by other needs and priorities of the larger entity. It may also limit the visibility of the Centre, as child sexual abuse is only one of the many types of crime Europol deals with. Moreover, fully embedding the Centre in Europol could create an imbalance: it would be difficult to justify Europol expanding its mandate to cover prevention and assistance to victims only in the area of child sexual abuse. This could lead to Europol gradually deviating from its core law-enforcement mandate and covering prevention and assistance to victims in multiple crime areas, becoming a “mega centre” too complex to attend adequately to the specificities of the different crime areas.
·Difficulty in appearing as an independent and neutral facilitator. The intended main role of the Centre is to facilitate, for both service providers and law enforcement authorities, the process of detection, reporting and removal of CSA online. Europol’s core mandate, however, is to support law enforcement. This may prevent Europol from appearing to all parties involved as an independent and neutral facilitator in the entire detection, reporting and removal process. Furthermore, service providers expressed during the consultations legal concerns about working too closely with law enforcement on the detection obligations, in particular if they are required to use the database of CSA indicators made available by the Centre for these detection obligations. There is a risk that content data of CSA online (i.e. images, videos and text) could not be used for prosecution in the US. This is because the US legal framework (US Constitution) prevents the use of content data detected by companies acting as “agents of the state”, as could be the case if the companies were mandated to detect content using a database of indicators (e.g. hashes/AI classifiers) provided by law enforcement rather than by a non-law enforcement entity.
Another choice that was discarded following analysis was setting up the Centre as a private law body under the national law of the Member State hosting it. The main reason is that the Centre would not be able to carry out effectively the function of supporting the detection, reporting and removal of CSA online. These tasks imply implementing EU law, which in principle only Member States or the Commission can do.
The detailed analysis of all the possible implementation choices resulted in three “legislative” choices (i.e. that require legislation to set up the Centre) retained for the final assessment:
1.Creating a self-standing, independent EU body (i.e. a dedicated decentralised agency) with all the intended centre functions: to support detection, reporting and removal of CSA online, and facilitate Member States’ efforts on prevention and assistance to victims.
2.Tasking Europol with supporting detection, reporting and removal of CSA online and creating an independent private-law entity (or tasking an existing one) for prevention and assistance to victims.
3.Tasking the Fundamental Rights Agency (FRA) with all functions.
3) Qualitative and quantitative analysis of the retained choices and determination of the preferred choice.
Qualitative analysis
1.Centre as a self-standing EU body (decentralised EU agency):
Arguments in favour:
·Independence, which would allow it to help ensure transparency and accountability of companies’ efforts to detect CSA online and serve as a major safeguard and a fundamental pillar of the long-term legislation. Independence is essential to the Centre’s key function as facilitator and intermediary between private companies and public authorities. The legislation setting it up could be designed in a way that 1) guarantees the sustainability of the Centre through stable EU funding; and 2) establishes a governance structure that ensures appropriate oversight by the Commission and includes the participation of Member States and relevant stakeholders.
·Ability to dispose of its own resources, fully dedicated to the fight against CSA. Staff dedicated solely to the mandate of the Centre, rather than having to meet other objectives as part of a larger entity. Possibility to receive secured funding from the EU budget. Political accountability for its financial management would be ensured through the annual discharge procedure and other rules ordinarily applicable to decentralised agencies.
·Greater visibility of EU efforts in the fight against CSA, which would help facilitate the cooperation between the EU and stakeholders globally.
·Possibility to carry out all relevant functions in the same place (contribute to the detection of CSA online, support and assist victims and facilitate prevention) and liaise with all relevant stakeholder groups, which creates higher EU added value and a more effective and holistic response against CSA.
Arguments against:
·Annual costs would likely be slightly higher than in the other choices. These annual costs are indicative and could be higher or lower depending on the precise set-up and number of staff needed (see cost summary table in the quantitative assessment section below). The budget to cover this funding would need to be found within the 2021-2027 Multiannual Financial Framework, from the Internal Security Fund budget.
·Setting up the Centre (including deciding on the seat of the agency) and making it fully operational would require significantly more time and effort, as it would not be possible to build on existing institutional legal frameworks (although these could serve as a reference): a new mandate would have to be created, and a number of dedicated non-law enforcement experts would have to be found, hired and trained, including for management and control functions. The need for increased supervision would entail an increased workload at DG HOME, and additional staff could be needed.
·The cooperation with Europol and national law enforcement would have to be created anew.
2.Part of the Centre within Europol and part as an independent entity:
Arguments in favour:
·The annual costs would most likely be lower than for creating a new body, as the Centre would benefit from economies of scale with Europol (e.g. building, infrastructure, governance, management and control systems), although the savings on building and governance could be partly offset by the costs of the new separate entity (see cost summary table below).
·The part of the Centre as part of Europol could directly benefit from its expertise and established mechanisms (including concerning personal data protection) to deal with the reports from service providers.
Arguments against:
·The ability of the Centre to serve as a major player and safeguard in the detection and reporting process, a key feature of the long-term legislation, would appear limited as it would not be independent from law enforcement.
·In the case of false positives, companies would be reporting innocent persons to law enforcement directly.
·The ability of the Centre to dispose of its own resources and dedicate them to the fight against CSA may be limited by other needs and priorities of Europol in other crime areas. This could also jeopardize its ability to deliver on these additional and visible tasks.
·Europol would be dedicating a substantial amount of resources to tasks such as manually reviewing the reports from companies to filter out false positives, determining the jurisdiction best placed to act, etc. Given the limited availability of law enforcement officers, that may not be the best use of law enforcement resources, which could otherwise be dedicated to conducting investigations leading to the rescue of victims and the arrest of offenders.
·Less visibility of EU efforts in the fight against CSA, as these would be split between two entities, and Europol’s area of focus is vast, which could limit its ability to facilitate the cooperation between the EU and stakeholders globally.
3.Tasking the Fundamental Rights Agency (FRA) with all functions:
Arguments in favour:
·Annual costs would most likely be slightly lower than creating a new body, as the centre could benefit from economies of scale with FRA (e.g. governance, management and control system). The initial costs would also be slightly lower than creating a new body or in the Europol+ option, thanks to the possibility to leverage the existing building and infrastructure (see cost summary table below).
·The focus of FRA on fundamental rights could reinforce the perception of independence, which is key to help ensure transparency and accountability of companies’ efforts to detect CSA online and of the outcome of the follow up of the reports by law enforcement. This would also allow FRA to serve as a major safeguard of the detection process.
·In the case of false positives, companies would not be reporting innocent persons to law enforcement directly.
·Possibility to carry out all relevant functions in the same place (contribute to the detection of CSA online, support victims and facilitate prevention) and liaise with all relevant stakeholder groups.
Arguments against:
·The ability of the Centre to dispose of its own resources and dedicate them to the fight against CSA may be limited by other needs and priorities of FRA. This could jeopardize its ability to deliver on these additional and visible tasks.
·Although it would be possible to build on the existing institutional framework to some extent, repurposing it may still entail significant effort to accommodate these new tasks in a long-existing and established entity.
·The setup of FRA and its governance structure are specific to its current mandate. Significant changes to that mandate and the governance structure would be required in order to integrate the EU Centre into FRA. Given past difficulties in revising the mandate of FRA, there would also be significant additional risks in reopening the relevant regulation.
·The cooperation with Europol and national law enforcement would have to be created anew.
·The annual and initial costs may be lower than creating a new body but they will still be substantial, e.g. to find, hire and train a number of dedicated non-law enforcement experts, and to carry out the centre functions (including manually reviewing the reports from companies to filter false positives, determining the jurisdiction best placed to act, and supporting Member States on prevention and assistance to victims).
·There would be a significant imbalance in FRA’s mandate: as it would double in size, half of it would be dedicated to CSA and the other half to its current tasks.
Quantitative analysis
Costs.
The following table summarises the estimated costs for the three retained implementation choices of the EU Centre:
Table 2: summary of estimated costs for the implementation options of the EU centre
| | 1. EU body (e.g. agency) | 2. Europol + separate entity – Europol | 2. Europol + separate entity – Separate entity | 2. Europol + separate entity – Total | 3. FRA |
| Staff – Detection, reporting, removal – Operational staff (number of people) | 70 | 70 | N/A | | 70 |
| Staff – Detection, reporting, removal – Overheads staff (number of people) | 15 | 5 | | | 5 |
| Staff – Prevention – Operational staff (number of people) | 10 | N/A | 10 | | 10 |
| Staff – Prevention – Overheads staff (number of people) | 4 | | 4 | | 2 |
| Staff – Assistance to victims – Operational staff (number of people) | 10 | | 10 | | 10 |
| Staff – Assistance to victims – Overheads staff (number of people) | 4 | | 4 | | 2 |
| Total staff (number of people) | 113 | 75 | 28 | 103 | 99 |
| Staff (MEUR/year) | 15,9 | 10,6 | 3,9 | 14,5 | 13,9 |
| Infrastructure – Initial costs (MEUR) | 5 | 4 | 1 | | 4 |
| Infrastructure – Annual costs (MEUR/year) | 3,2 | 2,4 | 1,2 | 3,6 | 3,2 |
| Operational expenditure (MEUR/year) | 6,6 | 2,5 | 3,5 | 6,0 | 6,6 |
| Total annual costs (MEUR) | 25,7 | 15,5 | 8,6 | 24,1 | 23,7 |
| Total initial costs (MEUR) | 5 | | | 5 | 4 |
As a reference, existing agencies of comparable size have the following actual annual costs:
| | FRA | EMCDDA |
| Staff – Number of people | 105 | 100 |
| Staff – MEUR/year | 14,7 | 12,2 |
| Staff – People/MEUR | 7,1 | 8,2 |
| Infrastructure (MEUR/year) | 2,2 | 2,1 |
| Operational expenditure (MEUR/year) | 7,4 | 4,7 |
| Total (MEUR/year) | 24,3 | 19 |
As indicated above, 28 posts corresponding to the prevention and assistance to victims functions in all options could be non-EU staff and be covered by a call for proposals/grant. In particular, in the case of option 2, Europol + separate entity, the possibility to cover these posts through a call for proposals/grant would not remove the need for a separate entity, as the envisaged prevention and assistance functions are currently not carried out by any organisation. Even if an existing entity applied for the potential call for proposals/grant, it would need to expand to accommodate the 28 posts, with the estimated infrastructure costs of e.g. rental of buildings, IT systems and audits, and the operational expenditure costs of e.g. support to expert networks, translation and interpretation, dissemination of knowledge and communication (see Annex 10, section 4.2.). Furthermore, a single separate entity should deal with both the prevention and assistance to victims functions to ensure organisational efficiency, given the strong interlinkages between both functions.
Annex 4 includes additional information on the points considered in the above estimates.
Benefits.
The main quantitative benefits derive from savings as a result of reduction of CSA associated costs, i.e. savings relating to offenders (e.g. criminal proceedings), savings relating to victims (e.g. short and long-term assistance), and savings relating to society at large (e.g. productivity losses).
It is assumed that the implementation choice that is the most effective in fulfilling the functions of the Centre would also be the one helping achieve that highest reduction of CSA and therefore the one with the highest benefits. Annex 4 contains estimates of these benefits, to be taken into account for the sole purpose of comparing the options. As it is expected that a dedicated EU agency would be the most effective in fulfilling the Centre functions, it would also be the one generating the highest benefits.
Preferred option
The analytical assessment and comparison process above indicates that the preferred implementation option for the Centre would be a dedicated EU decentralised agency. This is the option that would best contribute to achieve the specific objectives of the initiative, while respecting subsidiarity and proportionality and protecting fundamental rights. It will be possible to provide the EU agency with the necessary legal framework to carry out its functions, in particular those in relation to facilitating the detection, reporting and removal of CSA online.
The Centre would be set up as a dedicated, decentralised EU agency, in accordance with the common approach agreed by the European Commission, the European Parliament and the Council of the EU in 2012. As an EU agency, it would be financially independent and funded by the EU, which would further support the Centre’s independence.
In addition to the periodic reporting to the Commission and to the public described above, the Commission and Member States would further supervise the Centre and its activities, in accordance with the general rules applicable to decentralised EU agencies. These rules include in particular a governance structure that supports both the independence of the agency and the participation of relevant stakeholders, notably through a management board with representatives of all Member States and the Commission, an executive board, and an executive director appointed following an open and transparent selection procedure.
In terms of organisation, the Centre would work closely with the EU Agency for Law Enforcement Cooperation (Europol), the EU Agency for Fundamental Rights (FRA) (e.g. in contributing to transparency and accountability as well as to assessments of the fundamental rights impact of new measures), national law enforcement and other relevant authorities, as well as the national hotlines. This setup would ensure that existing resources can be relied upon to the maximum extent possible while preserving the independence that is fundamental to the role of the Centre.
Box 12: relations between the Centre as a new EU agency and Europol
The Centre as a new EU agency would cooperate closely with Europol, in particular on facilitating the reporting of CSA online, as described above.
The Centre would be the recipient of the reports from service providers. It would review these reports and ensure that they are actionable, i.e. that they are not manifestly unfounded and could thus lead law enforcement authorities to initiate an investigation where they deem this necessary and appropriate. In doing so, the Centre would ensure that possible false positives do not reach law enforcement and that service providers are informed of possible errors. These tasks could free up resources at Europol and national law enforcement agencies, which are currently dedicated to filtering the reports.
Once the Centre confirms that the report is actionable, it would forward it to Europol and/or national law enforcement for action in accordance with the existing rules, including as regards Europol’s mandate. Europol could enrich with criminal intelligence the reports received from the Centre, identifying links between cases in different Member States, sharing the reports with national law enforcement agencies and supporting these agencies by facilitating cross-border investigations. The Centre would not have any competence to launch investigations; this would remain under the exclusive competence of national law enforcement authorities.
The Centre would also notably cooperate closely with Europol on the preparation of the databases of indicators, on the basis of which the service providers would be required to detect CSA online, building on existing databases at Europol and at national level. New material from reports (from service providers, hotlines and/or the public) and finished investigations by law enforcement will, where justified in view of confirmation by courts or independent administrative authorities, be added to these databases in the form of newly generated indicators, to ensure that they remain updated and as relevant as possible.
Box 13: European Parliament views on the EU Centre
The European Parliament has welcomed the idea to establish the European Centre to prevent and counter child sexual abuse that the Commission first announced in the 2020 EU strategy for a more effective fight against child sexual abuse, following the call of the Parliament in 2019 for an EU child protection centre that would help ensure an effective and coordinated response to child sexual abuse in the EU.
In addition, during the negotiations for the Interim Regulation, Members of the European Parliament repeatedly expressed their expectations that an EU Centre could help limit the international transfers of personal data of EU citizens to the US, hold companies accountable, and publish transparency reports about the detection, reporting and removal process.
Stakeholders’ views on the EU Centre to prevent and counter CSA
All the main stakeholder groups that responded to the open public consultation supported the creation of an EU Centre that would provide additional support at EU level in the fight against CSA online and offline, to maximize the efficient use of resources and avoid duplication of efforts. The support was highest among academia and research institutions (100% of responses), as well as public authorities and NGOs (85% of responses). 40% of the responses from service providers, business associations and the general public expressed explicit support.
More than half of the responses (51% of all responses to the consultation) indicated that the Centre could support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU. It could also support victims in ensuring removal of child sexual abuse material online depicting them. The Centre could serve as a hub for connecting, developing and disseminating research and expertise, as well as facilitating the communication and exchange of best practices between practitioners and researchers.
Public authorities pointed out that the Centre could maintain a single EU database of hashes of known CSAM in order to facilitate its detection in companies’ systems (76% of responses from this group). The Centre could also support taking down CSAM identified through hotlines (62% of responses from this group).
Service providers indicated in the targeted consultations that they would prefer to report to an EU Centre rather than to law enforcement directly, as they currently do in the US with NCMEC.
Stakeholders' views on new CSA legislation from the open public consultation
Respondents from public authorities (62% of the total responses from this group), companies (56%), business associations (60%) and civil society organisations (74%) supported new legislation to ensure legal certainty for those involved in the fight against CSA. In particular, the legislation should:
·provide the right incentives for the detection of CSAM;
·provide a clear legal basis for the processing of personal data to detect, report and remove CSA online;
·clarify and resolve conflicts and fragmentation in existing, pending and proposed legislation across Member States as well as at EU level; and
·be future-proof (i.e. remain effective despite future technological developments).
5.2.3. Option C: option B + mandatory detection of known CSAM
This option builds on option B and imposes on relevant providers an obligation to perform a risk assessment on whether their services are likely to be used for the sharing of known CSAM and propose mitigating measures to reduce that risk. Where the risk assessment (after proposing the mitigating measures) reveals a level of risk that is not minor, national competent authorities would issue orders to detect material that has previously been reliably confirmed by courts or other independent public authorities as constituting CSAM. These orders would be limited in time and would apply regardless of the technology used in the online exchanges, including whether the service is encrypted, to ensure that the legislation is technology neutral. The obligation to detect would be limited to relevant service providers in this context, i.e. those identified as the main vectors for sharing and exchange of known CSAM. Only a subgroup of the providers required to submit a risk assessment would receive a detection order, based on the outcome of the risk assessment taking into account the proposed mitigating measures. The legislation would list possible risk factors that the providers should take into account when conducting the risk assessment. In addition, the Commission could issue guidelines to support the risk assessment process, after having conducted the necessary public consultations.
Known CSAM is the most common type of CSA online currently detected (in 2020 service providers reported seven times more known images and videos than new ones, and 2600 times more known images and videos than grooming cases, see section 2.1.1.). The detection of new CSAM and grooming would remain voluntary, whereas reporting and removal (upon the reception of a removal order) would be mandatory for all types of CSA online, as described in option B. In order to ensure its effectiveness, effective and proportionate sanctions would be instituted for providers who fail to comply with the obligation. These sanctions would be imposed by Member States’ competent national authorities. More specifically, the process would look as follows:
Mandatory risk assessment
Relevant service providers would be required to assess the risk that their services are misused to distribute known CSAM. The risk factors to consider could include, depending on the service concerned:
·the business model of the service provider,
·its corresponding user base, including whether the service is available directly to end users (as opposed to, e.g., providing services to businesses),
·the verification of user identity in the registration process,
·the possibility to share images and videos with other users, e.g. by message or through sharing of a link to resources hosted on the service provided,
·in services offering a chat/messaging functionality, the possibility to create closed groups, which can be joined upon invitation from a member only,
·the way in which the services are designed and operated,
·the ways in which the services are actually used, and any corresponding impact on the risk of distribution of known CSAM,
·previous detection of CSAM on the service or on a similar service with a comparable risk profile.
As part of the risk assessment, the service provider could request support from the Centre and/or competent national authorities in performing detection tests on representative anonymised samples, in order to establish whether known CSAM is present.
Providers would then be required to report to the competent national authority on the risk assessment and on any mitigating measures that they plan to adopt or have already adopted. The competent national authority would review the risk assessment and determine whether the assessment has been properly conducted and whether the mitigation measures proposed by the service provider are sufficient. If needed, the competent national authority could request the service provider to resubmit the risk assessment or additional information pertaining to it.
Detection order
On the basis of this risk assessment and the criteria laid down in the initiative, the competent national authority would decide whether a detection order for known CSAM should be issued to each specific service provider, by a court or an independent administrative authority (which could be the national authority if it meets the independence criteria). A service provider falls under the jurisdiction of the Member State in which it has its main establishment or in which – if it has no main establishment in the EU – it has designated a legal representative, building on the approach already adopted in the Terrorist Content Online Regulation and proposed in the DSA. Competent national authorities would cooperate in a network to ensure harmonised application of the rules, building where possible on the structures to be put into place for the DSA. The detection order would be limited in time and renewable based on an updated risk assessment, and would be accompanied by specific supervisory powers for the authorities, including on the detection technology deployed, and by measures to ensure transparency. Suitable redress for affected service providers would be provided for.
Support by the EU Centre
The EU Centre would support service providers in three ways:
1)By providing practical or technical information to service providers that could help them give effect to their legal obligations, and by contributing to the preparation of guidance and best-practice documents where needed;
2)By making available to service providers a database of indicators of known material (e.g. hashes and URLs) that providers would be required to use to facilitate accurate detection of known CSAM. The indicators would correspond to material confirmed as illegal in the EU, as set out above.
In addition, the Centre would also facilitate access for service providers to free-of-charge detection tools. These tools would be automated, have a high accuracy rate, and have proven reliable for over a decade (see box 14 below and annex 8, section 1). Providers would not be mandated to use the tools provided by the Centre, as long as their own tools meet the requirements (safeguards) specified in the legislation (see below). Responsibility for the use of these tools and any resulting decisions would remain with the service providers themselves.
3)By reviewing the reports submitted by service providers to ensure accurate reporting to law enforcement, and providing support, including through feedback on accuracy, to further improve accuracy levels, to prevent imposing excessive obligations on the providers and in particular to avoid imposing the obligation to carry out an independent assessment of the illegality of the content detected.
The support of the Centre would be particularly useful to SMEs, which would also be subject to the above requirements and could thus also receive a detection order from national authorities. The Centre and the Commission could provide additional support to SMEs in the form of guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations. It may also be possible to provide specific training, in collaboration with Europol and the national authorities.
Box 14: hashing and URL detection tools
Hashing is the most common technology to detect known CSAM. The most broadly used example is Microsoft’s PhotoDNA. It creates a unique digital fingerprint (‘hash’) of the image or video and compares it to a database containing hashes of material verified as being CSAM. If the hash is not recognised, no information is kept. The technology does not identify persons in the image/video and does not analyse the context.
·PhotoDNA has been in use for over 10 years by organisations globally, including service providers, NGOs and law enforcement in the EU. Its rate of false positives is estimated at no more than 1 in 50 billion, based on testing. Microsoft provides PhotoDNA for free, subject to a licensing agreement requiring strict limitation of use to the detection of CSAM. Organisations wishing to use the technology must register and follow a vetting process by Microsoft to ensure that the tool will be used by the right organisation for the sole purpose of detecting CSAM.
·Other examples of hashing technology used for these purposes, and operating on similar principles, include YouTube CSAI Match, Facebook’s PDQ and TMK+PDQF.
·The largest database of hashes is held by NCMEC, with more than four million hashes of CSAM images and 500 000 hashes of CSAM videos. Every hash contained in the database has been viewed and agreed upon as being CSAM by two experts at NCMEC on the basis of strict criteria (see Annex 8).
URL lists are also used to detect known CSAM. Currently they are typically prepared by national authorities (e.g. law enforcement, such as the National Centre for Combating Child Pornography in Italy, or the Judicial Police in France, OCLCTIC, supervised by the National Commission on Computing and Freedoms, CNIL, and supported by the national hotline Point de Contact) and transmitted to internet service providers to block access. Some Member States (e.g. Bulgaria) use Interpol’s Worst of List (IWOL), which contains addresses of images and videos that depict severe abuse of real children younger than 13, and which have been verified by public authorities from at least two different countries or agencies.
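As a purely illustrative sketch of the matching logic described in this box, the Python fragment below compares hashes of files against a set of indicators supplied from a central database. It is a simplification: production tools such as PhotoDNA rely on proprietary perceptual hashing, which tolerates re-encoding and resizing, whereas a cryptographic hash is used here only to keep the example short; all names (KNOWN_INDICATORS, check_upload) are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of indicators distributed from a central database
# (each entry would be the hash of material already confirmed as illegal).
KNOWN_INDICATORS: set[str] = set()


def hash_file(path: Path) -> str:
    """Hash the file's bytes; a stand-in for a perceptual hash such as PhotoDNA."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def check_upload(path: Path) -> bool:
    """Return True on a match; if the hash is not recognised, nothing is kept."""
    return hash_file(path) in KNOWN_INDICATORS
```

A match would then feed into the provider’s reporting workflow; as noted above for PhotoDNA, no information is retained about content whose hash is not recognised.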
Stakeholders’ views from the open public consultation on mandatory detection
Public authorities that responded to the consultation were in favour (81% of respondents) of mandatory detection, including in encrypted systems.
Some companies (31%) and business associations (40%) considered that such an obligation should not apply regardless of whether these services use encryption. Business associations also stressed the role of encryption in ensuring the online safety and confidentiality of communications of marginalised groups and groups at risk, and that encryption should not be weakened.
Children’s rights NGOs were in favour of mandatory detection also in encrypted systems, while pointing out that it should be in line with applicable privacy and other laws.
Privacy rights NGOs stressed the need of preserving strong encryption, and opposed all solutions identified to detect CSA in encrypted systems.
Individuals stressed that service providers should not be obliged to detect CSA online in encrypted services.
Conditions and safeguards
The obligation to detect known CSAM would apply regardless of the technology deployed in the online exchanges. As described in the problem definition (section 2.2.1.), some technologies used in online exchanges require adaptation of existing detection technology to detect CSA online: for example, while the principal methodology of comparing hashes would remain unchanged, the point in time at which identification is performed would need to be adjusted in end-to-end encrypted communications, to take place outside the communication itself. In addition, a number of companies have developed tools that seek to identify CSA online using metadata. While these tools are not yet comparable to content-based analysis tools in terms of accuracy, child protection and accountability, they could possibly develop to an equivalent standard in the future. Also, some providers have already deployed tools that perform content-based detection in the context of end-to-end encrypted communications, demonstrating the swift development of technologies in this area.
The legislative proposal should remain technology-neutral also when it comes to possible solutions to the challenge of preventing and detecting online child sexual abuse. Under this option, the obligation to detect known CSAM would therefore be an obligation of results, meaning that detection has to be of sufficient overall effectiveness regardless of the technology deployed. For example, in a test sample where a specified percentage of material constitutes known CSAM, the detection tool should correctly identify a comparable amount of CSAM, in line with the state of the art in detection technology when it comes to accuracy. This is to be demonstrated by the service providers. The legislation would set out conditions for the technologies deployed and corresponding supervision powers for national authorities, without however specifying the technologies that must be put in place to enable detection, to ensure that the legislation remains proportionate, technology neutral and future proof. Service providers would be free to implement the technical solutions that are most compatible with their services and infrastructures, provided they meet the standards (see below for details on standards).
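To make concrete how ‘sufficient overall effectiveness’ might be demonstrated on such a labelled test sample, the short sketch below computes a detection rate and a false positive rate from test results. This is only one plausible way of expressing the metrics; the legislation as described here does not prescribe specific formulas, and the function name and inputs are hypothetical.

```python
def detection_metrics(results: list[tuple[bool, bool]]) -> dict[str, float]:
    """results: (is_known_csam, was_flagged) pairs from a labelled test sample."""
    tp = sum(1 for actual, flagged in results if actual and flagged)       # correctly detected
    fn = sum(1 for actual, flagged in results if actual and not flagged)   # missed
    fp = sum(1 for actual, flagged in results if not actual and flagged)   # wrongly flagged
    tn = sum(1 for actual, flagged in results if not actual and not flagged)
    return {
        "detection_rate": tp / (tp + fn) if (tp + fn) else 0.0,
        "false_positive_rate": fp / (fp + tn) if (fp + tn) else 0.0,
    }
```

A provider could then show, for example, that the detection rate on the sample is in line with the state of the art while the false positive rate remains very low.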
The obligation to detect regardless of the technology used in the online exchanges is necessary to ensure not only that the services that, following the risk assessment, should be detecting known CSAM, can do so in practice, but also to prevent creating a negative incentive to put in place certain technologies solely to avoid the detection obligations. It would therefore ensure that the legislation achieves its general objective of improving detection, reporting and removal of CSA online.
The obligation to detect regardless of the technology used in the online exchanges, together with all the required safeguards (see below), is also necessary to help ensure a fair balance of the affected fundamental rights.
Box 15: Detection of CSA online in end-to-end encrypted communications
End-to-end encryption (E2EE) is an important example of a technology that may be used in certain online exchanges. While beneficial in ensuring privacy and security of communications, encryption also creates secure spaces for perpetrators to hide their actions, such as trading images and videos, and approaching and grooming children without fear of detection. This hampers the ability to fight these crimes, lowers the protection of the fundamental rights of the child and therefore creates a risk of imbalance in the protection of all the fundamental rights at stake. Any solution to detect CSA needs to ensure a fair balance between:
·on the one hand, the fundamental rights of all users, such as privacy and personal data protection, the freedom to conduct a business of the providers, and
·on the other hand, the objective of general interest associated with tackling these very serious crimes and with protecting the fundamental rights of children at stake, such as the rights of the child, human dignity, prohibition of torture and inhuman or degrading treatment or punishment, and privacy and personal data protection.
The Commission organised in 2020 an expert process under the EU Internet Forum to answer the following question: given an E2EE electronic communication, are there any technical solutions that allow the detection of CSA content while maintaining the same or comparable benefits of encryption (e.g. privacy)? Annex 9 summarises the work of experts from academia, service providers, civil society organisations and governments, which finished at the end of 2020. The expert group mapped the possible solutions and highlighted the most promising ones following a technical assessment across five criteria: effectiveness, feasibility, privacy, security and transparency. In relation to the question asked, the expert group concluded at the time that such technical solutions did exist at different levels of development, but had not been deployed at scale yet.
In August 2021, Apple announced the launch of its new ‘Child Safety’ initiatives, including on-device detection of known CSAM. This solution, similar to two of the solutions identified by the expert group as the most promising, appears to be a viable and technically mature solution to detect known CSAM outside the context of electronic communications, and regardless of whether or not any electronic communication is encrypted. In September 2021, Apple announced that the deployment of this solution would be delayed to gather additional feedback from customers, advocacy groups, researchers and others before launching it, in view of criticism in particular from privacy advocacy groups. It has since deployed on-device analysis of incoming and outgoing images to detect images containing nudity sent or received by a child, providing a warning to children not to view or send them. When sending or receiving such images, children have the option to notify someone they trust and ask for help.
Meta’s WhatsApp, which is end-to-end encrypted, has also been deploying tools to identify CSAM on its messaging service, based on unencrypted data associated with the communication. However, Meta has also acknowledged the limitations of its current detection tools in public government hearings, indicating that it expects lower numbers of detection compared to unencrypted communications, and has referred far fewer cases to NCMEC compared to Meta’s Facebook Messenger.
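As a minimal sketch of the sequencing discussed in this box — matching against indicators on the device before the message is end-to-end encrypted, so that the encryption itself is left untouched — the fragment below uses placeholder functions. It is not any vendor’s actual implementation: the encryption stub and indicator set are assumptions for illustration only, and the matching step reuses the same indicator logic sketched after box 14.

```python
import hashlib

KNOWN_INDICATORS: set[str] = set()  # indicator hashes available on the device (hypothetical)


def matches_known_indicator(data: bytes) -> bool:
    # Stand-in for an on-device perceptual-hash comparison against the indicator set.
    return hashlib.sha256(data).hexdigest() in KNOWN_INDICATORS


def e2e_encrypt(data: bytes, recipient: str) -> bytes:
    # Placeholder: the client's existing end-to-end encryption would be used unchanged.
    return data


def send_attachment(data: bytes, recipient: str) -> bytes:
    """Check against indicators first, then encrypt and hand the ciphertext to the transport."""
    if matches_known_indicator(data):
        pass  # a real client would trigger the provider's reporting workflow here
    return e2e_encrypt(data, recipient)
```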
While companies would be free to decide which technology to deploy, the competent national authority would be empowered and required to supervise the technologies deployed. If needed, it could make use of the technical expertise of the EU Centre and/or independent experts to determine relevant technical or operational issues that may arise as part of the authority’s assessment of whether the technology that a given service provider intends to use meets the requirements of the legislation. In particular, the competent national authorities would take into account the availability of the technologies in their decision to impose a detection order, ensuring the effective application of the obligation to detect. Where the technology to detect CSA online is not yet available for deployment at scale, the legislation could allow the competent authorities to take this circumstance into account when deciding, on a case-by-case basis, the start date of application of the detection order. The EU Centre and the Commission could facilitate the exchange of best practices and cooperation among providers in the deployment of new technologies.
The legislation would specify the necessary safeguards to ensure proportionality and a fair balance between all the affected fundamental rights. In particular, as service providers put in place technical solutions that allow the detection of CSA online regardless of the technology used in the online exchanges, there is a need to regulate the deployment of these solutions, rather than leaving to the service providers the decision on what safeguards to put in place.
Service providers have strong incentives already to ensure that all tools they deploy are reliable and as accurate as possible, to limit false positives. In addition, safeguards are of particular importance to ensure the fair balance of fundamental rights in the context of interpersonal communications, where the level of interference with the relevant fundamental rights, such as those to privacy and personal data protection, is higher compared to e.g. public websites.
The legislation would set out three types of safeguards, on 1) what standards the technologies used must meet, 2) safeguards on how the technologies are deployed, and 3) EU Centre-related safeguards. They would, as far as possible, build on the detailed safeguards of the Interim Regulation, to ensure coherence and minimise disruption. These safeguards could include or be based on:
1) Standards the technologies must meet:
·be in accordance with the state of the art in the industry;
·be sufficiently reliable in that they limit to the maximum extent possible the rate of errors regarding the detection of CSA, subject to independent expert certification;
·be the least privacy-intrusive, including with regard to the principles of data protection by design and by default laid down in the GDPR;
·not be able to deduce the substance of the content of the communications but solely be able to detect patterns which point to possible CSA (i.e. only determine whether the content matches known CSAM, without assessing or extracting anything else);
·make use of the indicators provided by the EU Centre to detect known CSAM (see below on EU Centre-related safeguards);
2) How the technologies are deployed, i.e. when deploying these technologies the providers should:
·conduct a prior data protection impact assessment and a prior consultation procedure as referred to in the GDPR, to be repeated when the technologies are significantly modified;
·establish internal procedures to prevent abuse of, unauthorised access to, and unauthorised transfers of, personal and other data;
·ensure human oversight, where necessary. While the tools for detection of known CSAM are accurate to such a high degree that human review of each and every hit is not required, the oversight should encompass spot checks and tests to ensure the continued reliability and verify consistent accuracy rates;
·establish appropriate redress mechanisms to ensure that users can lodge complaints with them within a reasonable timeframe for the purpose of presenting their views;
·inform users in a clear, prominent and comprehensible way:
oof the fact that the service providers use technologies to detect known CSAM and how they use those technologies;
owhich consequences such use may have for the users and avenues for redress related thereto;
·retain the content data and related traffic data processed for the purpose of detecting known CSAM and its subsequent actions (reporting, removal and possible other consequences, redress, responding to competent law enforcement or judicial authorities’ requests) no longer than strictly necessary for those purposes, and no longer than the maximum period defined in the legislation;
·give competent authorities access to data, solely for supervisory purposes; and
·publish transparency reports on how the technologies used have been deployed, including operational indicators such as error rates (see section 9 on monitoring and evaluation).
3) EU Centre-related safeguards. The Centre would be a fundamental component of the legislation and will serve as a key safeguard by:
·making available to service providers the indicators that they should use to detect known CSAM according to EU rules (notably the CSA Directive), as determined by courts and other independent public authorities (see description of EU Centre under option B);
·reviewing the reports submitted by the companies and helping to ensure that the error rate stays at a minimum, in particular by making sure that reports submitted by mistake by service providers (i.e. reports that do not contain CSA online) are not forwarded to law enforcement, and by providing feedback to service providers on accuracy and potential false positives to enable continuous improvement;
·facilitating access to free-of-charge technology that meets the highest standards for the reliable, automated detection of CSA online;
·publishing annual transparency reports which could include the number and content of reports received, the outcome of the reports (i.e. whether law enforcement took action and if so, what was the outcome), and lists of service providers subject to detection orders, removal orders and sanctions (see section 9).
Given the key role of the Centre, the legislation should also include a set of safeguards to ensure its proper functioning. These could include:
·carrying out an independent and periodic expert auditing of the databases of indicators and its management thereof;
·carrying out independent expert verification or certification of tools to detect, report and remove CSA online that the Centre would make available to service providers;
·creating clear and specific legal bases for the processing of personal data, including sensitive personal data, necessary for the performance of the Centre’s functions, with the appropriate limitations and safeguards;
In addition, as a decentralised EU agency, the Centre would be subject to all corresponding transparency and accountability obligations that generally apply to such agencies, including supervision by the EU institutions.
Stakeholders’ views on safeguards from the open public consultation
Public authorities indicated that it is critical to implement robust technical and procedural safeguards in order to ensure transparency and accountability as regards the actions of service providers.
NGOs pointed out that the new legislation should provide legal certainty for all stakeholders (e.g. service providers, law enforcement and child protection organisations) involved in the fight against CSA online and improve transparency and accountability. Almost 75% of views from NGOs underlined that transparency reports should be obligatory and standardized in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as of the scale of CSA online. Legislation could foster the development of an EU-wide classification of CSAM.
Business associations highlighted that it is critical to publish aggregated statistics on the number and types of reports of CSA online received in order to ensure transparency and accountability regarding actions of service providers (40% of their replies). Moreover, some respondents (including companies and business associations) reflected that fully harmonised definitions (beyond the minimum harmonisation provided by the CSA directive) would help reduce EU fragmentation.
Academic and research institutions also stated that transparency reports should be obligatory, and evaluated by an independent entity (75% of their replies). All of them stated that these reports need to be standardized in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online.
5.2.4. Option D: option C + mandatory detection of new CSAM
This option is the same as option C but adding mandatory detection of material that has not been previously verified as CSAM (i.e. ‘new’, as opposed to ‘known’, CSAM). As described in section 2.1.1., the detection of new content (i.e. not previously identified as CSAM) often reveals ongoing or recent abuse and therefore implies a heightened need to act as soon as possible to rescue the victim.
As in option C, to ensure that the legislation is technology neutral, the obligation would apply regardless of the technology used in the online exchanges.
The detection of grooming would remain voluntary, whereas reporting and removal of confirmed CSA would be mandatory for all types of CSA online, as described in option B.
Mandatory risk assessment
Expanding the risk assessment outlined in Option C, providers of relevant services, notably providers of interpersonal communication and hosting services, would be required to also assess the risk that their services are misused to distribute new CSAM. As there is no difference between “known” and “new” CSAM beyond the fact that the former has already been seen and confirmed by an authority, the distribution vectors are typically identical. Hence, risks and experiences relating to the detection of known CSAM could be taken into account in this regard. However, the risk factors would also take into account the specificities of new CSAM, and in particular the risk that the service is used to distribute self-generated material (see box 3 in the problem definition, section 2.1.1.). For interpersonal communications services, the risk assessment should also include an analysis of objective factors that may point to a heightened likelihood of sharing of CSAM, which could possibly include group size, gender distribution, frequency of exchange and frequency and volume of images and videos shared. In addition, the risk assessment could be based, e.g., on spot checks, particularly in the absence of previous experience on the same or comparable services.
The service providers would be required to report to the competent national authority on the risk assessment, including the mitigating measures that they plan to adopt or have already adopted, and the same considerations as in option C would apply.
Detection order
Similarly to option C, on the basis of this risk assessment, the competent national authority would decide whether a detection order for new CSAM should be issued to a service provider, for one or more relevant services it provides. The order should be limited to the strictly necessary; where possible and technically feasible, particularly for interpersonal communications services based e.g. on the objective factors identified in the risk assessment, it should be limited to relevant parts of a given service. The detection order would be limited in time and renewable based on an updated risk assessment. Suitable redress for affected service providers would be provided for.
Support by the EU Centre
The EU Centre would support service providers in three ways:
1)By making available to providers the database of indicators of new material (e.g. AI classifiers) that providers would be required to use to detect new CSAM, while ensuring a technology neutral approach. The indicators would be based on material determined by courts or other independent public authorities as illegal under EU law.
2)By making available to providers, free-of-charge, technologies to facilitate detection. Providers would not be mandated to use the technologies provided by the Centre and would be able to use other tools, as long as they meet the standards and provide for the safeguards specified in the legislation (see below).
3)By reviewing the reports submitted by service providers to ensure accurate reporting to law enforcement, and providing support, including through feedback on accuracy, to prevent imposing excessive obligations on the providers and in particular to avoid imposing the obligation to carry out an in-depth assessment of the illegality of the content detected, which can be relevant in particular in borderline cases. If possible CSAM is detected by the EU Centre, it will be added to the database of indicators of known CSAM only after public authorities have confirmed the illegality of the content. It could then also be used to improve the database of new CSAM indicators.
The support of the Centre would be particularly useful to SMEs, which would also be subject to the above requirements and could thus also receive a detection order from national authorities. The Centre and the Commission would provide additional support to SMEs in the form of guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations. It may also be possible to provide specific training, in collaboration with Europol and the national authorities.
Box 16: technology to detect new CSAM
New CSAM often depicts ongoing abuse and therefore implies an urgency to act swiftly to rescue the child. Given the importance of this material, making its detection mandatory would ensure that more of it is detected and therefore more victims can be swiftly safeguarded.
The detection of ‘new’ content, as compared to that of known content through hashes, typically relies on an algorithm which uses indicators to rank the similarity of an image to images already reliably identified and hence identify the likelihood of an image or video constituting CSAM. While the patterns that the AI algorithm is trained to identify cannot be equated one to one to known material, they are similarly designed to identify equivalent content. The reliability of such tools, as with any algorithm, depends on the specificity of the content and availability of quality training data, i.e. content already reliably identified as CSAM. Given the large volumes of “known” CSAM, automated identification of new CSAM has had a good basis for development and would be rendered more effective through the continuous expansion of the database of known CSAM confirmed by independent authorities. In addition, as opposed to situations where context is of relevance and needs to be analysed (e.g. a slanderous expression reported on in a press article), the dissemination of CSAM is always illegal regardless of context. As a result, the challenge for automated detection is significantly lower in detecting what is often termed “manifestly illegal” content, compared to performing context-dependent assessments.
It is important to note that the process is similar to that for the detection of known CSAM, in that the classifiers are not able to deduce the substance of the content of the communications but are solely able to detect patterns which point to possible CSAM. In other words, they are solely able to answer the question “is this content likely to be CSAM?”, yes or no, and they are not able to extract any other information from the content, such as identifying specific persons or locations (i.e. they ignore all other content information transmitted).
The detection of new content is in general more complex than the detection of known content. Due to the nature of new material, after it is flagged by software it requires systematic human review to ascertain its potential illegality. The accuracy rate nonetheless lies significantly above 90% (see annex 8, section 2 for an industry example in which the rate can be set at 99.9%, meaning that only 0.1% of the content automatically flagged is not illegal). Annex 8 section 2 contains additional information on new CSAM detection technology.
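As an illustration of the decision flow described in this box, the following sketch scores content with a stand-in classifier function and queues flagged items for the systematic human review mentioned above. The threshold value, the dataclass and the scoring interface are assumptions made for illustration, not a description of any specific tool.

```python
from dataclasses import dataclass
from typing import Callable

FLAG_THRESHOLD = 0.99  # illustrative only; real thresholds are tuned to keep false positives low


@dataclass
class Detection:
    item_id: str
    score: float
    flagged: bool


def classify(item_id: str, content: bytes, score_fn: Callable[[bytes], float]) -> Detection:
    """score_fn stands in for a trained classifier: it returns only a likelihood that the
    content is CSAM and extracts no other information (no persons, no locations)."""
    score = score_fn(content)
    return Detection(item_id, score, score >= FLAG_THRESHOLD)


def queue_for_human_review(detections: list[Detection]) -> list[Detection]:
    """Only flagged items are passed on, and each is reviewed by a person before any report."""
    return [d for d in detections if d.flagged]
```

Keeping the review queue separate from the classifier makes the human-oversight step explicit, mirroring the safeguard that automated flags alone do not trigger a report.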
Conditions and safeguards
As in option C, the obligation to detect new CSAM would apply regardless of the technology deployed in the online exchanges, and as an obligation of results, to ensure that the legislation remains technology neutral and as future proof as possible.
Also, as in option C, the competent national authorities, on the basis of the risk assessment conducted by the service provider (including mitigating measures adopted), and, if needed, in consultation with the EU Centre and its technical experts on the technologies deployed, would determine whether a detection order should be issued to a given service provider. They would remain competent to verify the compliance with conditions and safeguards and to supervise the tools deployed, in cooperation with data protection authorities and the EU Centre’s technical experts, where appropriate.
The legislation would specify the necessary safeguards to ensure a fair balance between all the affected fundamental rights. The safeguards could include all those described in option C, extended to new CSAM, on 1) the technologies used, 2) how they are deployed, and 3) EU Centre-related safeguards. Given the high but comparatively lower accuracy rates that detection tools for new content can have, the tools should be deployed in such a manner as to limit the number of false positives to the extent possible. The final determination of whether an image or video constitutes CSAM has to be made by a court or independent national authority. In addition, the material used to prepare and improve the indicators (AI classifiers) made available by the EU Centre could be subject to periodic expert auditing to ensure the quality of the data used to train algorithms.
5.2.5. Option E: option D + mandatory detection of grooming
This option includes the policy measures of option D and adds mandatory detection of grooming for certain providers of interpersonal communications services, as these are the key vectors for online grooming. It would therefore comprise the mandatory detection of the three main forms of CSA online: known CSAM, new CSAM and ‘grooming’ (solicitation of children), limited in each case to the service providers relevant for that type of content. The relevant providers differ for grooming: while CSAM can be shared in various ways, such as by message, by sharing links to image hosts or by other means, grooming requires a direct communication channel between the offender and the child. Whereas known and new CSAM depict crime scenes of abuse already committed, grooming can indicate abuse that is ongoing and/or about to happen and which therefore could be prevented or stopped, protecting the child from harm.
As in options C and D, to ensure that the legislation is technology neutral, the obligation would apply regardless of the technology used in the online exchanges. Reporting and removal (upon the reception of a removal order) would be mandatory for all types of CSA online, as described in option B.
The services in scope in options C, D and E could be:
·for the risk assessment, reporting and removal obligations: relevant providers that provide or facilitate access to services enabling the dissemination of CSAM and grooming;
·for the obligations to detect known and new CSAM: a narrower category of relevant service providers, in particular providers of hosting and interpersonal communication services;
·for the obligation to detect grooming: interpersonal communications services.
Mandatory risk assessment
Expanding the risk assessment outlined in options C and D, relevant service providers would be required to also assess the risk that their services are misused for grooming. Subject to further assessment, the risk factors to consider specific to grooming could include the following (a purely illustrative sketch of how such factors might be aggregated is provided after the list):
·the user base, including whether the service is available directly to end users (as opposed to, e.g., providing services to businesses);
·the verification of user identity in the registration process;
·whether the services are likely to be accessed by children or whether children otherwise make up a significant proportion of a service’s user base;
·the existence of functionalities of the service enabling adults to search for other users of the service (including children), e.g. if the profiles are searchable by default to all users;
·the existence of functionalities of the service enabling adults to contact other users (including children), in particular via private communications, e.g. if private messaging is enabled by default to all users and if private messaging is an integral part of the service;
·whether the services enable sharing images and videos via private communications for all users;
·whether robust age verification measures are in place (in particular to prevent adults from pretending to be children);
·whether the service offers grooming reporting tools that are effective, easily accessible and age appropriate;
·past experience with grooming on the same or a comparable service.
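The sketch below (Python, purely illustrative) shows one way the risk factors listed above could be aggregated into a simple score to inform the provider’s risk assessment. The factor names and weights are invented for this sketch; the legislation would not prescribe any particular scoring model.

# Hypothetical aggregation of the grooming-related risk factors listed above.
# Weights and factor names are invented for illustration only.

GROOMING_RISK_FACTORS = {
    "available_directly_to_end_users": 1,
    "no_identity_verification_at_registration": 1,
    "likely_accessed_by_children": 2,
    "adults_can_search_for_other_users": 2,
    "adults_can_contact_users_via_private_messages": 2,
    "image_and_video_sharing_in_private_messages": 1,
    "no_robust_age_verification": 2,
    "no_effective_child_friendly_reporting_tools": 1,
    "past_grooming_incidents_on_service": 3,
}

def grooming_risk_score(service_profile: dict) -> int:
    """Sum the weights of the factors that apply to a given service."""
    return sum(weight for factor, weight in GROOMING_RISK_FACTORS.items()
               if service_profile.get(factor, False))

# Example: a messaging service open to minors and without age verification
# scores higher and would call for stronger mitigating measures.
example_service = {
    "available_directly_to_end_users": True,
    "likely_accessed_by_children": True,
    "adults_can_contact_users_via_private_messages": True,
    "no_robust_age_verification": True,
}
print(grooming_risk_score(example_service))  # 7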
The service providers would then be required to report to the competent national authority the risk assessment, including any mitigating measures that they plan to adopt or have already adopted.
Detection order
Similarly to options C and D, on the basis of this risk assessment, the competent national authority would decide whether a detection order for grooming should be issued to a service provider, for one or more of its services. Where it is possible based on the risk assessment and technically feasible to limit the detection to a part of the service, the order should be limited to what is strictly necessary: for example, to perform detection only in one-on-one exchanges as opposed to groups. This detection order would also be limited in time and renewable based on an updated risk assessment. Suitable redress for affected service providers would be provided for.
Support by the EU Centre
The EU Centre would support service providers in three ways:
1)By making available to providers the database of indicators of grooming (e.g. AI classifiers) that providers would be required to use to detect grooming, while ensuring a technology neutral approach. The indicators would be based on grooming cases determined by courts or other independent public authorities.
2)By making available to providers, free-of-charge, technologies to facilitate detection. Providers would not be mandated to use the technologies provided by the Centre and would be able to use other tools, as long as they meet the requirements and provide for the safeguards specified in the legislation (see below).
3)By reviewing the reports submitted by service providers to ensure accurate reporting to law enforcement, and providing support, including through feedback on accuracy, to prevent imposing excessive obligations on the providers and in particular to avoid imposing the obligation to carry out an independent assessment of the illegality of the content detected. If possible grooming is detected by the EU Centre, it could be used to improve the database of grooming indicators, after public authorities have confirmed the illegality of the content.
The above three-way support of the Centre would be particularly useful to SMEs, which would also be subject to the above requirements and could thus also receive a detection order from national authorities. The Centre and the Commission would provide additional support to SMEs in the form of guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations. It may also be possible to provide specific training, in collaboration with Europol and the national authorities.
Box 17: technology to detect grooming
The detection of grooming, as compared to that of known content through hashes, typically relies on an algorithm which uses content indicators (e.g. keywords in the conversation) and metadata (e.g. to determine the age difference and the likely involvement of a child in the communication) to rank the similarity of an online exchange to online exchanges reliably identified as grooming, and hence to determine the likelihood that an online exchange constitutes grooming. The classifiers are not able to deduce the substance of the content of the communications but are solely able to detect patterns which point to possible grooming. In other words, they are solely able to answer the question “is this online exchange likely to be grooming?”, yes or no, and they are not able to extract any other information from the content, such as identifying specific persons or locations (i.e. they ignore all other content information transmitted).
The accuracy rate lies around 90%, which means that 10% of the content automatically flagged for human review is determined by the reviewers to be not illegal. The detection of grooming is therefore also based on AI patterns/classifiers, like the detection of new CSAM, and is in general more complex than the detection of known CSAM. Due to the nature of grooming, after it is flagged by software it requires systematic human review to ascertain its potential illegality. In addition, the tools are constantly fed with data to continuously improve the detection process. Annex 8, section 3 contains additional information on grooming detection technology.
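As a purely illustrative sketch (Python), the following shows how a text-based classifier score might be combined with the metadata signals mentioned above (one-on-one setting, age difference) before an exchange is queued for human review. The function, the signals used and the threshold are assumptions of this sketch and do not describe any particular deployed tool.

# Illustrative only: combining a text classifier score with metadata signals.
# The classifier answers a single yes/no question and extracts nothing else.

def flag_for_review(text_score: float,
                    is_one_on_one: bool,
                    adult_minor_age_gap: bool,
                    threshold: float = 0.9) -> bool:
    """Return True only if all risk conditions are met simultaneously."""
    return is_one_on_one and adult_minor_age_gap and text_score >= threshold

# Example: a one-on-one exchange between an adult and a minor scoring 0.95 is
# queued for human review; the same score in a group setting is not.
print(flag_for_review(0.95, is_one_on_one=True, adult_minor_age_gap=True))   # True
print(flag_for_review(0.95, is_one_on_one=False, adult_minor_age_gap=True))  # False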
Despite the increase in grooming (see section 2.1.1.) and the value of grooming detection in stopping ongoing abuse and preventing imminent abuse, only one third of service providers that detect any form of CSA online detect grooming.
Conditions and safeguards
As in options C and D, the obligation to detect grooming would apply regardless of the technology deployed in the online exchanges, and as an obligation of results, to ensure that the legislation remains technology neutral and as future proof as possible.
As in options C and D, the competent national authorities would be given the necessary competences for effective oversight to determine whether conditions and safeguards are respected, also in terms of the deployment of technologies.
The legislation would specify the necessary safeguards to ensure proportionality and a fair balance between all the affected fundamental rights. The safeguards could include all those described in option C extended to grooming, on 1) the technologies used, 2) how they are deployed, and 3) EU Centre-related safeguards. In addition,
·the material used to prepare and improve the grooming indicators (AI classifiers) made available by the EU Centre could be subject to periodic expert auditing to ensure the quality of the data used to train algorithms;
·the service provider could be obliged to report back to the competent data protection authority on the measures taken to comply with any written advice issued by the competent supervisory authority for technologies to detect grooming, following and in addition to the prior data protection impact assessment and consultation;
·the technologies used to detect grooming should be limited to the use of relevant key indicators and objectively identified risk factors such as one-on-one conversations (as grooming very rarely takes place in a group setting), age difference and the likely involvement of a child in the scanned communication.
Stakeholders’ views on mandatory detection from the open public consultation
Public authorities indicated that mandatory detection of known (71% of responses) and new CSAM (57%), and grooming (48%) should be covered by the possible legislation.
Child rights NGOs were in favour of mandatory detection and removal of known (78% of responses) and new CSAM (61%), and grooming (51%).
Privacy rights organisations opposed any mandatory detection measures and stressed the need to respect the requirements of necessity and proportionality to ensure the respect of fundamental rights of users, also with regard to privacy and confidentiality.
Service providers expressed little support for imposing legal obligations to detect known CSAM (12.5% of responses), new CSAM (6%) and grooming (6%). They flagged that, if there are any obligations, they should be formulated in terms of best reasonable efforts at the current state of technology, be in line with other EU legislation (e.g. e-commerce directive and DSA), and should not impose an excessive burden on SMEs. They raised questions of conflict of laws between the US and the EU emerging from detection and reporting obligations.
Individuals that responded to the open public consultation also expressed little support for imposing legal obligations for service providers to detect known CSAM (20% of responses), new CSAM (14%) and grooming (13%). At the same time, there was general support for a possible role of the EU Centre in managing a single EU database of known CSAM to facilitate detection.
Box 18: YouGov survey on citizens’ views on online child protection and privacy
A recent survey carried out in eight Member States (DE, FR, IT, NL, PL, SE, ES, HU) in September 2021 in which nearly 9 500 adults participated found that:
·A majority (73%) of respondents believed that children are not safe online.
·Nearly 70% of respondents said they would support a European law to mandate online platforms to detect and report CSAM images and grooming, with technology scanning their photos and messages, even though this means giving up certain personal privacy.
·A majority of respondents (76%) considered detection of CSA online to be as or more important than people’s personal privacy online.
·Most respondents in the qualitative research groups did not know that hash detection tools to address online CSAM existed or that anti-grooming tools had been developed. Once participants learnt about these tools, “they were angry that they weren’t being used and turned on at all times”. Participants in these groups held to this view even when they were told that their data could be scanned to achieve this.
·A majority of respondents (68%) felt that there is not much, if any, privacy online, compared to 25% of respondents who believed that there is.
5.3. Measures discarded at an early stage
The process of building the retained options started with scoping the widest spectrum of measures and discarding a number of them along the way, which included notably:
·Indefinite continuation of the Interim Regulation, i.e. extending indefinitely the current period of application of three years. This measure was discarded because it would not address in a satisfactory way the problem drivers, in particular problem driver 1, concerning the insufficient voluntary action by online service providers, and problem driver 2, on the lack of legal certainty (the Interim Regulation does not establish a legal basis for any processing of personal data). Also, the Interim Regulation only covers a subset of the service providers whose services are affected by CSA online. The possible combination of this measure with other options (including the practical measures in option A) would not be able to address these fundamental shortcomings.
·Obligations to detect CSA online (known and/or new CSAM, and/or grooming) limited to technologies that currently make possible such detection (e.g. unencrypted services). These measures were discarded because the legislation would not be effective in achieving the general objective of improving the functioning of the internal market by introducing harmonised EU rules for improving identification, protection and support for victims of CSA. Moreover, rather than improving the fight against CSA online, these measures could worsen it, by unintentionally creating an incentive for certain providers to use technologies in their services to avoid the new legal obligations, without taking effective measures to protect children on their services and to stem the dissemination of CSAM.
Annex 10 contains a further analysis of discarded options for the Centre.
6.What are the impacts of the policy options?
6.1. Qualitative assessment
The qualitative assessment of the policy measures (which form the policy options) is available in annex 4, section 1. This section focuses on the qualitative assessment of the policy options retained for analysis. It analyses the most relevant impacts, i.e. social, economic and fundamental rights impacts, in addition to those related to the UN SDGs. The consistency of the options with climate law, the ‘do no significant harm’ principle and the ‘digital-by-default’ principle was taken into account throughout the assessment where relevant.
6.1.1. Social impact
All proposed measures except the baseline scenario would improve, to differing degrees, the protection of online users, particularly the young and vulnerable, and enhance the ability of authorities to prevent and respond to cases of online CSA.
6.1.1.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
The practical measures to enhance voluntary detection, removal and reporting of online CSA would improve the prevalence and effectiveness of voluntary measures to some extent, and would increase the number of related reports and investigations. The measures would also likely improve the efficiency and quality of reporting from service providers to law enforcement authorities, and allow more efficient use of resources by both. Uncertainty as to the legal basis for the necessary processing of personal data would remain, leading to fragmented efforts.
Establishing an EU Centre that could perform certain tasks relating to prevention and assistance to victims would help facilitate coordination and the implementation of practical measures in these areas. While these measures would to some extent improve efficiency in public-private cooperation, a number of difficulties would remain, in particular regarding a reliable source of hashes, a single European reporting point, accountability and transparency regarding providers’ efforts, and the need for clear and comprehensive information on the prevalence of CSA online.
Finally, this option would likely not be sufficient to provide effective assistance to victims of CSA or to prevent CSA. While the practical measures included here may facilitate dialogue and exchange of information, they would not be sufficient to support the implementation of a holistic, evidence-based approach. The Centre’s impact would be limited, as it would be supported by minimal resources and the support it could offer would be restricted. In particular in view of the significant impact of providers’ efforts on the wellbeing of children and the rights of all users, the resulting continuation of a patchwork approach would fall short of the objectives.
Therefore, this option would not fully address the problem drivers.
6.1.1.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal
This option would specify the conditions for service providers’ voluntary detection, reporting and removal of online CSA, eliminating key obstacles to voluntary efforts by providing legal certainty. This would allow services within the scope of the ePrivacy Directive (and its proposed revision) to adopt or continue voluntary efforts, following the lapsing of the Interim Regulation in 2024, as well as other relevant services. The reporting obligation would ensure both swift investigations to identify offenders and, where possible, identify and rescue victims, as well as independent verification of the illegality of the content.
The removal obligation would help ensure that service providers that have become aware of the existence of CSAM in their services take it down swiftly. This would limit revictimisation and would contribute to prevention efforts, given the effect that viewing CSAM has on increasing the probability of future offending (see box 1).
These obligations would also help create a level playing field for relevant providers active in the EU, as they would all need to comply with one framework for the detection, reporting and removal obligations.
The creation of EU-level databases of indicators of CSA online would facilitate service providers' determination of what constitutes CSA online under EU law. By maintaining a single, reliable database in the EU of indicators to facilitate detection of CSA online in companies’ systems, the Centre would lead to significant improvements in the relevance of reports received by EU law enforcement authorities, reducing the number of reports of materials that do not constitute CSA online under the laws of the relevant Member State, and further eliminating erroneous removals. An increase in the volume of reports can be expected with the introduction of mandatory reporting and the creation of an EU database. Importantly, the database and the support provided by the EU Centre can be expected to contribute to an improved quality of reports. This in turn can be expected to result in greater numbers of victims rescued and of perpetrators identified, prosecuted and convicted. The consequential deterrence effects can support the prevention of future offending. The Centre would also act as a central point for reporting in the EU, supporting both service providers and hotlines, reducing the reliance on reports from third country organisations, and improving the ability of relevant authorities to respond to cases of online CSA also in particular across jurisdictions.
In addition, the Centre could facilitate, directly and in cooperation with hotlines, the removal of CSAM relating to a victim, at the request of a victim, by conducting searches and by notifying providers of content requesting it to be removed. In addition, the creation of a dedicated EU Centre would send an important message about the dedication of the EU as a whole to combating child sexual abuse more effectively and to ensuring that rules apply online as they do offline. It would place the EU at one level with those leading the fight against child sexual abuse worldwide, and would reduce dependence on third-country entities, both for operational reports and for strategic and horizontal information about threats and trends, areas where the EU and its Member States to date have very limited visibility. The social impact of the creation of an EU Centre to prevent and counter child sexual abuse is described further in annex 10, sections 4-6.
However, there are also some drawbacks to this option from the perspective of social impacts. As described in Section 2, experience has shown that service providers’ voluntary action by itself has been insufficient. Only 12% of service providers responding to the open public consultation on the DSA reported that they used automated systems to detect illegal content they host. This is reflected in the annual reports provided by NCMEC, which show that only a small percentage of providers registered to make reports to NCMEC have done so, that many of those who do make reports make very few of them, and that tools for the detection of CSA online are not widely used. Therefore, beyond ensuring that voluntary measures in interpersonal communications services can continue after the Interim Regulation expires, clarification of the legal basis is unlikely to lead to a significant increase in the use of voluntary measures.
Therefore, while option B would have a greater impact than option A through greater support for detection, reporting and removal efforts, it still would not fully address the problem drivers.
6.1.1.3. Option C: option B + mandatory detection of known CSAM
This option differs from Option B in two important respects when it comes to its social impact: first, it would introduce an obligation to detect known CSAM and, second, it would do so regardless of which technology is in use in the online exchanges.
The additional benefits of this option compared to Option B would be to ensure that the detection of known CSAM would no longer be dependent only on the voluntary action of providers. Detection would be focused on specific items of CSAM which have earlier been found to be illegal in an independent, reliable, specific and objective manner. The detection would also be case-specific and limited in time, whilst assistance, safeguards and independent oversight would be provided for. Together with the aim of tackling particularly serious crimes, this all contributes to the conclusion that the obligation is in line with the prohibition on imposing general monitoring obligations. This option would also ensure that detection of known CSAM is performed regardless of the technology used. This would create a level playing field for relevant service providers, counteracting fragmentation, and would hence have a positive effect on the realisation of the Single Market, building on the baseline harmonisation that the DSA is expected to provide.
In terms of the protection of children against the circulation of materials depicting their abuse, the obligation to detect is expected to have a positive impact. Over time, the overall number of images and videos depicting CSA available on services within scope should be reduced significantly, and, with it, the instances of secondary victimisation inherent in the continued viewing of the abuse. At the same time, it should entail a significant increase in the number of relevant service providers participating, in the volume of detection and reporting, and hence in the proportion of overall cases investigated and number of children identified and removed from abusive situations.
This would also have a positive impact on the overall confidence of users in services, as their exposure to CSAM would also be reduced. This positive impact would extend also to society’s expectation that services do not facilitate the sharing of CSAM. While the targeting of specific services would possibly somewhat reduce the overall effectiveness of the obligation, which could be greater if more services were included in scope, this can be justified in light of the greater impact that such targeted detection might have.
For the detection of known content, the availability of reliable indicators of what constitutes CSAM under EU law and of free-of-charge technologies facilitating automatic detection would support service providers in their identification of relevant content and help ensure proportionality of requirements. Known CSAM is the most common type of child sexual abuse online. The tools to detect it (see annex 8, section 1) have a high accuracy rate and have been reliably used for over a decade. The obligation to detect known material would level the playing field and ensure the detection of that content where it is currently missing, with all the necessary safeguards. The EU Centre would make available the database of indicators of known material (e.g. hashes, URLs) that providers should use. The detection obligation might also encompass materials that victims have referred for detection and removal, or materials from concluded law enforcement investigations that have been verified as CSAM by public authorities.
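As a simplified illustration of indicator-based matching, the sketch below (Python) checks an uploaded file against a set of hash values. Deployed tools rely on perceptual hashing (e.g. PhotoDNA), which tolerates minor alterations of an image; a cryptographic hash and a placeholder database are used here only to keep the sketch short.

# Simplified sketch of matching uploads against a database of indicators of
# known CSAM. Real systems use perceptual hashes; SHA-256 is used here only
# for brevity, and the database value below is a meaningless placeholder.

import hashlib

known_indicator_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # placeholder
}

def matches_known_indicator(file_bytes: bytes) -> bool:
    """Hash the uploaded file and look it up in the indicator database."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in known_indicator_hashes

# Example: only uploads whose hash appears in the database would be flagged
# (and then reported subject to the applicable safeguards); all other
# uploads are left untouched.
print(matches_known_indicator(b"example upload"))  # False for this placeholder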
As a downside, such an obligation could result in occasional false positives, that is, in images and videos erroneously identified as CSAM. Given the gravity of an allegation of being involved in CSA, reporting could have a negative impact in the case of false positives and needs to be accompanied by safeguards ensuring that false positives are prevented as much as possible and that, where they occur, all data generated in relation to the false positives are erased, other than what is required for the improvement of automatic detection tools. Therefore, the Centre could provide an independent verification of the illegality of the content, eliminating manifestly unfounded reports, before forwarding reports that are not manifestly unfounded to Europol and national law enforcement authorities for action. Those authorities would, in addition, naturally still carry out their own assessments to determine whether further action is necessary and appropriate in each individual case.
Given the impact on fundamental rights of all users, additional strict safeguards would apply, building on and going beyond those set out above for voluntary detection and for the reliability of the database of indicators. These could include independent expert auditing of the database of indicators and regular supervision and verification of the procedures of the Centre (with the involvement of data protection authorities as needed), independent expert certification of tools for automated detection to ensure accuracy, as well as additional transparency and accountability measures such as regular reporting. The legislation could also set out information rights of users and mechanisms for complaints and legal redress (see section 5.2.3.).
The application of an obligation regardless of the technology used in the online exchanges (including encryption) would ensure a level playing field regardless of service providers’ choice of technology and would likely significantly increase the effectiveness of the obligation. On the other hand, it could potentially limit the effective exercise of users’ right to privacy when it comes to the content of their communication and increase the burden on service providers, as detection currently remains more challenging in E2EE communications. It is therefore only in light of the particularly egregious nature of CSA that such an obligation can be considered. This option would need to take into account the requirement of ensuring that the benefits of encryption for the privacy of all users are not compromised in the process of protecting children and identifying offenders. Technical solutions would therefore need to be carefully considered and tailored to balance these objectives. The obligation to detect would apply following a decision by the competent national authorities on a case-by-case basis, following the analysis of a risk assessment submitted by the service provider and taking into account technical feasibility.
The uniform application by all relevant online service providers to detect, report and remove known CSAM, regardless of the technology used in the online exchanges, would, over time, significantly affect the availability of CSAM on services falling within the scope of the initiative. It would decrease the blind spot caused by perpetrators’ use of certain technologies to share CSAM and abuse and exploit child victims. This would make private communications safer for children and help ensure that evidence of CSA can be found, leading to the identification of child victims.
6.1.1.4. Option D: option C + mandatory detection of new CSAM
The impacts of this option would be the same as option C, plus those of establishing a legal obligation for mandatory detection of new CSAM regardless of the technology used in the online exchanges.
The basic rationale for treating previously identified (i.e. known) and new CSAM the same is that both concern the same types of content, the difference being that the former has been independently confirmed as constituting illegal material under EU law whereas for the latter this has not (yet) occurred.
The additional challenge lies in the fact that detection of new CSAM relies on a different technology, which does not use hashes or URLs for individual images and videos but rather relies on pattern recognition, as set out in annex 8, section 2. The reliability and efficacy of such technologies is quite advanced, ensuring error rates in the low percentages, yet the burden on relevant service providers in ensuring the accuracy of efforts is significantly higher and would require an additional degree of human oversight and human confirmation of suspected CSAM.
Whereas the proportion of materials currently flagged as suspected new CSAM is significantly lower than that of known CSAM, new CSAM requires systematic human verification. The additional burden would need to be proportionate and compatible with the prohibition of general monitoring and active fact-finding as well as the need to strike a fair balance between the relevant fundamental rights at stake.
Such a balance may be supported by important objectives with respect to the interest of the child that would not otherwise be accomplished. Whereas the detection of known material reduces the re-victimisation of the child depicted in those images and videos and, at times, the investigation initiated with such a report may lead to uncovering ongoing abuses, this material depicts past abuse, which in some cases may be years old. By its nature, previously undetected CSAM usually depicts more recent and at times still ongoing abuse, provides particularly valuable leads, and is therefore treated as highest priority by law enforcement. The added value of detecting new CSAM in terms of the ability to identify and rescue children is significant. The positive social impact on children’s welfare consequently is significantly higher than in the case of detection of known content alone.
The prompt detection of new material also allows for prevention of its distribution, and the possibility of it ‘going viral’ in circles of abusers, by adding it to the databases of known material that feed the automated detection tools. The subsequent detection based on the comparison with these databases can also provide important information about the way in which CSAM is disseminated online and the circles of abusers, facilitating detection and effective action against such groups, which would have a significantly positive social impact of tackling the problem closer to its roots.
The application of an obligation to detect new CSAM regardless of the technology used in the online exchanges carries similar considerations as those laid out under Option C. It would ensure that obligations are applicable to all service providers regardless of choice of technology, which is likely to make the obligation to detect new CSAM more effective. In particular, any solution used in this context would have to ensure both the benefits that encryption provides for the privacy of all users and the protection of the fundamental rights of children. Solutions would need to be carefully considered and tailored to balance these objectives. This obligation is likely to increase the burden on service providers to deploy technical solutions that detect new CSAM in E2EE communications, including similar types of administrative burdens as for the detection of new CSAM in unencrypted communications, to ensure accuracy and mitigate error rates, including through human review.
Similarly to Option C, uniform application by all relevant online service providers to detect, report and remove new CSAM, regardless of the technology used in the online exchanges, would, over time, significantly affect availability of CSAM on services falling within the scope of the initiative.
6.1.1.5. Option E: option D + mandatory detection of grooming
The social impacts of this option would be the same as option D, plus those of establishing a legal obligation on relevant service providers for mandatory detection of grooming regardless of the technology used in the online exchanges.
Whereas the current number of reports of suspected grooming is significantly lower than that of CSAM, in particular known CSAM, grooming requires systematic human verification. The additional burden would need to be proportionate and compatible with the prohibition of general monitoring and active fact-finding as well as the need to strike a fair balance between the relevant fundamental rights at stake.
Such a balance may be supported by important objectives with respect to the interest of the child that would not otherwise be accomplished. Whereas the detection of known material reduces the re-victimisation of the child depicted in those images and videos and, at times, the investigation initiated with such a report may lead to uncovering ongoing abuses, this material depicts past abuse, which in some cases may be years old. In contrast, the identification and stopping of grooming is a measure that can serve to protect children from falling victim to imminent abuse, or to stop ongoing abuse. This is of particular relevance in the current situation in the pandemic, where children have been exposed to a significantly higher degree of unwanted approaches online including grooming. The positive social impact on children’s welfare consequently is significantly higher than in the case of detection of CSAM alone.
The detection of grooming typically relies on tools for automatic text analysis, which are trained on verified grooming conversations and assess a given exchange according to risk factors identified on the basis of the verified grooming cases. Such tools are at the moment slightly lower in accuracy than tools for the automatic detection of known or new CSAM (see box 16 in section 5.2.4.) and would therefore require additional conditions and safeguards to avoid reports of false positives. The comparably higher invasiveness of text analysis tools and lower accuracy rate therefore has to be weighed against the interest in more effective protection of the child, particularly in calibrating the tool to avoid false positives at the expense of increasing the number of false negatives. In addition, where detection can be limited to parts of a service, determined on the basis of objective factors, this further contributes to ensuring the appropriate balance.
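The calibration trade-off described above can be made concrete with a small, purely synthetic example (Python): raising the decision threshold removes false positives (innocent exchanges flagged) at the price of more false negatives (grooming exchanges missed). The scores and labels below are invented and do not reflect the performance of any real tool.

# Synthetic illustration of threshold calibration: fewer false positives,
# more false negatives as the threshold rises.

def confusion_counts(scored_items, threshold):
    """scored_items: list of (score, is_actually_grooming) pairs."""
    false_positives = sum(1 for s, label in scored_items if s >= threshold and not label)
    false_negatives = sum(1 for s, label in scored_items if s < threshold and label)
    return false_positives, false_negatives

synthetic = [(0.95, True), (0.92, True), (0.85, True),
             (0.91, False), (0.80, False), (0.40, False)]

print(confusion_counts(synthetic, 0.90))  # (1, 1): one false positive, one missed case
print(confusion_counts(synthetic, 0.93))  # (0, 2): no false positives, two missed cases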
6.1.2. Economic impact
The assessment of the economic impact of the different options focuses on the impact on service providers and public authorities concerned by the measures.
The quantitative assessment is included in section 6.2. For a detailed assessment of the economic impact of establishing the Centre see annex 10.
6.1.2.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
Compared to the baseline scenario, the practical measures to enhance the voluntary detection, removal and reporting of CSAM would to some extent improve the quality of procedures and the cooperation between the private and public sector. In particular, the training of EU practitioners and the sharing of guidelines and best practices should have a positive impact and generate efficiency savings both for providers and for public authorities.
The practical measures to enhance actions on prevention and assistance to victims, including establishing an EU Centre as a hub without legal personality, would generate limited costs to the EU budget. They would have the potential to limit expenses on the side of the Member States, which could make use of existing research and expertise. The Centre’s activities in the area of prevention could lead to a reduction in relevant offences, while its victim support role could contribute to the recovery of victims, reducing the long-term impact of these crimes on victims and society. In all areas, the Centre’s work could reduce duplication of efforts. However, this positive impact would be limited and would depend on the willingness of actors to cooperate.
The practical measures addressed to authorities to improve cooperation with service providers (training, standardised forms, online portal) would generate some moderate costs for them, but also improve the quality of reports and should therefore lead to a net reduction of costs for both service providers and public authorities. Likewise, the set-up of a feedback mechanism and communication channel would cause some moderate integration and maintenance costs but the benefits of such mechanism are expected to outweigh the expenses.
The practical measures addressed to service providers (streamlining of policies) would similarly generate moderate costs for them, in particular if changes to procedures have to be implemented, but public authorities would have a clear point of entry, reducing transaction costs, and would not have to adapt to a variety of individual service providers' policies, leading to cost reductions for public authorities. The Application Programming Interfaces (APIs) that public authorities could make available to allow service providers to remotely check hashed images and videos from their service against databases of hashes would generate moderate integration and maintenance costs for relevant public entities. However, as mentioned above, using common APIs would reduce transaction costs and overall costs in the long-run.
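Purely by way of illustration, the sketch below (Python) shows how a service provider might call such an API to check a batch of hash values remotely. The endpoint, authentication scheme, field names and response format are invented for this sketch; no particular API is specified by the measures described here.

# Hypothetical client for a hash-checking API made available by a public
# authority. URL, token, field names and response format are invented.

import json
import urllib.request

def check_hashes(hashes, api_url, api_token):
    """POST a batch of hash values and return the authority's match results."""
    payload = json.dumps({"hashes": hashes}).encode("utf-8")
    request = urllib.request.Request(
        api_url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Example call (placeholder endpoint and token):
# results = check_hashes(["3a7bd3e2..."], "https://example.invalid/hash-check", "TOKEN")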
Supporting measures, technology and expertise sharing across platforms could limit potential economic burdens on relevant online service providers. Similar to service providers, the public sector would also benefit from interoperable tools and increased cooperation. There would also be a positive economic impact on expenses related to victim support.
6.1.2.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal
The economic impacts of this option are the same as in option A, plus those of clarifying the legal basis for the voluntary detection of CSA by relevant online service providers, a reporting and removal obligation, and the cost of establishing and maintaining an EU Centre.
Reporting obligations under this option could lead to:
·additional costs to law enforcement authorities, to adequately respond to the likely increase in reports from service providers. Furthermore, if law enforcement receives more reports where action is required due to more extensive and reliable datasets provided by the Centre, additional costs could be expected concerning identification of victims and offenders, investigations, criminal proceedings and support to victims and their families;
·additional costs to service providers, e.g. in technological developments and/or acquisition and maintenance, infrastructure expenditure and expert staff recruitment and training, in particular with regard to SMEs.
For both the public and the private sector, administrative and compliance costs could arise from implementing new legislation. On the other hand, the economic impact of (voluntary) earlier detection of CSA would be expected to be significantly positive with regard to the quality of life of survivors, their productivity, and reduced costs of lifelong victim support. In addition, a positive effect on the Single Market could result from additional legal clarity and certainty, thus limiting compliance costs.
Establishing an EU Centre would incur significant cost to the EU budget. However, the Centre would also contribute to limiting expenses for other stakeholders, including public authorities and service providers, by streamlining activities in an economic manner. The Centre’s activities would support both law enforcement authorities and online service providers in the detection and reporting of CSA online, leading to greater efficiencies. It would facilitate compliance and reduce the costs of complaints and associated judicial proceedings by making available reliable information on content that is illegal in the EU. The Centre would also help streamline and facilitate hotlines’ efforts, including with regard to proactive searches. In addition, more extensive and reliable datasets of e.g. hashes would help law enforcement prioritise their actions, reducing the time spent filtering out non-actionable reports. The Centre’s activities in the area of prevention could lead to a reduction in relevant offences, while its victim support role could contribute to the recovery of victims, reducing the long-term impact of these crimes on victims and society. In all areas, the Centre’s work could reduce duplication of efforts. In the long run, the Centre’s activities would therefore lead to a decrease in the economic costs of CSA.
6.1.2.3. Option C: option B + mandatory detection of known CSAM
The impacts of this option are those outlined for option B plus those derived from the obligation to detect known material. For both the public and the private sector, administrative and compliance costs would arise from implementing new legislation.
For service providers, the introduction and maintenance of systems for the detection, where applicable, and the new or increased generation of reports would result in costs, also in relation to follow-up requests for further relevant data from public authorities, and for handling complaints and requests for review by affected users. However, they would benefit from the fact that this option would limit further fragmentation of the Internal Market with regard to administrative procedures and obligations required from hosting service providers. A number of service providers could build on systems they already have in place. In addition, the Centre would provide important support in making available technologies that can then be adapted to the needs of the providers. Technologies for the detection of known CSAM have been available free of charge for years and have proven their reliability.
SMEs offering hosting services are particularly vulnerable to exploitation through illegal activities, including CSA, not least since they tend to have limited capacity to deploy state-of-the-art technological solutions to detect CSAM or to employ specialised staff. Therefore, while they should not be exempted from any rules and obligations, it is of particular importance to ensure that measures are proportionate and do not place an undue burden on them. The free availability of reliable databases of known CSAM indicators as well as of detection tools (made available by the Centre) is important in this regard. Even though companies may have unequal resources to integrate technologies for the detection of CSAM into their products, this negative effect is outweighed by the fact that excluding them from this obligation would create a safe space for child sexual abuse and therefore defeat the purpose of the proposal. To further mitigate the economic impact on smaller companies, the verification of the illegality of the reported material could be left to the expertise of the EU Centre, in cooperation with the national authorities and the network of hotlines where needed and appropriate, which would inform the provider whether the material did in fact constitute CSAM. Therefore, these service providers would not be forced to invest in additional human resources for confirmation of suspected CSAM.
The expected increase in reports from service providers would result in significant additional costs to public authorities, in particular law enforcement and judicial authorities, arising from the corresponding increase in investigations and prosecutions. However, this financial impact is expected to be outweighed by the positive economic impact on victim support measures and survivor quality of life and productivity.
A positive effect on the Single Market could result from additional legal clarity and certainty, thus limiting compliance costs. Furthermore, both the public and the private sector would benefit from a common framework creating more legal certainty and mutual trust between the public and the private sector.
6.1.2.4. Option D: option C + mandatory detection of new CSAM
The impacts of this option are those outlined for option C plus those derived from the obligation to also detect new material. For both the public and the private sector, administrative and compliance costs would arise from implementing new legislation. However, all of the legislative options could reduce the fragmentation of the Internal Market and reduce compliance costs on the long term.
The expansion to new material could further increase the workload of law enforcement, compared to the previous option. While the overall number of new materials detected is expected to be lower than that of known CSAM, it will likely still be significant, considering that the cases require urgent and detailed attention, given the greater likelihood of ongoing abuse and the need for victim identification. Therefore, this increase in the workload will be accompanied by additional costs to respond to reports, costs related to starting investigations as well as the criminal justice process.
As in option C, service providers could encounter additional costs related to the integration and maintenance of detection technology and follow-up requests from public authorities, among others. Expanding the safety policy to new CSAM might require service providers to invest in adapting the available technologies to their individual products and possibly in recruiting trained staff to verify new material before reporting it. This could affect smaller providers in particular. To mitigate this effect, technologies would be made available free of charge. In addition, in the case of SMEs the human review and verification would be left to the expertise of the EU Centre which, in cooperation with national authorities and the network of hotlines where needed and appropriate, would inform the provider whether the material constituted CSAM.
6.1.2.5. Option E: option D + mandatory detection of grooming
The impacts of this option are those outlined for option D plus those derived from the obligation to also detect grooming.
Expanding the obligation to detection of grooming would require relevant service providers to invest in integrating additional tools to detect this type of abuse. These costs could be mitigated by making available technologies free of charge via the EU Centre, limiting service providers’ expenses to the integration of such tools into their services, and by relying on the EU Centre for the confirmation of cases identified as suspected grooming. By contrast, staffing costs for the Centre would increase as such cases require immediate reaction in order to ensure the protection of victims. Where the relevant service providers choose to rely on the Centre for verification before taking action, swift turnaround would have to be ensured in order to inform the provider about the need to intervene in an interaction and to protect a child.
Law enforcement would incur higher costs related to processing reports, compared to option D. The number of additional reports is expected to be lower compared to known CSAM, but as for new CSAM, swift action is required to protect the victim. The same considerations on administrative costs for the implementation of legislation as set out above apply. The positive economic impact when it comes to victim support and quality of life would increase, as the number of children that do not fall victim to hands-on child sexual abuse because of the timely detection of grooming would increase. This could potentially reduce the impact on victim support systems, compared to the previous options, as well as having a decisive impact on the quality of life and future productivity of the children.
Stakeholders’ views on economic impacts
Service providers and business associations expressed in the open public consultation and the inception impact assessment their concerns regarding the economic impact for SMEs of possible legal obligations and that a ‘one-size-fits-all’ solution should be avoided. They also pointed out that the costs of deploying and maintaining technical solutions should not be underestimated.
Hotlines and public authorities indicated in the open public consultation and in the targeted consultations that increased reporting could result in increased costs for investigating, prosecuting, and managing offenders, and in assistance and support to victims.
6.1.3. Fundamental rights impact
According to Article 52(1) of the Charter of Fundamental Rights, any limitation on the exercise of the rights and freedoms recognised by the Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.
The objective pursued by the envisaged proposal, i.e. preventing and combating CSA, which is a particularly serious crime, constitutes an objective of general interest within the meaning of Article 52(1) of the Charter. In addition, the proposal seeks to protect the rights of others, namely of children. It concerns in particular their fundamental rights to human dignity and to the integrity of the person, the prohibition of inhuman or degrading treatment, as well as the rights of the child. It takes into account the fact that in all actions relating to children, whether taken by public authorities or private institutions, the child's best interests must be a primary consideration. Furthermore, the types of CSA at issue here – notably, the exchange of photos or videos depicting the abuse – can also affect the children’s rights to respect for private and family life and to protection of personal data. In connection to combating criminal offences against minors the European Court of Justice has noted that at least some of the fundamental rights mentioned can give rise to positive obligations of the relevant public authorities, requiring them to adopt legal measures to protect the rights in question.
At the same time, the envisaged measures affect, in the first place, the exercise of the fundamental rights of the users of the services at issue. Those rights include, in particular, the fundamental rights to respect for privacy (including confidentiality of communications, as part of the broader right to respect for private and family life), to protection of personal data and to freedom of expression and information. Whilst of great importance, none of these rights is absolute and they must be considered in relation to their function in society. As indicated above Article 52(1) of the Charter allows limitations to be placed on the exercise of those rights, subject to the conditions set out in that provision.
More specifically, the measures aim to achieve the aforementioned objective by regulating both ‘public-facing’ and ‘private’ services, including interpersonal communication services, which results in varying levels of intrusiveness regarding the fundamental rights of users. In the case of content that is accessible to the public, whilst there is an intrusion, the impact especially on the right to privacy is generally smaller given the role of these services as ‘virtual public spaces’ for expression and economic transactions. The impact on the right to privacy in relation to private communications will generally be greater. Such impact, where necessary to achieve the aforementioned objective, must be necessary and proportionate and be moderated by appropriate safeguards. The safeguards have to be differentiated and balanced in order to adapt inter alia to the varying level of intrusiveness depending on the nature of the communications services at issue.
Furthermore, the potential or actual removal of users’ content, in particular erroneous removal (on the mistaken assumption that it concerns CSAM), can potentially have a significant impact on users’ fundamental rights, especially to freedom of expression and information where content is removed erroneously. Such impact can depend inter alia on the service provider’s position in the Internet ‘stack’. Services lower in the Internet stack include those providing cloud infrastructure, web hosting, or content distribution network services. At the same time, content involving CSA that is left unremoved can have a significant negative impact on the aforementioned fundamental rights of the children, perpetuating harm for children and for society at large. Other factors to be taken into account in this regard include the nature of the user content in question (text, photos, videos), the accuracy of the technology concerned, as well as the ‘absolute’ nature of the prohibition to exchange CSAM (which is in principle not subject to any exceptions and is not context-sensitive).
In addition, the freedom to conduct a business of the providers covered by the proposal comes into play as well. Broadly speaking, this fundamental right precludes economic operators from being made subject to excessive burdens. It includes the freedom to choose with whom to do business and the freedom of contract. However, this right is not absolute either; it allows for a broad range of interventions that may limit the exercise of economic activities in the public interest.
The need to strike a fair balance between all of the fundamental rights at issue played an important role in the consideration of the various options. The initiative may not affect the essence of, or affect in an unjustified and disproportionate manner, the abovementioned fundamental rights. The options were pre-selected accordingly, and the main differences between the options relate to the extent of their effectiveness in safeguarding and balancing the various fundamental rights, considering their various degrees of interference, and the ability of the options to offer a more adequate response in light of both the current and the evolving risks emerging in a highly dynamic digital environment.
6.1.3.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
Compared to the baseline scenario, a limited positive impact on fundamental rights may be expected with respect to better coordination of efforts on prevention and assistance to victims of child sexual abuse with the support and facilitation of a newly established EU Centre, and on enhancing the voluntary detection, removal and reporting of child sexual abuse online.
A very limited impact on fundamental rights may be expected with respect to the cooperation between private and public authorities. Practical measures would ensure confidentiality of data sets, which may have a positive effect on the protection of privacy and personal data compared to the baseline scenario.
This option would furthermore increase transparency and accountability and would contribute to ensuring sound administration. There would be no change with regard to legal clarity and only a moderate impact on individuals' fundamental rights. This option would maintain the current framework of voluntary measures to address CSA and of cooperation with service providers. The rights and obligations of service providers would not be substantially affected.
6.1.3.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal.
Measures need to be effective, necessary and proportionate to tackle the crimes at issue and to protect the fundamental rights of children, including to give effect to the State’s obligation to provide for the protection of children’s rights and well-being, as a vulnerable group requiring particular care, and the effective application of its laws. In line with what was said above, these rights and interests need to be balanced against the following rights in particular:
Users’ rights: when data is processed for the purposes of detection, this affects users’ rights to freedom of expression and information, to the protection of personal data, and, where applicable depending on the type of service, to the confidentiality of their communications. While the rights to freedom of expression and information do not extend to protecting illegal activities aimed at the destruction of any of the basic fundamental rights and freedoms, the detection would also need to check legal materials and exchanges for the presence of CSAM. As a result, a strong justification and strong safeguards would be needed to ensure an appropriate balance of the different fundamental rights. The justification consists essentially in the particularly serious crimes that the envisaged measures aim to prevent and combat and the protection of children that it aims to ensure. As described in section 5.2.3., the safeguards could include requiring service providers to use technologies and procedures that ensure accuracy, transparency and accountability, including supervision by designated authorities. In addition, the database of child sexual abuse indicators provided by the EU Centre would ensure a reliable basis for determining which content is illegal. The transparency and accountability that the Centre helps ensure could also help ensure that there are no erroneous takedowns or abuse of the search tools to detect legitimate content (including misuse of the tools for purposes other than the fight against child sexual abuse).
For interpersonal communications services, the users’ fundamental right to privacy of communications is also concerned in particular. Therefore, supplementary safeguards would be required, including targeting the voluntary detection of new material and grooming to services where children may be at high risk, and providing clear information to users, as well as possible information once suspected abuse has been detected, including possibilities for redress. An additional safeguard lies in the anonymised processing by technologies, which ensures that the impact on the fundamental rights of users whose communications are processed would remain within reasonable limits and would not go beyond what is necessary, since no personal data deriving from their communications would be reviewed unless there is a justified suspicion of child sexual abuse (these technologies simply detect content like a virus scanner or spam filter, taking no records and not ‘understanding’ the substance of the communication, e.g. they answer the question ‘does this image contain CSA patterns?’ rather than ‘what is this image about?’).
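By way of illustration only, the following minimal sketch (hypothetical, not any provider’s actual implementation) shows the design constraint described above: the automated scan exposes nothing more than a yes/no flag, and content is only passed on for human verification when that flag is raised.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class ScanResult:
    flagged: bool  # the only information the automated step produces

def scan(content: bytes, classifier: Callable[[bytes], bool]) -> ScanResult:
    """Run an automated detector over a message, virus-scanner style.

    The classifier is assumed to answer a single closed question
    ('does this contain known CSA patterns?'); when the answer is
    negative, no record of the content is kept.
    """
    return ScanResult(flagged=bool(classifier(content)))

def process(content: bytes,
            classifier: Callable[[bytes], bool],
            forward_for_review: Callable[[bytes], None]) -> None:
    if scan(content, classifier).flagged:
        # Only flagged items are ever exposed for human verification.
        forward_for_review(content)
    # Unflagged content is simply delivered; nothing is stored or logged.
```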
Service providers’ rights: This option would have no impact on the rights of service providers who choose to take no action to proactively detect child sexual abuse involving their services. On the other hand, service providers who choose to do so would be subject to new requirements that have not applied previously, in addition to those arising from the DSA proposal, such as requirements on the reliability and accuracy of technologies and on reporting and removal. Such requirements however are important safeguards for the fundamental rights of users.
Regardless of whether service providers decide to take voluntary action to detect CSA, they would be subject to reporting and removal obligations in case they become aware of the existence of CSA online in their services. These obligations impact service providers’ rights but are necessary to safeguard the fundamental rights of victims.
As an additional important safeguard, the EU Centre would help improve transparency and accountability. The obligation to report would ensure that all instances of reported child sexual abuse online are independently verified, that action is taken to identify and rescue children, and that offenders are investigated. In addition, its existence would facilitate reporting to a Centre in the EU, thus limiting international transfers of personal data of EU citizens. By facilitating Member States’ action on prevention and supporting victims in removing CSAM, the Centre would have a significant positive impact on the fundamental rights of victims and children who may become victims. The Centre itself would also be subject to safeguards as described in section 5.2.3. to ensure that it carries out its responsibilities fully and in a transparent way.
On the whole, provided appropriate limits and safeguards are ensured, this option would thus fairly balance the various rights at stake.
6.1.3.3. Option C: option B + mandatory detection of known CSAM
The rights to be balanced are the same as in the previous option; the difference lies in the greater impact on rights resulting from a) the mandatory nature of the detection of known CSAM and b) its application potentially regardless of the technology used in the online exchanges.
This option, because of the expanded and more effective action against CSAM, would have a significantly positive impact on the fundamental rights of victims whose images are circulating on the Internet, in particular on their right to respect for private life and on their rights as children.
At the same time, the mandatory nature of the detection has a notable impact on providers’ freedom to conduct their business. This can only be justified in view of the fundamental importance of tackling the particularly serious crimes at issue and more effective protection of children. Especially in the context of interpersonal communications, providers are the only ones that have visibility on the abuse taking place. Given that up to 80% of investigations in some Member States are possible only because of reports from providers, such a measure is objectively necessary. In addition, providers would have access to free and verified detection tools. The obligation to detect known CSAM would level the playing field and ensure the detection thereof where it is currently missing, with all the necessary safeguards. It would be targeted, risk-based, limited in time and would not impose an undue burden on providers.
In addition, users’ rights (in particular freedom of expression, privacy and data protection) are concerned to a greater extent than under the previous option. The availability of reliable and verified tools could ensure that the impact on their rights does not go beyond what is strictly necessary, by limiting the interference and reducing the risk of false positives and the possibility of misuse. In particular, there would be no human interaction with interpersonal communications of users beyond the communications that have been automatically identified as containing CSAM.
On the whole, provided appropriate limits and safeguards are ensured, this option would thus fairly balance the various rights at stake.
Box 19: risk of misuse of tools to detect CSA online for other purposes
There is a risk that the technologies intended to detect CSA online are repurposed and misused for other purposes. This risk is common across technologies and across technical fields, including other technologies used in online services (e.g. the GPS or the camera of a mobile phone, which could be misused for surveillance). In fact, the underlying technologies behind the most common tools to detect CSA online are in themselves applications of technologies that were not originally developed for the exclusive purpose of detecting CSA online. For example, hashing is an application of digital fingerprinting, which was already being used to detect malware when tools like PhotoDNA were first developed. Likewise, AI, the underlying technology to detect new CSAM and grooming, was not originally developed to detect CSA online. The possibility of repurposing a technology (and therefore the risk of misuse) exists from the moment the technology is first developed. In the case of the tools to detect CSA online, these have existed for over a decade (e.g. PhotoDNA) and there is so far no evidence of that risk having materialised; the tools have been made available under a licensing agreement limiting their use to the detection of child sexual abuse content, which appears to have been respected. The legislation would include safeguards on purpose limitation, on the way the tools are deployed, and on oversight by competent authorities and the EU Centre to keep the risk of misuse to the absolute minimum.
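For illustration only, the sketch below shows the basic hash-matching pattern referred to in the box: a file is fingerprinted and the fingerprint is compared against a list of verified indicators. A standard cryptographic hash is used purely for simplicity; production tools such as PhotoDNA rely on perceptual (robust) hashing, whose internals are not reproduced here, and the indicator set is a hypothetical placeholder.

```python
import hashlib

# Hypothetical placeholder for a database of hashes of verified illegal images,
# in practice supplied and maintained by a trusted body such as the EU Centre.
KNOWN_INDICATORS: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a file fingerprint.

    SHA-256 only matches bit-identical files; real detection tools use
    perceptual hashes that tolerate resizing, re-compression and similar
    minor alterations of known material.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    # The comparison yields only a yes/no answer; it reveals nothing
    # about content whose hash is not on the indicator list.
    return fingerprint(image_bytes) in KNOWN_INDICATORS
```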
6.1.3.4. Option D: option C + mandatory detection of new CSAM
The rights to be balanced are the same as in the previous option; the difference lies in the greater impact on rights resulting from the mandatory detection of new CSAM.
This option would have a higher impact on providers’ freedom to conduct a business and entail greater interference with users’ rights to privacy, personal data protection and freedom of expression. However, there is a corresponding increase in the types of CSA that are tackled and, thus, in the contribution to the objective of combating the particularly serious crimes at issue and protecting children. Moreover, stricter safeguards, remedies and transparency and accountability measures would be provided for to safeguard users’ rights.
Given the similar nature of the materials to be detected and the reliance on verified indicators to be provided by the EU Centre, the detection of new material would in principle have a comparable level of intrusiveness to the detection of known CSAM. However, given that the accuracy levels of current tools, while still well above 90%, are lower than for the detection of known CSAM, human confirmation is essential. This would add to the service providers’ burdens and increase intrusiveness, but is deemed necessary to avoid errors and the negative consequences that such errors might have, including for users’ rights. The need to rely on human confirmation could decrease as the technology develops, partly as a consequence of the obligations to detect new CSAM in this option. In addition, strict requirements and safeguards would apply, including on the reliability of indicators and independent supervision, and reliable detection tools would be made available free of charge.
Similarly to Option C, the identification of the specific providers in scope would be done through detection orders issued by Member States’ national authorities. This ensures a case-by-case, risk-based and time-limited approach, thus contributing to the proportionality of the approach. For the detection of new CSAM a specific, higher threshold would apply (as compared to detection orders for known CSAM), i.e. only services at a high and objective risk of being misused for the exchange and dissemination of new CSAM would be subject to a detection obligation.
Given that most previously undetected CSAM is recent material, this option would have a positive impact on victims of ongoing abuse and would significantly enhance the possibility of safeguarding victims from additional abuse. In addition, the early detection and confirmation of new CSAM and the swift addition thereof to the database of known CSAM can help limit the spreading of CSAM across service providers.
Overall, the measures in this option would therefore fairly balance the affected fundamental rights while having a significantly greater positive effect on the rights of victims.
6.1.3.5. Option E: option D + mandatory detection of grooming
The impacts of this option are the same as in Option D, with the important difference of the additional impact caused by requiring service providers to also detect grooming. The introduction of this obligation would have a higher impact on fundamental rights, which would be balanced by stricter personal data protection and privacy safeguards while providing redress, accountability and transparency.
Detecting grooming would have a positive impact on the fundamental rights of potential victims by contributing to the prevention of abuse. At the same time, the detection process would be the most intrusive one for users (compared to the detection of known and new CSAM), since it would involve searching text, including in interpersonal communications, as the most important vector for grooming. On the one hand, such searches have to be considered as necessary to combat grooming, since the service provider is the only entity able to detect it. Automatic detection tools have acquired a high degree of accuracy, and indicators are becoming more reliable with time as the algorithms learn, following human review. On the other hand, the detection of patterns in text-based communications may be more intrusive on users’ rights than the analysis of an image or a video to detect CSAM, given the difference in the types of communications at issue and the mandatory human review of the online exchanges flagged as possible grooming by the tool.
This obligation would be restricted to certain specific service providers (identified, on a case-by-case basis, through the detection orders of Member States’ national authorities) which are at high risk of being misused for grooming, thereby limiting the fundamental rights impact to the users of those services and the providers concerned. This approach would contribute to ensuring the required level of proportionality.
In this option, detection obligations would apply to the three main types of CSA online (known CSAM, new CSAM and grooming). Compared to voluntary detection, which leaves to private parties the decision of whether to detect, under this option the legislator is the one taking the decision on whether to detect all three types, given the particularly serious objective of public interest at stake, setting out the conditions and safeguards under which that detection should take place.
Overall, provided appropriate limits and safeguards are ensured, the measures in this option would therefore fairly balance the affected fundamental rights while having a significantly greater positive effect on the rights of victims.
6.1.4. UN SDGs impact
6.1.4.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
Enhancing voluntary detection, removal and reporting of online CSA and the creation of the EU Centre on prevention and assistance would to some extent contribute to relevant SDGs. Notably, limiting the likelihood of girls and children in general falling victim to CSA would positively impact SDG 5.2 (eliminate all forms of violence against women and girls, as a majority of CSA victims are girls) and SDG 16.2 (end abuse, exploitation, trafficking and all forms of violence against children). This option would also help to minimise the short and long-term negative health consequences of CSA and support mental health for victims and offenders or people who fear that they might offend (SDG 3: health and well-being), and address SDG 4 (education), e.g. through prevention campaigns to raise awareness of CSA online risks. This option would also affect, to a lesser extent, SDG 1 on poverty (e.g. by supporting research on the long-term economic effects of CSA).
However, the overall impact of this option would be limited, as the actions would remain fragmented, and the overall reduction of the circulating CSAM would be limited.
6.1.4.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal
This option would clarify the legal basis for service providers’ voluntary detection of CSA online, which, along with the expansion of the EU Centre to a broader facilitator role covering also detection, reporting and removal of CSA online, would contribute to a reduction of the prevalence of CSA and consequently a reduction of victimisation of girls (SDG 5.2), and the sexual exploitation of children in general (SDG 16.2).
This option would also address to some extent SDG 3 on health and well-being, and SDG 4 on education, similarly to option A. It would also contribute to SDG 9 (industry, innovation and infrastructure), supporting service providers’ efforts to develop technology to fight CSA online.
6.1.4.3. Option C: option B + mandatory detection of known CSAM
This option would have a positive impact on the same SDGs as option B, but a stronger one. The obligation to detect is expected to significantly reduce the amount of CSAM available online, which would lead to a more positive impact on all SDGs described in option B, in particular SDG 5.2 and SDG 16.2.
6.1.4.4. Option D: option C + mandatory detection of new CSAM
The impacts of this option would be the same as option C, plus those of establishing a legal obligation for mandatory detection of new CSAM. The obligation to detect new CSAM would further reduce the amount of CSAM available, positively impacting all SDGs described in option B.
6.1.4.5. Option E: option D + mandatory detection of grooming
The impacts of this option would be the same as option D, plus those of establishing a legal obligation for mandatory detection of grooming. The obligation to detect grooming, with its positive effects on preventing imminent crimes (and stopping ongoing ones), could lower the prevalence of CSA, positively impacting all SDGs described in option B.
6.2. Quantitative assessment
The quantification of the costs and benefits of the policy measures/policy options is limited by the lack of data, in particular on the level of abuse on services which do not currently make significant numbers of reports, as it is unclear whether this indicates a lower level of abuse on those services, or less effective efforts to detect and report such abuse. This requires the use of a number of assumptions, described in detail along with the rest of the methodology used, in annex 4, sections 3-4. Given these limitations, the estimates in this section provide an idea of the order of magnitude of costs and benefits and therefore should not be taken as exact forecasts.
6.2.1. Costs
All the policy options under consideration would result in costs for public authorities, service providers, and the Centre. Each policy option includes measures relating to prevention, assistance to victims, and detection, reporting and removal of online child sexual abuse.
In the area of prevention, costs would be incurred by the Commission as a result of the practical measures in Option A, under which the Commission would have responsibility for managing the Centre as a knowledge hub without legal personality. Under all other options, costs related to prevention measures would be borne by the Centre itself.
Costs in the area of assistance to victims would similarly be borne by either the Commission or the Centre, depending on the option chosen. In addition, measures to improve prevention and assistance to victims would likely give rise to costs for Member States.
Measures relating to the detection, reporting and removal of online CSA would entail administrative costs for service providers and public authorities under all options. These relate to the expense for service providers to implement measures to detect, report and remove online CSA, whether on a voluntary or mandatory basis, as well as the cost to both service providers and public authorities of processing each report. Under Options B to E, the Centre would also incur costs relating to the handling of reports, as well as costs for the creation and maintenance of an EU database of indicators of online child sexual abuse.
The cost model built to estimate the above costs first determined the composition of an average report today, based on the total amount of known and new CSAM files and grooming reports made in 2020. Then it estimated the cost of this average report, based on the estimated time that service providers and public authorities require for processing and following up on it (including investigations). It also estimated the number of reports in the coming years under the baseline scenario of voluntary detection, assuming that the number of reports would continue to grow in line with trends over recent years. It also assumed that the level of abuse detected and reported by Facebook, which is the top provider of reports to NCMEC, is indicative of the level of abuse that could potentially be detected and reported by other providers under mandatory detection. Finally, the model estimated the costs of each policy measure by estimating how the policy measure would change the composition of the average report and/or the number of reports compared to the baseline.
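A minimal sketch of that calculation logic is given below. All numeric inputs are illustrative assumptions, not the Annex 4 figures; only the report volumes for the baseline and option E are taken from Table 5.

```python
# Illustrative reconstruction of the cost model's logic:
# cost = number of reports x composition-weighted cost per average report.
# The report mix and all unit costs below are assumed placeholder values.

AVG_REPORT_MIX = {"known_csam": 0.80, "new_csam": 0.15, "grooming": 0.05}

HOURS_PER_REPORT = {  # assumed handling time per report component, per actor
    "provider":  {"known_csam": 0.2, "new_csam": 0.5, "grooming": 1.0},
    "authority": {"known_csam": 1.0, "new_csam": 2.0, "grooming": 3.0},
}
HOURLY_RATE_EUR = {"provider": 40.0, "authority": 45.0}  # assumed

def cost_per_report(actor: str, mix: dict) -> float:
    """Composition-weighted cost of handling one average report."""
    return sum(share * HOURS_PER_REPORT[actor][kind] * HOURLY_RATE_EUR[actor]
               for kind, share in mix.items())

def annual_cost(actor: str, reports: int, mix: dict) -> float:
    return reports * cost_per_report(actor, mix)

# A policy measure changes the report volume and/or the mix relative to the
# baseline; its continuous cost is the resulting delta (report volumes from Table 5).
baseline_reports, option_e_reports = 1_939_556, 8_812_811
extra_cost = (annual_cost("authority", option_e_reports, AVG_REPORT_MIX)
              - annual_cost("authority", baseline_reports, AVG_REPORT_MIX))
```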
The estimated costs of each measure and option are presented in table 3 and table 4, below.
Table 3: cost estimates for the retained policy measures (EUR millions)
Policy measure | One-off costs: public authorities | One-off costs: service providers | Continuous (annual) costs: public authorities | Continuous (annual) costs: service providers
1 | €0,4 | €0,2 | €3,5 | €2,8
2 | €0,0 | €0,0 | €10,3 | €0,0
3 | €5,0 | €0,0 | €25,7 | €0,0
4 | €0,0 | €137,7 | €11,1 | €6,9
5 | €0,0 | €20,4 | €3,3 | €1,7
6 | €0,0 | €352,2 | €503,6 | €459,4
7 | €0,0 | €604,4 | €250,1 | €520,5
8 | €0,0 | €618,0 | €28,2 | €471,9
Table 4: one-off and continuous costs estimates for the policy options (EUR millions)
Policy option | One-off costs: public authorities | One-off costs: service providers | Continuous (annual) costs: public authorities | Continuous (annual) costs: service providers
A | €0,4 | €0,2 | €13,9 | €2,8
B | €5,4 | €158,4 | €43,6 | €11,4
C | €5,4 | €466,9 | €547,3 | €470,9
D | €5,4 | €1.025,0 | €797,4 | €991,3
E | €5,4 | €1.595,3 | €825,6 | €1.463,3
6.2.2. Benefits
The main quantitative benefits derive from savings as a result of reduction of CSA associated costs, i.e. savings relating to offenders (e.g. criminal proceedings), savings relating to victims (e.g. short and long-term assistance), and savings relating to society at large (e.g. productivity losses).
To estimate the benefits the first step is therefore to determine the total CSA costs in the EU. As indicated in section 5.1 on the baseline, the estimated annual costs of CSA in the EU are EUR 13.8 billion.
Box 20: estimation of annual costs of CSA in the EU
No published studies are known to have estimated the total costs of CSA in the EU or in a Member State.
Letourneau et al. estimated the total annual costs of CSA in the US, adjusted to the reference year 2015, in a paper that appeared in 2018 in the peer-reviewed journal Child Abuse & Neglect. The paper estimated total costs including health care costs, productivity losses, child welfare costs, violence/crime costs, and special education costs, based on secondary data drawn from papers published in peer-reviewed journals. The paper indicates that its estimate of annual losses of USD 11 billion is a conservative minimum, since it could not include the economic impact of nonfatal CSA on male victims due to lack of data, and it relied on cases reported to child protection agencies, whereas it is widely recognised that a substantial proportion of CSA cases never comes to the attention of child protection agencies.
For comparison, the other known study on CSA costs in the US (not peer-reviewed) estimated the annual costs at USD 23 billion. The only other known peer-reviewed paper (in addition to Letourneau et al.’s) on CSA costs estimated the annual costs in Canada at approximately CAD 3.70 billion, for a population less than 10% that of the EU.
Although Letourneau et al.’s paper concerns the US, studies on the economic cost of violence against children (including child sexual abuse) suggest that costs are comparable among high-income countries. Therefore, the conservative estimates provided in the above-mentioned paper are assumed to be applicable in the EU context, when adjusted to take account of the larger population of the EU in 2021 compared to that of the US, the inflation rate 2015-2021 and the USD-EUR exchange rate in April 2021, resulting in a total of EUR 13.8 billion of annual CSA costs in the EU.
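The adjustment described above amounts to scaling the US estimate by the EU/US population ratio, 2015-2021 inflation and the April 2021 USD-EUR exchange rate. The sketch below reproduces that arithmetic with rounded, publicly available figures which are assumptions for illustration, not necessarily the exact inputs used in Annex 4.

```python
# Scaling Letourneau et al.'s conservative US estimate to the EU context.
# Parameter values are rounded, illustrative assumptions.
US_ANNUAL_COST_USD_BN_2015 = 11.0   # conservative US estimate for 2015
EU_POPULATION_M_2021 = 447.0        # assumed EU-27 population, 2021 (millions)
US_POPULATION_M_2015 = 321.0        # assumed US population, 2015 (millions)
INFLATION_2015_2021 = 1.10          # assumed cumulative price increase
USD_TO_EUR_APRIL_2021 = 0.83        # assumed exchange rate

eu_annual_cost_eur_bn = (US_ANNUAL_COST_USD_BN_2015
                         * (EU_POPULATION_M_2021 / US_POPULATION_M_2015)
                         * INFLATION_2015_2021
                         * USD_TO_EUR_APRIL_2021)
print(round(eu_annual_cost_eur_bn, 1))  # ~14, i.e. the order of EUR 13.8 billion
```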
The quantitative benefits originate mainly from two sources:
·savings from CSA crimes prevented: these result not only from the options that explicitly cover prevention but also from those that cause an increase in the number of reports (e.g. those imposing detection and reporting obligations on service providers). The increase in reports is likely to lead to an increase in victims rescued from ongoing and/or imminent abuse as well as to an increase in arrests, which in turn could lead to prevention of future crimes by those offenders. It could also lead to an increase in removal of CSAM, with the positive effects on prevention that it entails (see box 1). In addition, the prosecuted offenders would have (improved) access to prevention programmes during and after criminal proceedings (including during and after prison), which could decrease reoffending. Moreover, the increase in reports could also have a deterrence effect, and thereby prevent additional offences;
·savings from better assistance of victims: these would result from a better mitigation of the negative effects of these crimes on victims, e.g. by facilitating Member States’ action in this area through the exchange of best practices and research, and supporting the takedown of images and videos (including at the victims’ request).
It is not possible to determine exactly what benefits each of these two sources, or each policy measure (such as the obligations on service providers or the Centre), would generate. Nor is it possible to forecast with certainty the exact benefits of each policy measure. For example, the reduction of CSA due to prevention would depend to a large extent on the investments and efforts of Member States and the EU, which the policy options considered in this initiative could only help facilitate.
In light of the qualitative considerations above, it would be safe to estimate that the quantitative benefits could be up to 50% of the annual costs of CSA in the EU (bearing in mind that the amount of EUR 13.8 billion was a conservative estimate).
The calculation of benefits for each of the options takes an even more conservative approach and assumes that the benefits would lie in the middle of that range, i.e. a maximum of 25% of the total annual costs. This calculation also assumes that there is a direct correlation between the factor that can be best quantified, the increase in reports, and the estimated savings. This is of course an approximation, as the savings could also derive from other components not linked to the increase in reporting, as explained above, but it facilitates the comparison of options. The model therefore assumed a cost decrease of 25% for option E (highest number of reports) and applied the same ratio of increase in reporting vs decrease in costs from option E to the other options.
Table 5: estimated benefits for the policy options (EUR million)
Policy option | Estimated number of reports | Estimated increase in reporting compared to the baseline | Estimated cost reduction | Benefits (EUR millions per year)
Baseline | 1.939.556 | - | - | -
A | 2.133.584 | 10% | 0,7% | €97,3
B | 2.385.726 | 23% | 1,6% | €223,8
C | 7.521.652 | 288% | 20,3% | €2.800,3
D | 8.691.029 | 348% | 24,6% | €3.386,9
E | 8.812.811 | 354% | 25,0% | €3.448,0
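The benefit figures in Table 5 can be reproduced (up to rounding) from the linear scaling described above: option E’s 354% increase in reports is mapped to a 25% reduction in total annual CSA costs, and the same ratio of reduction per unit of increase is applied to the other options. A minimal sketch of that calculation, using the Table 5 increases:

```python
# Linear mapping from the increase in reporting to the assumed reduction in
# annual CSA costs, as used for Table 5 (figures rounded).
TOTAL_ANNUAL_CSA_COST_EUR_M = 13_800          # EUR 13.8 billion, in millions
MAX_REDUCTION = 0.25                          # assumed reduction for option E
INCREASE_VS_BASELINE = {"A": 0.10, "B": 0.23, "C": 2.88, "D": 3.48, "E": 3.54}

ratio = MAX_REDUCTION / INCREASE_VS_BASELINE["E"]   # reduction per unit of increase

for option, increase in INCREASE_VS_BASELINE.items():
    reduction = ratio * increase
    benefit_eur_m = reduction * TOTAL_ANNUAL_CSA_COST_EUR_M
    print(option, f"{reduction:.1%}", round(benefit_eur_m))
# e.g. option C: ~20.3% reduction and roughly EUR 2 800 million per year (cf. Table 5)
```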
See annex 4, sections 3 and 4 for further details on the model, the assumptions and the calculations.
7.How do the options compare?
7.1. Qualitative comparison
7.1.1. Criteria for the comparison
The following criteria are used in assessing how the five options would potentially perform, compared to the baseline:
·Effectiveness in achieving the specific objectives.
·Efficiency, i.e. cost-benefit assessment of each policy option in achieving the specific objectives.
·Coherence with all relevant policy instruments in the fight against CSA:
a.Legislation:
I.horizontal instruments (GDPR, ePrivacy Directive and its proposed revision, e-Commerce Directive and the proposed Digital Services Act, Victims’ Rights Directive);
II.sector-specific legislation (CSA Directive, Interim Regulation, Europol Regulation and its proposed revision);
b.Coordination: EU level cooperation in investigations, prevention and assistance to victims, as well as multi-stakeholder cooperation at EU and global level;
c.Funding.
·Proportionality, i.e. whether the options go beyond what is a necessary intervention at EU level in achieving the objectives.
7.1.2. Summary of the comparison
Table 6 below summarises the qualitative scores for each main assessment criterion and each option. The options are compared below through listing positive (+), negative (-) and ‘no-change’ (~) impacts compared to the baseline (> indicates higher costs compared to the baseline).
The detailed comparative assessment of all options can be found in annex 4, section 2:
Table 6: summary of the comparison of policy options
Option | Effectiveness | Efficiency: costs | Efficiency: benefits | Coherence: legislation | Coherence: coordination | Coherence: funding | Proportionality
Baseline | ~ | ~ | ~ | ~ | ~ | ~ | ~
Option A | + | > | + | + | + | + | +
Option B | ++ | >> | ++ | + | ++ | + | +
Option C | +++ | >>> | +++ | + | +++ | + | +
Option D | ++++ | >>>> | ++++ | + | +++ | + | +
Option E | +++++ | >>>>> | +++++ | + | +++ | + | +
7.1.3. Effectiveness
The scores on effectiveness indicate the extent to which the impacts screened in section 6 contribute to the achievement of the specific objectives.
1. Ensure the effective detection, removal and reporting of online child sexual abuse where they are currently missing
While options A and B could improve detection, removal and reporting of online child sexual abuse, their effectiveness is significantly limited by their reliance on voluntary action by providers when it comes to detection, which has proven to be insufficient. Under option A, as under the baseline, many of these activities would be prohibited following the expiry of the Interim Regulation.
Options C to E are the only options which would ensure the effective detection and reporting of online CSA. In particular, Option E would have the highest effectiveness as it would ensure that all relevant online service providers detect known and new CSAM, and grooming.
Whereas option C imposes obligations to detect only known CSAM, options D and E impose additional, cumulative obligations to detect new CSAM and grooming respectively. As described in Section 6.1.1, the detection of new CSAM and grooming, by its nature, provides greater added value in terms of the ability to identify and rescue children from ongoing or imminent abuse. As such, the effectiveness under options D and E is higher than under option C. The obligations to detect and report known and new CSAM and grooming are a significant step forward. Reliable tools for the detection of CSA online are already freely available and in use by a number of service providers. Extending their deployment to all relevant online services could greatly contribute to virtually eliminating the dissemination of known CSAM on such services and significantly reducing the dissemination of new CSAM and the instances of grooming. The Centre would facilitate the detection, reporting and removal process, including by making available technology and possibly contributing to its development through its technical expertise.
2. Improve legal certainty, transparency and accountability and ensure protection of fundamental rights
Option A, which consists of non-legislative measures, offers the least improvement in terms of legal certainty, protection of fundamental rights, transparency and accountability. Any such improvements under Option A would be largely limited to legal advice and jurisprudence and the establishment of best practices to be adhered to on a voluntary basis.
Options B to E could all offer significant improvements in these areas. Under each of these options, the conditions for voluntary detection would be clarified and mandatory measures to detect, report and remove CSA online would be established, ensuring improved legal certainty for all stakeholders. In addition, each of these options would establish robust safeguards and accountability mechanisms to ensure strong protection of fundamental rights. These would notably include the designation of competent national authorities to assess the measures implemented by relevant online service providers, impose detection and removal orders, and impose sanctions on providers that do not meet their obligations. These options would also establish transparency obligations for both service providers and the authorities designated to receive reports from and supervise providers, as well as redress mechanisms for users, among other safeguards.
Neither the baseline scenario nor option A would address the current challenges, and the impact on children’s fundamental rights would likely worsen with time.
Option B would increase legal certainty for detecting CSA voluntarily and would also create an obligation to report CSA online once a provider becomes aware of it, and to remove CSAM once confirmed to be illegal. In addition, the activities of the EU Centre would have a significant positive impact on the fundamental rights of victims and children who may become victims. The necessary safeguards would also be provided in order to balance the interference with the rights of the users and providers. However, the detection of CSA would remain voluntary, which would not ensure consistent protection for children who are or might become victims, while there would still be an impact on the privacy and data protection rights of all users. In sum, this option would retain a certain negative impact on fundamental rights, particularly those of children.
Options C to E would render the detection of CSA mandatory and, especially since the systems used for detection can affect relevant fundamental rights, would include comprehensive safeguards. Furthermore, appropriate checks and balances are also to be set up, notably through sanctioning mechanisms, reporting and transparency requirements, and supervision by the competent national authorities, supported where relevant in the technical aspects by the EU Centre to prevent and counter child sexual abuse. These options would have overall small positive (Option C) and significant positive (Options D and E) impacts on fundamental rights, particularly those of children.
The fundamental rights most clearly touched upon by the intervention are the following:
·Rights to human dignity and integrity of the person, prohibition of inhuman and degrading treatment and rights of the child (Articles 1, 3, 4 and 24 of the Charter).
All five options would have a positive impact in protecting the safety and rights of children. Consistent with the analysis in section 6.1.3 the positive impact is strengthened with each subsequent option. Given the seriousness of the crimes at stake and of the impact on children, being vulnerable persons entitled to protection by the public authorities, the objective pursued by the envisaged measures is capable of justifying a significant interference with the fundamental rights of other parties involved (service providers, users), provided that the interference respects the essence of those rights and remains limited to what is necessary.
·Rights to respect for private and family life, protection of personal data, and freedom of expression and information (Articles 7, 8 and 11 of the Charter).
Each of the options would have an impact on privacy and the protection of personal data, with regard to both the users of relevant online services and victims or potential victims of child sexual abuse. All options take into account the need to balance these impacts by including strong safeguards for voluntary/mandatory detection, reporting and removal of online CSA.
Evidently, the obligations imposed by Options C, D and E would have the greatest impact on users’ rights overall, especially the rights to privacy and to personal data protection, due to the data to be processed in the detection and the progressively increasing need for human review with each option. Furthermore, errors in the detection process could have additional negative consequences for users’ rights, such as erroneous decisions to remove users’ content or limit access, which would impact their freedom of expression and information. At the same time, the scope for erroneous decisions is likely to be limited, especially when adequate safeguards are provided for, bearing in mind the ‘absolute’ (non-context-specific) nature of the prohibition of distributing CSAM. That holds in particular in respect of Option C and (to a somewhat lesser extent) Option D, considering the accuracy of the technologies which would need to be used.
On the other hand, the progressively increasing detection and number of reports of online child sexual abuse expected under each option would result in corresponding improvements to the rights of victims (and potential victims) to privacy and personal data. In particular, options C, D and E would contribute significantly to safeguarding rights of victims, while robust safeguards would ensure proportionality and limit interference to what is strictly necessary.
·Freedom to conduct a business (Article 16 of the Charter).
Another important element of the overall balance that has to be struck is the balance between facilitating or mandating action against CSA online and the protection of providers’ freedom to conduct a business.
The options considered in the impact assessment take into account the need to ensure that any impact upon these rights and freedoms would be strictly limited to what is necessary and proportionate, whilst leaving the essence of the freedom to conduct a business unaffected. While Options A and B would not directly or significantly affect the freedom to conduct a business, Options C, D and E would entail an interference with this freedom, while however minimising negative effects on this right by ensuring a level playing field for all providers offering services in the Union, regardless of their size or location. The interference with this right will be further mitigated by the strong support offered by the Centre, the availability of the necessary technology at no or limited costs, as well as the benefits associated with operating under a clear and uniform legal framework.
3. Reduce the proliferation and effects of CSA through harmonisation of rules and increased coordination of efforts
The non-legislative measures of Option A are less effective than the rest of the options, which include the creation of the EU Centre to support prevention and assistance to victims, as well as detection, reporting and removal of CSA online. Practical measures can only lead to limited improvements, and cannot replace a Centre as the reference entity in the EU and a facilitator on all aspects of the fight against child sexual abuse.
7.1.4. Efficiency
Except for the baseline, all options would generate some additional administrative costs for public authorities as a result of the anticipated increase in reporting of CSA. Options C to E would lead to significant cost increases for public authorities due to the significant increase in the volume of reports of online CSA expected to arise from the obligations imposed on service providers under those options.
For service providers, all options will generate administrative and other costs, and may also result in savings when processes become more efficient. The extent of additional costs to service providers will, in part, depend upon the nature and size of their services, which is expected to affect both the volume of data to be processed for the purposes of detection and reporting, and the cost of integrating the relevant technologies.
Given the cumulative nature of the options, the costs also increase with each option, driven in particular by the increased detection obligations. These will entail a progressive increase in reports and therefore increased costs for both service providers and public authorities. On the other hand, these increased obligations would also lead to increased benefits derived from savings as a result of reduction of CSA associated costs, i.e. savings relating to offenders (e.g. criminal proceedings), savings relating to victims (e.g. short and long-term assistance), and savings relating to society at large (e.g. productivity losses).
7.1.5. Coherence
a) Legislation
Horizontal instruments
·GDPR
The proposed measures in Options B to E build on the GDPR. At the moment, various grounds for processing set out in the GDPR are invoked by service providers to carry out the processing of personal data inherent in voluntary detection and reporting of CSA online. Options B to E would specify the conditions for voluntary and, where applicable, mandatory detection, providing greater legal certainty for those activities.
Insofar as mandatory detection activities involving processing of personal data are concerned, options C to E would build on the GDPR’s Article 6(1)(c), which provides a legal basis for the processing of personal data to comply with a legal obligation.
·ePrivacy Directive and its proposed revision
The proposed measures in Options B to E would include service providers that offer interpersonal electronic communications services and hence are subject to the provisions of the ePrivacy Directive and its proposed revision currently in negotiations. These measures presuppose the need for a derogation from the relevant provisions of that Directive (akin to the Interim Regulation already in force, but then without limit in time and covering, where relevant, also mandatory detection) and would provide specific conditions for the processing of certain types of data otherwise subject to the ePrivacy framework.
·e-Commerce Directive
The e-Commerce Directive prohibits Member States from imposing general monitoring obligations and from actively seeking facts or circumstances indicating illegal activity. The DSA proposal confirms and restates this principle. The legislative proposal will include the necessary elements (including on objectives pursued, type of material, scope and nature of obligation, risk-based approach, limitation in time, assistance, safeguard and supervision) to ensure respect for the appropriate balancing of fundamental rights enshrined in this principle.
·The proposed Digital Services Act
Options B to E would build on the DSA’s horizontal framework, setting out a more specific framework where needed for the particular case of combating CSA online, akin to sectoral legislation such as the Terrorist Content Online Regulation, relying on the baseline provided by the DSA where possible. As regards the prohibition of general monitoring and active fact-finding obligations (which is also provided for in the DSA proposal), see the above point on the eCommerce Directive.
·Victims’ Rights Directive
Options A to E would strengthen – to an increasing extent – support to victims, in coherence with the Victims’ Rights Directive as a horizontal instrument to improve victims’ access to their rights. Options B to E would establish an EU Centre that would carry out, in addition to its principal tasks, certain tasks relating to prevention and assistance to victims, and would thus ensure greater facilitation of the cooperation with Member States and exchange of best practices, with regards to CSA victims. These options would also include measures to enhance the practical implementation of victims’ rights to stop images and videos related to their abuse from circulating and hence give fuller impact to these rights.
Sector-specific legislation
·CSA Directive
The CSA Directive is a criminal law instrument, which none of the policy options considered would contradict. In fact, strengthening prevention, detection, reporting and victim support should positively influence the implementation of the Directive and cooperation between Member States.
·Interim Regulation
Option A would contribute through non-legislative measures to the voluntary efforts by online service providers under the Interim Regulation. Once the Interim Regulation expires on 3 August 2024, there would not be another legal instrument to replace it under this option.
Options B to E specify the conditions for voluntary detection, reporting and removal of CSA online and options C to E define obligations to detect CSA online. These options would provide a long-term regulatory framework that would build on the Interim Regulation (including its safeguards) and replace it.
·Europol Regulation and its proposed revision
Under options B to E, the EU Centre would be the recipient of the reports made by service providers, would review them and, where appropriate, forward them to Europol for action. The processing and follow-up of these reports by Europol would be governed by the Europol Regulation and, subsequently, by its proposed revision. This proposed revision could strengthen the fight against CSA, e.g. by effectively supporting Member States and their investigations with the analysis of large and complex datasets, addressing the big data challenge for law enforcement authorities. The Centre would contribute to ensuring that the data that Europol receives from service providers is actionable and usable for law enforcement authorities.
b) Coordination
·EU level cooperation in investigations, prevention and assistance to victims
Option A would facilitate to a limited extent cooperation in investigations, prevention and assistance to victims. This cooperation would be greater in the case of options B to E, thanks to the Centre, whose main purpose is to serve as a facilitator of efforts, including through increased cooperation in those three areas.
·Multi-stakeholder cooperation at EU and global level
Likewise, the Centre in options B to E would also facilitate multi-stakeholder cooperation at EU and global level, in particular by facilitating the exchange of best practices on prevention and assistance to victims.
Under options C to E, the obligations to detect CSA online would likely entail an increase in the number of reports in other jurisdictions, in particular the US. While these obligations would apply only to services offered in the EU, the cross-border nature of these crimes means that a significant number of reports will relate to activities which involve third countries (for example, a report of grooming where the suspect and victim are located in different jurisdictions). In addition, while technology to detect known CSAM is widely used by many providers, technologies for the detection of new CSAM and grooming are less widely deployed. It is expected that obligations to use such technologies in the EU could lead to increased voluntary use of the same technologies in relation to third countries, particularly as their distribution would be facilitated by the Centre to the relevant service providers offering their services in the EU (without imposing restrictions on use outside of the EU). The amount of CSAM detected globally would increase, and with it the possibilities to stop its circulation and prevent future abuses globally. The number of cross-border investigations and opportunities to cooperate internationally, within the EU and globally, would increase.
Box 21: risk of duplication of reporting to the EU Centre and NCMEC
Mandatory reporting of CSA online to the EU Centre could lead to duplicating obligations for US service providers to make reports both in the EU and in the US. Some stakeholders have suggested that, in order to avoid duplication of reporting, any obligation to report to an EU organisation should include an exemption for providers that already report to NCMEC. This exemption would have several negative consequences, notably:
·delays in European law enforcement authorities receiving the reports, due to exclusive reporting to NCMEC, and the loss of the ability to ‘de-conflict’ reports by discovering reports having the same or similar content through cross-referencing the reports received by NCMEC, the EU Centre and Europol;
·unequal conditions and safeguards relating to the reporting obligations, since those existing under US law and those to be established under the present initiative would differ; and
·the processing of large volumes of EU user data outside the EU, by an entity not bound by EU law.
Such an exemption would therefore have a negative impact on the protection of fundamental rights, another specific objective of the initiative, and potentially lead to negative effects on international relations. Where possible within the limits set by the applicable legislation, the implementation of technical solutions to report could help ensure that there is no confusion or unnecessary duplication of reports received by law enforcement agencies in the EU (e.g. by simply adding a tag in the report indicating whether it has been sent to the US or the EU).
In any event, the obligations under EU law would remain limited to the relevant services offered in the EU. Therefore, those obligations would not extend to services offered elsewhere.
c) Funding
The Centre under options B to E would serve as a facilitator of efforts, possibly including through signposting funding opportunities at EU and national level and maintaining an overview of past projects, to avoid duplication of efforts and ensure the most effective use of funds. The Centre would also facilitate research on prevention and assistance to victims, possibly by managing its own research funding.
7.1.6. Proportionality
The five options follow the same principle of proportionality and necessity of an intervention at EU level: a fragmented approach across Member States is unable to ensure an appropriate level of protection for children across the Union, or the protection of the fundamental rights of all online users. Whereas the level of effectiveness of the options differs, as they contain different measures and impose different obligations, all are proportionate, as none goes beyond what is a necessary intervention at EU level to achieve the specific objectives. In addition, the conditions of application and safeguards for each option are conceived to match its level of intrusiveness.
7.2. Quantitative comparison
7.2.1. Overall costs
For the purpose of comparing the options and calculating overall costs, the total combined cost (not discounted) to service providers and public authorities over a period of 10 years (2021-2030) was considered. The cost over this period was obtained by combining the one-off costs of the relevant policy measures with the sum of the annual costs for ten years. These include all costs directly arising from the measures as described in Annex 4, section 3, such as costs for the establishment of the Centre, implementation of technical measures for detection and reporting of CSA online, development of tools, processing of reports, etc.
The one-off and annual costs associated with each policy option are set out in detail in Annex 4, section 4.
Over 10 years, the total of costs per option is the following:
Table 7: comparative costs of the policy options over 10 years (EUR billions)
Policy option | A | B | C | D | E
Total costs (EUR billions) | 0.17 | 0.71 | 10.65 | 18.92 | 24.49
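The totals in Table 7 follow directly from the Table 4 figures: for each option, the combined one-off costs are added to ten times the combined annual costs of public authorities and service providers. A short check for option E:

```python
# Reproducing the Table 7 total for option E from the Table 4 figures (EUR millions).
one_off = 5.4 + 1_595.3             # public authorities + service providers
annual = 825.6 + 1_463.3            # continuous (annual) costs, both actors
total_10y_eur_bn = (one_off + 10 * annual) / 1_000
print(round(total_10y_eur_bn, 2))   # ~24.49, matching Table 7
```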
7.2.2. Overall benefits
The table below compares the estimated costs and benefits for the different options over ten years:
Table 8: comparative quantitative assessment of the policy options over 10 years (EUR billions)
Policy option | A | B | C | D | E
Overall costs | 0.17 | 0.71 | 10.65 | 18.92 | 24.49
Overall benefits | 0.97 | 2.24 | 28.00 | 33.87 | 34.48
Total (net benefits) | 0.81 | 1.52 | 17.35 | 14.95 | 9.99
The overall benefits (not discounted) assume a decrease of 25% in the total CSA costs per year. Annex 4 contains a sensitivity analysis on the percentage decrease in total CSA costs to determine the minimum values at which each of the options would produce net quantitative benefits. Table 9 summarises these results:
Table 9: minimum % decrease in total annual CSA costs to generate net benefits in each policy option
A | 0,13%
B | 0,6%
C | 8%
D | 14%
E | 18%
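Under the same linear model, the net benefits in Table 8 are ten years of benefits minus the ten-year costs, and the Table 9 break-even points are the annual cost reductions at which the two are equal. The sketch below reproduces the orders of magnitude, assuming benefits scale linearly with the assumed reduction in total annual CSA costs of EUR 13.8 billion.

```python
# Break-even check for Tables 8 and 9 (EUR billions, not discounted).
TOTAL_ANNUAL_CSA_COST_BN = 13.8
COSTS_10Y    = {"A": 0.17, "B": 0.71, "C": 10.65, "D": 18.92, "E": 24.49}
BENEFITS_10Y = {"A": 0.97, "B": 2.24, "C": 28.00, "D": 33.87, "E": 34.48}

for option in COSTS_10Y:
    net = BENEFITS_10Y[option] - COSTS_10Y[option]                     # Table 8, last row
    break_even = COSTS_10Y[option] / (10 * TOTAL_ANNUAL_CSA_COST_BN)   # Table 9
    print(option, round(net, 2), f"{break_even:.1%}")
# e.g. option E: net benefit of ~10 billion; break-even at ~17.7% (Table 9 rounds to 18%)
```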
8.Preferred option
On the basis of the assessment, the preferred option is E, which notably includes:
·the creation of the EU Centre in the form of a decentralised EU agency;
·mandatory detection of known and new CSAM and grooming, based on detection orders;
·an obligation to report possible CSA online to the EU Centre; and
·an obligation to remove CSA online, once confirmed as illegal.
The preferred option is the one that most effectively addresses the problem drivers as well as the associated costs and impacts in other areas such as fundamental rights, and achieves the objectives of the initiative. While some of the other options are more economically convenient, the degree to which they would be less effective outweighs the financial savings. However, it should be noted that the report aims to make a recommendation for the preferred option, and the final policy choice is left to the political decision maker.
The annual estimated costs of Option E, based upon the analysis in Section 6.2.1, are summarised in Table 10, below. As noted in that section, the costs were estimated primarily for the purposes of comparing the policy options. The estimates provide an idea of the order of magnitude of costs and benefits and therefore should not be taken as exact forecasts.
Table 10: annual costs of the preferred option E (EUR millions)
Policy measure | One-off costs: public authorities | One-off costs: service providers | Continuous (annual) costs: public authorities | Continuous (annual) costs: service providers
1 | €0,4 | €0,2 | €3,5 | €2,8
3 | €5,0 | €0,0 | €25,7 | €0,0
4 | €0,0 | €0,0 | €11,1 | €6,9
5 | €0,0 | €20,4 | €3,3 | €1,7
6 | €0,0 | €352,2 | €503,6 | €459,4
7 | €0,0 | €604,4 | €250,1 | €520,5
8 | €0,0 | €618,0 | €28,2 | €471,9
Total | €5,4 | €1.595,3 | €825,6 | €1.463,3
8.1. Main advantages
Effectively achieves the general and specific objectives: Option E would bring strong improvements in identification, protection and support of victims of child sexual abuse, would ensure effective prevention and would facilitate investigations. In particular:
·The Centre would facilitate and support coordination of efforts of all relevant actors, which would in turn reduce the proliferation and effects of CSA. This includes carrying out certain tasks entailing support for victims, who could rely on the Centre to assist them in requesting removal of known CSAM depicting them.
→ The Centre would help boost efforts (and their effectiveness) in the overall fight against child sexual abuse in the EU, focusing on CSA online but leading in that manner also to concrete results offline.
·The legislative provisions, in particular the obligations to detect known and new CSAM and grooming, combined with the support of the Centre on detection, reporting and removal efforts, would ensure the effective detection, removal and reporting of online CSA where they are currently missing.
·The safeguards to be included in the legislation, combined with the Centre’s support to help ensure transparency and accountability in the detection, reporting and removal by online service providers, would improve overall legal certainty, protection of fundamental rights, transparency and accountability.
→ The Centre is a fundamental component of the legislation. It serves as a key safeguard in the detection, reporting and removal process.
·The establishment of clear and uniform legal requirements at EU level, to the exclusion of diverging national rules on the issues covered, would improve the functioning of the internal market to the benefit of both providers and users. The present initiative will join other sector-specific initiatives like the terrorist content online regulation and the Copyright directives in providing more specific and stricter rules to address certain types of illegal content and activities.
Respects subsidiarity and proportionality
Subsidiarity: option E offers the highest added value of EU action described in section 3.3. In particular, it reduces legal fragmentation through EU-level legislation and, through the Centre, it facilitates Member States’ action, enables the exchange of best practices, reduces dependence on third countries and increases cooperation with them.
Proportionality: option E does not go beyond what is necessary to achieve the general and specific objectives identified for EU intervention. In particular, the necessary measures would be taken to ensure respect for the fair balance principle underlying the prohibition to impose general monitoring or active fact-finding obligations. Also, the legislation in this option would have the legitimate purpose of more effectively tackling CSA online, including better protection of victims through more effective detection, reporting and removal, with the necessary limits and safeguards to ensure a fair balance and proportionality.
Protects fundamental rights: All options have to strike a fair balance between different fundamental rights. Of the available options, option E protects fundamental rights to human dignity and to the integrity of the person, the prohibition of inhuman or degrading treatment, and the rights of the child, among others, by boosting efforts to better prevent and protect children from sexual abuse and better support victims. In addition, option E also limits the impact on fundamental rights of users of the online services concerned, notably to the respect for private and family life, protection of personal data, and freedom of expression, among others, to the strictly necessary minimum, through the necessary limits and safeguards in the legislation, including the functions of the Centre. These conditions also ensure increasing standards over time as technology evolves, by ensuring that tools correspond to the state of the art. In particular, given the importance of the objective and the interference with the rights of users inherent in proactive detection, the decision on the limits and safeguards to detect CSA should be the legislator’s, not the service provider’s.
8.2. Main disadvantages
Implies more extensive implementation efforts and higher costs: the implementation efforts of the legislation imposing such obligations on service providers, and setting up the Centre, would likely require more time and effort and hence be more expensive than a less comprehensive instrument. The establishment of the Centre as a decentralised EU agency requires higher initial and running costs than if the Centre were established as part of an existing entity. Service providers will incur costs to comply with the legislation. Public authorities will also incur increased costs, notably to deal with the likely increase in child sexual abuse cases detected.
8.3. Trade-Offs
Better detection, reporting, prevention and victims’ assistance imply new efforts and costs
To achieve the general objective, the initiative proposes a new legislative framework for online service providers, which includes the creation of a Centre to facilitate existing and new efforts. Whereas the proposal would seek to minimise disruption, building as much as possible on ongoing efforts, it is clear that additional human, technical, and financial efforts are required to improve prevention, support of victims, and the detection, reporting and removal mechanisms. The new efforts will likely lead to an increase of detected cases, at least in the near future, before prevention efforts decrease the prevalence of the crimes.
Although option C would have the highest net economic benefit, the overall benefits of option C are still expected to be significantly lower than under option E. In addition, as set out in the qualitative comparison in section 7.1, option E appears to be the best option in terms of overall qualitative scores, driven by its higher effectiveness. Specifically, the detection of grooming included in option E adds a significant prevention dimension, which accounts for its highest effectiveness score among the options. Child sexual abuse material depicts scenes of crimes already committed and, whereas its detection contains an important prevention aspect as described in box 1, the detection of grooming focuses on preventing crimes such as hands-on abuse or sexual extortion before they take place. This avoids short-term and long-term consequences for victims which cannot all be quantified numerically.
Improved detection and reporting imply a comprehensive set of conditions and safeguards
Mandatory detection of known and new CSAM and grooming has an impact on fundamental rights of all users, in particular considering that online service providers would be processing personal data, in both public and non-public (interpersonal) communications. This is a sensitive issue that requires appropriate consideration to ensure that the conditions and safeguards put in place protect the fundamental rights of all users. Likewise, the relationship with other acts of EU law (especially e-Commerce Directive/DSA and the EU data protection acquis) is a point of particular attention. This will likely require substantial time to prepare (until the legislative proposal becomes EU law) and implement.
8.4. Application of the ‘one in, one out’ approach
The ‘one in, one out’ approach refers to the principle whereby each legislative proposal creating new burdens should relieve people and businesses of an equivalent existing burden at EU level in the same policy area.
The preferred option for this initiative entails direct adjustment costs for businesses (service providers) and administrations. These are costs of complying with and adjusting their operating processes to the requirements of the proposed legislation. Examples of adjustment costs for service providers include the human and technical resources to comply with the obligations to detect, report and remove CSA online. The preferred option will also generate direct adjustment costs for administrations (notably law enforcement), due to the increased workload to deal with the increase of CSA reports.
The preferred option also creates administrative costs for service providers and administrations. These are costs that result from administrative activities performed to comply with the administrative obligations included in the proposed legislation. They concern costs for providing information, notably on the preparation of annual transparency reports.
On the other hand, the proposed legislation will replace one existing legislative instrument: the Interim Regulation. This would generate savings on administrative costs for service providers and public authorities. See Annexes 3 and 4 for additional details.
Furthermore, the initiative is expected to generate significant cost savings to society, derived from a reduction in CSA crimes (e.g. reduction in productivity losses, see section 6.2.2).
Also, the EU Centre will facilitate action of Member States and service providers in preventing and combating CSA, and support victims. This will generate cost savings, by, e.g. helping avoid duplication of efforts and facilitating a more effective and efficient use of resources.
9.How will actual impacts be monitored and evaluated?
The actual impacts of the preferred option, i.e. the actual progress in the fight against child sexual abuse offline and online, will be monitored and evaluated against the three specific objectives. The indicators would build on those of the Interim Regulation to minimise disruption and costs.
The specific objectives aim, in essence, to improve what is being done (specific objectives 1 and 3) and how it is being done (specific objective 2). The specific objectives have corresponding operational objectives, which would be monitored through indicators drawing on various data sources, with different actors responsible for collecting and sharing them.
General objective
|
Specific objectives
|
Operational objectives
|
Indicators - data sources
|
Who is responsible for collection - output
|
Improve the functioning of the Internal Market by introducing clear, uniform and balanced EU rules to prevent and combat child sexual abuse
|
Improve the “what”:
1.Ensure the effective detection, removal and reporting of online child sexual abuse where they are currently missing
3.Reduce the proliferation and effects of child sexual abuse through harmonisation of rules and increased coordination of efforts
|
Prevention:
·reduce CSA prevalence
·reduce duplication and blind spots of Member States’ efforts
Assistance to victims:
·provide the required assistance
·reduce duplication and blind spots of Member States’ efforts
Detection and reporting:
·detect, report and remove all CSAM, known and new, distributed online
·increase detection and reporting of grooming
|
Prevention:
·prevalence rate in Member States - surveys
·number, type and evaluation results (including best practices and lessons learned) of prevention programmes - public authorities in Member States
Assistance to victims:
·number of victims assisted and level of satisfaction of victims with the assistance provided - surveys to survivors
·number, type and evaluation results (including best practices and lessons learned) of victims assistance programmes - public authorities in Member States
Detection and reporting:
·number of reports by Member State, source (company, hotline, public), type of online service, and type of CSA online (i.e. number of images and videos, including unique/not unique and known/new, and grooming) – EU Centre
·feedback on reports: if no action taken why, if action taken outcome (number of victims identified/rescued, number of offenders convicted, and (anonymised and short) description of the case) – public authorities in Member States
|
EU Centre – annual report to the public and the Commission (extended version)
|
Commission
-implementation report every 5 years
-evaluation every 5 years,
using as sources the annual reports from the EU Centre and from providers, among others
|
|
Improve the “how”:
2.Improve legal certainty, transparency and accountability and ensure protection of fundamental rights
|
·Make clear all relevant aspects of the detection, reporting and removal process by online service providers
|
·technologies used, including error rates, measures to limit the error rates, and, if the technologies are new, measures taken to comply with written advice of competent authorities – service providers
|
Service providers – annual report to supervisory authorities, the EU Centre and the Commission
|
|
Table 11: monitoring of general, specific and operational objectives
Annexes
Annex 1: Procedural information
Annex 2: Stakeholder consultation
Annex 3: Who is affected and how?
Annex 4: Analytical methods
Annex 5: Relevant legislation and policies
Annex 6: Additional information on the problem
Annex 7: Sample cases of child sexual abuse online in the EU
Annex 8: Technologies to detect child sexual abuse online
Annex 9: Encryption and the fight against child sexual abuse
Annex 10: EU Centre to Prevent and Counter Child Sexual Abuse
Annex 11: SME Test
Annex 1: Procedural information
·Lead DG, Decide Planning/CWP references
This Staff Working Document was prepared by the Directorate-General for Migration and Home Affairs (HOME).
The Decide reference of this initiative is PLAN/2020/8915.
This initiative appears in the 2021 Commission Work Programme under action 35, ‘Follow-up to the EU security strategy': Legislation to effectively tackle child sexual abuse online (legislative, incl. impact assessment, Article 114 TFEU, Q2 2021).
·Organisation and timing
Organisation
The Security Union Inter-Service Group (ISG), chaired by the Secretary-General of the Commission, was consulted at all stages of the process to prepare the impact assessment, including the inception impact assessment, consultation strategy, questionnaire for the public consultation and the various drafts of the impact assessment.
The ISG included the following Commission services: DG EMPL (DG Employment, Social Affairs and Inclusion), DG GROW (DG Internal Market, Industry, Entrepreneurship and SME), DG RTD (DG Research and Innovation), SJ (Legal Service), DG SANTE (DG for Health and Food Safety), DG TRADE, DG CNECT (DG Communications Networks, Content and Technology); DG EAC (DG Education and Culture); DG JUST (DG Justice and Consumers); DG NEAR (DG Neighbourhood and Enlargement Negotiations); ESTAT (Eurostat); DG DEFIS (DG Defence Industry and Space); DIGIT (Informatics); DG ECHO (DG Humanitarian Aid and Civil Protection); DG ENER (DG Energy); DG ENV (DG Environment); DG FISMA (DG Financial Stability, Financial Services and Capital Markets Union); FPI (Service for Foreign Policy Instruments); IDEA (Inspire, Debate, Engage and Accelerate Action); JRC (Joint Research Centre); DG MARE (DG Maritime Affairs and Fisheries); DG MOVE (Mobility and Transport); DG TAXUD (Taxation and Customs Union); DG REFORM (DG Structural Reform Support); OLAF (European Anti-Fraud Office); DG INTPA (DG International Partnerships); CERT-EU (Computer Emergency Response Team for the EU Institutions, bodies and agencies); DG BUDG (DG Budget) and DG REGIO (DG Regional Policy). It also included the EEAS (European External Action Service).
The last meeting of the ISG, chaired by the Secretariat-General, was held on 17 January 2022.
Timing - chronology of the IA
This initiative was first announced in the July 2020 EU strategy for a more effective fight against child sexual abuse, where the Commission notably committed to:
·propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect child sexual abuse on their services and to report any such abuse to relevant public authorities; and
·work towards the possible creation of a European centre to prevent and counter child sexual abuse to enable a comprehensive and effective EU response against child sexual abuse online and offline, based on a thorough study and impact assessment.
The strategy also announced the proposal for the necessary legislation to ensure that providers of electronic communications services could continue their current voluntary practices to detect child sexual abuse in their systems after December 2020. The Commission proposed this legislation (“the Interim Regulation”) in September 2020; on 29 April 2021 the European Parliament and the Council reached a political agreement on the text, which was then adopted by the two institutions in July 2021.
The present initiative, once adopted, would, among other things, replace this Interim Regulation.
The Commission published an inception impact assessment on 3 December 2020. The feedback period ran until 30 December 2020. A public consultation was launched on 11 February 2021, and stakeholders and citizens had the opportunity to express their views through an online questionnaire until 15 April 2021.
While work on various aspects of the measures considered has been going on for several years, the drafting of the impact assessment itself started in October 2020 and continued until February 2022, after incorporating the feedback from the Regulatory Scrutiny Board.
·Consultation of the Regulatory Scrutiny Board
The Regulatory Scrutiny Board received the draft version of the present impact assessment report on 25 May 2021. It issued an impact assessment quality checklist on 11 June 2021.
The Regulatory Scrutiny Board issued a first negative opinion on 17 June 2021 on the draft impact assessment report. To address the feedback given by the Regulatory Scrutiny Board, the following changes were made in the report and its annexes:
Board’s comments
|
How they were incorporated in the report and annexes
|
1.The internal market dimension and the necessity for EU action in the area of prevention and victim support is not always clear
|
Changes were made throughout the report, in particular in sections 1, 2 and 3, to highlight that the central focus of the legislation is to harmonise rules for online service providers
|
2.The report does not fully describe all the available policy choices and leaves a number of questions open. It does not discuss in a transparent and balanced manner the alternative implementation forms for a European centre
|
Addition of a dedicated section (5.2.2.1) discussing the implementation choices for the EU centre.
|
3.The report does not clearly establish how safeguards will ensure fundamental rights, in particular regarding technologies to detect CSA in encrypted communications
|
Section 5 in particular was reviewed to detail the safeguards that could apply (see description of options). Section 6 was updated accordingly, including the analysis on fundamental rights.
|
4.The comparison of policy options does not comply with the standard assessment criteria and is not based on a clear and consistent ranking methodology
|
Section 7 was reviewed to notably include coherence as a comparison criterion, and a revised ranking methodology.
|
The Regulatory Scrutiny Board issued a second and final positive opinion on 17 June 2021 on the draft impact assessment report. To address the feedback given by the Regulatory Scrutiny Board, the following changes were made in the report and its annexes:
Board’s comments
|
How they were incorporated in the report and annexes
|
1.The role of the EU centre and associated costs are not sufficiently described. The implementation options for the EU centre are not presented in a sufficiently open, complete and balanced manner
|
Additional descriptions of the role of the Centre on prevention and assistance to victims added to Section 5.2.1. Additional clarifications on the role of the Centre added in sections 5.2.2., 5.2.3., 5.2.4., and 5.2.5.
Section 5.2.2. was restructured to present and analyse the options in an open, complete and balanced manner.
|
2.The report is not sufficiently clear on how the options that include the detection of new child sexual abuse material or grooming would respect the prohibition of general monitoring obligations
|
Further clarifications added in sections 5.2. and 5.2.3.
|
3.The efficiency and proportionality of the preferred option is not sufficiently demonstrated
|
Further clarifications added in section 8.3., in particular in relation to the importance and added value of grooming detection.
|
4.The scope and quantification of the cost and cost savings for the ‘one in, one out’ purposes are not clear
|
Clarifications added in section 8.4., in particular in relation to the costs and savings included in the quantification for one in, one out purposes.
|
·Evidence, sources and quality
When drafting the impact assessment report and its annexes, particular attention was paid to properly referencing all sources and reviewing their quality.
The calculations of costs and benefits were limited by the lack of data. The Commission made significant efforts to collect data, or at least estimates, from public authorities and service providers through targeted surveys. Where this information was not available, assumptions were made in the model to calculate costs, which were discussed with experts from Member States and service providers.
The evidence base includes in particular:
·external studies prepared at the request of the European Commission:
§ICF et al., Study on options for the creation of a European Centre to prevent and counter child sexual abuse, including the use of ICT for creation of a database of hashes of child sexual abuse material and connected data protection issues, 2021.
§ICF et al., Study on framework of best practices to tackle child sexual abuse material online, 2020.
§ICF, Grimaldi, Overview of the legal framework of notice-and-action procedures in Member States, SMART 2016/0039, 2018.
·selective list of relevant case law:
Court of Justice of the European Union:
·Joined Cases C-236/08 to C-238/08, Google France SARL and Google Inc. v Louis Vuitton Malletier SA, ECLI:EU:C:2010:159.
·C-324/09, L’Oréal v eBay, ECLI:EU:C:2011:474.
·C-70/10, Scarlet Extended SA v SABAM, ECLI:EU:C:2011:771.
·C-360/10, SABAM v Netlog NV, ECLI:EU:C:2012:85.
·C-314/12, UPC Telekabel Wien, ECLI:EU:C:2014:192.
·C-484/14, McFadden, ECLI:EU:C:2016:689.
·C-18/18, Glawischnig-Piesczek v Facebook Ireland, ECLI:EU:C:2019:821.
European Court of Human Rights:
·Application no. 2872/02, K.U. v. Finland, judgment of 2 December 2008.
·Application no. 5786/08, Söderman v. Sweden, judgment of 12 November 2013.
·Application no. 24683/14, ROJ TV A/S against Denmark, decision of 24 May 2018.
·Application no. 56867/15, Buturugă against Romania, judgment of 11 February 2020.
Decisions of national courts:
·Antwerp Civil Court, A&M, judgment n. 2010/5-6 of 3 December 2009.
·OLG Karlsruhe, judgment 6 U 2/15 of 14 December 2016.
·Rome Court of Appeal, RTI v TMFT Enterprises LLC, judgment 8437/2016 of 27 April 2016.
·Austrian Supreme Court (Oberster Gerichtshof), decision 6 Ob 178/04a of 21 December 2006.
·Turin Court of First Instance, Delta TV v Google and YouTube, judgment No 1928, RG 38113/2013 of 7 April 2017.
·Selective Bibliography
-Carnegie Endowment for International Peace, Moving the Encryption Policy Conversation Forward, Encryption Working Group, September 2019.
-De Jong, R., Child Sexual Abuse and Family Outcomes, Crime Science, 2 November 2015.
-Di Roia, R., Beslay, L., Fighting child sexual abuse - Prevention policies for offenders, Publications Office of the EU, 3 October 2018.
-Farid, H., Reining in online abuses, Technology and Innovation, Vol. 19, pp. 593-599, 2018.
-Floridi, L., Taddeo, M., The Responsibilities of Online Service Providers, 2017.
-Kuhle, L., et al., Child Sexual Abuse and the Use of Child Sexual Abuse Images, 9 March 2021.
-Letourneau, E., The Economic Burden of Child Sexual Abuse in the United States, Child Abuse & Neglect, Vol. 79, May 2018.
-Madiega, T., Reform of the EU liability regime for online intermediaries. Background on the forthcoming Digital Services Act, European Parliamentary Research Service, PE 649.404, May 2020.
-Martin, E., Silverstone, P., How much child sexual abuse is “below the surface”, and can we help adults identify it early, Front Psychiatry, May 2013.
-Pereda, N., et al., The prevalence of child sexual abuse in community and student samples: A meta-analysis, Clinical Psychology Review, Vol. 29, Issue 4, 2009.
-Rosenzweig, P., The Law and Policy of Client-Side Scanning, Lawfare, 20 August 2020.
-Scherrer, A., Ballegooij, W., Combating sexual abuse of children, Directive 2011/93/EU, European Implementation Assessment, European Parliamentary Research Service, PE 598.614, April 2017.
-Schwemer, S.F., On domain registries and unlawful website content, International Journal of Law and Information Technology, Vol. 26, Issue 4, 12 October 2018.
-Sluijs, J., et al., Cloud Computing in the EU Policy Sphere, 2011.
-Smith, M., Enforcement and cooperation between Member States - E-Commerce and the future Digital Services Act, Study for the IMCO committee, PE 648.780, April 2020.
-Stalla-Bourdillon, S., Internet Intermediaries as Responsible Actors? Why It Is Time to Rethink the E-Commerce Directive as Well, in The Responsibilities of Online Service Providers, 1 July 2016.
-Truyens, M., van Eecke, P., Liability of Domain Name Registries: Don’t Shoot the Messenger, Computer Law & Security Review, Vol. 32, Issue 2, 19 January 2016.
-Urban, J., et al., Notice and Takedown in Everyday Practice, UC Berkeley Public Law Research Paper No. 2755628, 22 March 2017.
-Van Hoboken, J., et al., Hosting intermediary services and illegal content online: An analysis of the scope of Article 14 ECD in light of developments in the online service landscape, final report prepared for the European Commission, Publications Office of the EU, 29 January 2019.
-Wagner, B., Rozgonyi, K., et al., Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act, January 2020.
-Wilman, F., The responsibility of online intermediaries for illegal user content in the EU and in the US, 20 November 2020.
·Related Impact Assessments
Impact Assessment accompanying the Proposal on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, SWD(2020) 348 final, 15 December 2020.
Impact Assessment accompanying the document Regulation of the European Parliament and of the Council amending Regulation (EU) 2016/794, as regards Europol’s cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol’s role on research and innovation, SWD(2020) 543 final, 9 December 2020.
Targeted substitute Impact Assessment on the Commission proposal on the temporary derogation from the e-Privacy Directive for the purpose of fighting online child sexual abuse, European Parliamentary Research Service, PE 662.598, February 2021.
Impact Assessment accompanying the Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online, SWD(2018) 408 final, 12 September 2018.
Impact Assessment accompanying the Proposal for a Regulation of the European Parliament and of the Council on European Production and Preservation Orders for electronic evidence in criminal matters and Proposal for a Directive of the European Parliament and of the Council laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings, SWD(2018) 118 final, 17 April 2018.
Additional external expertise was gathered through the stakeholder consultation, as explained in detail in Annex 2.
Annex 2: Stakeholder consultation
This annex is the synopsis report of all stakeholder consultation activities undertaken in the context of this impact assessment.
1) Consultation strategy
In order to ensure that the general public interest of the EU is properly considered in the Commission's approach to the fight against child sexual abuse, the Commission has consulted as widely as possible. The consultation aimed to enable an evidence-based preparation of the future Commission initiative for a more effective fight against child sexual abuse with the help of stakeholders and had four main objectives:
·to identify current best practice, as well as challenges and gaps, and the relevant needs of all stakeholders;
·to identify ways forward that would best address those needs;
·to ensure that stakeholders (including citizens and those who would be directly affected by this initiative), can provide their views and input on the possible options for the way forward; and
·to improve the overall evidence base underpinning the initiative.
To do this, the Commission services identified relevant stakeholders and consulted them throughout the development of the draft proposal. The Commission services sought views from a wide range of subject-matter experts, service providers, business associations, national authorities, civil society organisations and members of the public on their expectations and concerns relating to child sexual abuse and possible initiatives to prevent and combat it. These concerned in particular the responsibilities of relevant online service providers, possible requirements to detect child sexual abuse online and report it to public authorities, and the possible creation of a European centre to prevent and counter child sexual abuse.
During the consultation process, the Commission services applied a variety of methods and forms of consultation. They included:
·the consultation on the Inception Impact Assessment and the Open Public Consultation, which sought views from all interested parties;
·targeted stakeholder consultation by way of dedicated questionnaires;
·a series of workshops, conferences, expert groups, as well as bilateral meetings;
·inviting position papers and analytical papers from organizations, industry representatives, civil society and academia.
Taking into account the technicalities and specificities of the subject, the Commission services focused on targeted consultations, addressing a broad range of stakeholders at national and EU level.
2) The consultation was structured as follows:
1. Who – stakeholders consulted:
·citizens;
·service providers:
·individual companies;
·professional and business associations;
·public authorities from Member States and relevant non-EU countries:
·Ministry of Justice officials;
·Ministry of Interior officials;
·law enforcement representatives;
·legal practitioners (lawyers, prosecutors, judges);
·non-governmental organisations (NGOs);
·inter-governmental organisations (IGOs);
·EU institutions and agencies; and
·academia.
2. How – methods and tools used:
Surveys:
·Open public consultations:
oSurvey, open to feedback from any interested party, from 11 February 2021 to 15 April 2021; it included a link to the Commission website on the fight against child sexual abuse to provide further information and context.
oConsultation on the Inception Impact Assessment, open to feedback from any interested party from 2 December to 30 December 2020.
·Targeted surveys:
oSurvey for law enforcement authorities in Member States to collect information regarding the origin, quality and use of reports of child sexual abuse online that law enforcement authorities receive.
oSurvey for law enforcement authorities in Member States to collect information regarding the costs associated with reports of child sexual abuse online received by law enforcement authorities (LEAs); how the quality of reports can be improved; and the impact of encryption on investigations.
Meetings:
·Expert group meetings and bilateral meetings organised by the Commission;
·Participation in conferences and workshops organised by third parties.
In total, the dedicated consultation activities lasted two years, from February 2020 to January 2022.
The consultation was designed to follow the same logical sequence as the impact assessment, starting with the problem definition and allowing for a gradual development of the possible options and scenarios and their impacts, while gradually increasing the number of stakeholders involved.
3. What – the consultation gathered feedback on the problem definition, options and impacts of these options, focused on the legislation to tackle child sexual abuse online effectively and the possible creation of a European centre to prevent and counter child sexual abuse. The diversity of perspectives proved valuable in supporting the Commission to ensure that its proposal addresses the needs, and takes account of the concerns, of a wide range of stakeholders. Moreover, it allowed the Commission to gather necessary and indispensable data, facts and views, on the relevance, effectiveness, efficiency, coherence and EU added value of the proposal. Taking into consideration the Covid-19 pandemic and the related restrictions and inability to interact with relevant stakeholders in physical settings, the consultation activities focused on applicable alternatives such as online surveys as well as meetings via video conference. The table below summarises the structure of the consultation:
Table 1: consultation strategy for a more effective fight against child sexual abuse
| WHO \ HOW | Surveys: Open public consultation | Surveys: Targeted survey 1 | Surveys: Targeted survey 2 | Meetings: Group | Meetings: Bilateral | Conferences |
| Citizens | ✓ |  |  |  |  | ✓ |
| Service providers | ✓ |  |  | ✓ | ✓ | ✓ |
| Public authorities | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| Practitioners | ✓ |  |  | ✓ | ✓ | ✓ |
| NGOs | ✓ |  |  | ✓ | ✓ | ✓ |
| IGOs | ✓ |  |  | ✓ | ✓ | ✓ |
| EU institutions and agencies | ✓ |  |  | ✓ | ✓ | ✓ |
| Academia | ✓ |  |  |  |  | ✓ |
| WHAT | Problem definition, options and impacts | Origin, quality and use of reports | Costs and quality of reports | Problem definition, options and impacts | Problem definition, options and impacts | Problem definition, options and impacts |
1.Consultation activities - summary of results
The following sections present a summary of the main results of the consultation activities.
Open public consultation
The purpose of the open public consultation was to gather evidence from citizens and stakeholders; it was part of the data collection activities announced in the related inception impact assessment in December 2020.
In total, 603 responses were submitted by a diverse group of stakeholders. The consultation was addressed to a broad range of interested stakeholders, including public authorities, EU institutions and agencies, international organisations, private companies, professional and business associations, NGOs, academics and the general public.
Most feedback was received from citizens (77.93% from EU citizens, 1.84% from non-EU citizens), followed by NGOs (10.37%), public authorities (3.51%) and companies/business organisations (2.68%), then others (1.84%), business associations (0.84%), academic/research institutions (0.67%) and consumer organisations (0.33%). Additionally, around 45 position papers were received in the context of the open public consultation.
In terms of geographical distribution, most of the respondents are located in the EU, with a majority of contributions coming from Germany (45.15%), Ireland (16.22%), Belgium (4.18%) and Italy (4.18%). Internationally, the highest share of respondents that participated were from the UK (1.84%) and the US (2.51%).
Summary
Its results, as far as current practices and identified gaps, legislative solutions and the possible creation of a European centre to prevent and counter child sexual abuse are concerned, can be summarised as follows:
·The public consultation revealed broad support for EU action (among all categories of respondents).
·More specifically, it revealed strong support for legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations), for future-proof legislation, for effective cooperation between stakeholders, and for additional coordination and support at EU level in the fight against child sexual abuse online and offline.
What is the current situation and where are the gaps
·54.01% of the respondents state that the new legislation should aim to enable a swift takedown of child sexual abuse material after reporting.
·The new legislation should further aim to reduce the number of instances of online grooming of children, based on the feedback provided by 49.67%.
·The areas of prevention and assistance to victims of child sexual abuse should be tackled in priority according to 61.54% and 65.05% of respondents, respectively.
·Law enforcement respondents reflected on the main challenges they face in their work investigating child sexual abuse cases.
·85.71% raised concerns regarding the increased volume of child sexual abuse material in the last decade and the lack of (human and technical) resources. This was followed by concerns about the underreporting of child sexual abuse cases and difficulties accessing evidence during investigations linked to the introduction of end-to-end encryption (38.1% and 47.62%, respectively). 14.29% referred to gaps in national and/or EU laws as one of the main issues.
·NGOs cooperate with law enforcement authorities in the fight against child sexual abuse, including by forwarding reports of child sexual abuse online received from the public or from service providers. 74.19% of the respondents see a need for improvement in the cooperation.
·NGOs also cooperate with service providers. Among other things, NGOs advise them on policies to fight child sexual abuse online and send notice-and-takedown requests to service providers. However, based on 72.58% of the replies, there is still room for improvement.
·9.68% of the NGO respondents consider that current efforts to tackle child sexual abuse online strike an appropriate balance between the rights of victims and the rights of all users (e.g. privacy of communications), while 56.45% considered that the current efforts put too much emphasis on the rights of all users and not enough emphasis on victims’ rights.
Legislative solution: what should it include to tackle the above gaps effectively
·If online service providers were to be subject to a legal obligation to detect, remove and report child sexual abuse online in their services, most respondents to the public consultation agreed that providers of social media (33.11%), image hosting (29.10%), web hosting (25.75%), message boards (23.75%), video streaming (23.58%) and online gaming (21.40%) services should be subject to such an obligation.
·In addition, if legislation were to explicitly allow online service providers to take voluntary measures to detect, remove and report child sexual abuse online in their services, providers of the following services should be included: social media (38.96%), image hosting (35.79%), video streaming (30.43%), message boards (29.10%), online gaming (26.76%).
·The respondents further reflected on the types of child sexual abuse online that the possible legislation should cover as well as on the best possible ways to achieve that as follows:
| Which types of child sexual abuse online should the possible legislation cover and how? | Answers | Ratio |
| Known child sexual abuse material (i.e. material previously confirmed as constituting child sexual abuse) |  |  |
| Mandatory detection and removal | 161 | 26.92% |
| Mandatory reporting | 72 | 12.04% |
| Voluntary detection and removal | 85 | 14.21% |
| Voluntary reporting | 45 | 7.53% |
| No need to cover this in the legislation | 161 | 26.92% |
| New (unknown) child sexual abuse material |  |  |
| Mandatory detection and removal | 120 | 20.07% |
| Mandatory reporting | 87 | 14.55% |
| Voluntary detection and removal | 91 | 15.22% |
| Voluntary reporting | 60 | 10.03% |
| No need to cover this in the legislation | 169 | 28.26% |
| Online grooming |  |  |
| Mandatory detection and removal | 107 | 17.89% |
| Mandatory reporting | 107 | 17.89% |
| Voluntary detection and removal | 84 | 14.05% |
| Voluntary reporting | 61 | 10.20% |
| No need to cover this in the legislation | 162 | 27.09% |
| Live-streaming of child sexual abuse |  |  |
| Mandatory detection and removal | 156 | 26.09% |
| Mandatory reporting | 96 | 16.05% |
| Voluntary detection and removal | 77 | 12.88% |
| Voluntary reporting | 46 | 7.69% |
| No need to cover this in the legislation | 150 | 25.08% |
·To be able to detect, remove and report child sexual abuse online, service providers need to carry out a series of actions. The respondents to the public consultation were asked to share their views concerning the proportionality of the following actions, when subject to all necessary safeguards:
| Proportionality of actions subjected to all necessary safeguards | Fully agree | Partially agree | Partially disagree | Disagree |
| To check whether images or videos uploaded online (e.g. to a social media platform, or a file hosting service) are copies of known child sexual abuse material | 30.77% | 16.89% | 8.36% | 32.94% |
| To assess whether images or videos uploaded online (e.g. to a social media platform, or a file hosting service) constitute new (previously unknown) child sexual abuse material | 22.07% | 15.05% | 13.04% | 37.96% |
| To check whether images or videos sent in a private communication are copies of known child sexual abuse material | 14.38% | 6.52% | 6.69% | 60.20% |
| To assess whether the images or videos sent in a private communication constitute new child sexual abuse material | 14.38% | 6.52% | 6.69% | 60.20% |
| To assess whether the images or videos sent in a private communication constitute new child sexual abuse material | 12.21% | 6.86% | 6.02% | 63.38% |
| To assess whether the contents of a text-based communication constitute grooming | 13.04% | 9.70% | 9.03% | 54.85% |
| To assess, based on data other than content data (e.g. metadata), whether the user may be abusing the online service for the purpose of child sexual abuse | 14.55% | 11.54% | 8.86% | 50.33% |
·The actions to detect, remove and report child sexual abuse online may require safeguards to ensure the respect of fundamental rights of all users, prevent abuses, and ensure proportionality. According to the submitted replies, the legislation should put in place safeguards to ensure the following:
| Safeguards to ensure the respect of fundamental rights of all users, prevent abuses, and ensure proportionality | Fully agree | Partially agree | Partially disagree | Disagree |
| The tools used to detect, report and remove child sexual abuse online reduce the error rate to the maximum extent possible | 41.30% | 12.21% | 4.18% | 13.04% |
| The tools used to detect, report and remove child sexual abuse online are the least privacy intrusive | 49.50% | 9.20% | 1.67% | 13.04% |
| The tools used to detect, report and remove child sexual abuse online comply with the data minimisation principle and rely on anonymised data, where this is possible | 48.16% | 8.36% | 2.51% | 12.71% |
| The tools used to detect, report and remove child sexual abuse online comply with the purpose limitation principle, and use the data exclusively for the purpose of detecting, reporting and removing child sexual abuse online | 54.52% | 4.85% | 1.17% | 11.20% |
| The tools used to detect, report and remove child sexual abuse online comply with the storage limitation principle, and delete personal data as soon as the purpose is fulfilled | 51.67% | 7.86% | 1.84% | 10.70% |
| The online service provider conducts a data protection impact assessment and consults the supervisory authority, if necessary | 38.13% | 10.37% | 3.85% | 11.87% |
| Online service providers are subject to the oversight of a supervisory body to assess their compliance with legal requirements | 36.12% | 10.70% | 5.18% | 16.22% |
| Reports containing new material or grooming are systematically subject to human review before the reports are sent to law enforcement or organisations acting in the public interest against child sexual abuse | 38.13% | 13.71% | 6.19% | 11.20% |
| All reports (including those containing only previously known child sexual abuse material) are systematically subject to human review before the reports are sent to law enforcement or organisations acting in the public interest against child sexual abuse | 32.61% | 14.88% | 8.53% | 13.55% |
| A clear complaint mechanism is available to users | 61.37% | 5.69% | 1.00% | 6.19% |
| Effective remedies should be available to users that have been erroneously affected by the actions of the service provider to detect, report and remove child sexual abuse online | 62.37% | 4.68% | 1.00% | 4.85% |
| Providers should make clear in the Terms and Conditions that they are taking measures to detect, report and remove child sexual abuse online | 60.87% | 5.18% | 1.51% | 5.02% |
·In the context of possible future legislation allowing/obliging relevant online service providers to detect, report and remove child sexual abuse online in their services, 39.97% of the respondents believe that companies should be subject to financial sanctions if they fail to meet the legal obligations (including safeguards) related to the detection, reporting and removal of child sexual abuse online, while 27.09% opposed this.
·Concerning criminal sanctions, opinions were almost equally divided between those in favour of such measure (35.96%) and those against (30.43%).
·It is further noted that respondents were evenly split (32.61% each) on whether companies that erroneously detect, remove or report child sexual abuse online in good faith should be exempt from the relevant sanctions.
·41.64% of the respondents participating in the survey stressed that there should be no sanctions for failure to meet the legal obligations (including safeguards) related to the detection, reporting and removal of child sexual abuse online, while 22.57% were in favour of such sanctions.
·Transparency reports could refer to periodic reports by service providers on the measures they take to detect, report and remove child sexual abuse online. These transparency reports should be:
|  | Yes | No |
| Obligatory to ensure transparency and accountability | 46.15% | 17.39% |
| Voluntary: an obligation would incur an additional burden on the online service providers, especially when they are small and medium enterprises | 25.92% | 31.77% |
| Evaluated by an independent entity | 47.99% | 11.37% |
| Standardised, to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online | 50.17% | 11.54% |
In addition, transparency reports should include the following information:
| Transparency reports | Answers | Ratio |
| Number of reports of instances of child sexual abuse online reported by type of service | 290 | 48.49% |
| Number of child sexual abuse material images and videos reported by type of service | 269 | 44.98% |
| Time required to take down child sexual abuse material after it has been flagged to/by the service provider | 265 | 44.31% |
| Types of data processed to detect, report and remove child sexual abuse online | 285 | 47.66% |
| Legal basis for the processing to detect, report and remove child sexual abuse online | 279 | 46.66% |
| Whether data are shared with any third party and on which legal basis | 317 | 53.01% |
| Number of complaints made by users through the available mechanisms and the outcome of those proceedings | 291 | 48.66% |
| Number and ratio of false positives (an online event is mistakenly flagged as child sexual abuse online) of the different technologies used | 319 | 53.34% |
| Measures applied to remove online child sexual abuse material in line with the online service provider’s policy (e.g. number of accounts blocked) | 276 | 46.15% |
| Policies on retention of data processed for the detecting, reporting and removal of child sexual abuse online and data protection safeguards applied | 295 | 49.33% |
·To measure the success of the possible legislation, a series of performance indicators should be monitored. In particular:
oNumber of reports of child sexual abuse online reported by company and type of service (33.78%);
oNumber of child sexual abuse material images and videos reported by company and type of service (32.78%);
oTime required to take down child sexual abuse material after it has been flagged to/by the service provider (34.78%);
oNumber of children identified and rescued as a result of a report, by company and type of service (44.31%);
oNumber of perpetrators investigated and prosecuted as a result of a report, by company and type of service (44.31%);
oNumber of related user complaints as a result of a report, by company and type of service (33.28%).
·Views were particularly divided over (i) the legal obligation of online service providers that offer their services within the EU, even when the providers themselves are located outside the EU, and (ii) the legal obligation of online service providers who offer encrypted services to detect, remove and report child sexual abuse online in their services.
Possible European centre to prevent and counter child sexual abuse
·44.65 % of the respondents see a need for additional coordination and support at EU level in the fight against child sexual abuse online and/or offline to maximize the efficient use of resources and avoid duplication of efforts.
·This could help to address existing challenges related to law enforcement action (up to 30% of the replies), preventive measures (up to 45%) as well as in the field of assistance to victims (up to 41%).
·Concerning relevant functions to support law enforcement action in the fight against child sexual abuse in the EU, survey respondents indicated that the possible Centre could:
oReceive reports in relation to child sexual abuse to ensure the relevance of such reports, determine jurisdiction(s), and forward them to law enforcement for action (45.82%);
oMaintain a single EU database of known child sexual abuse material to facilitate its detection in companies’ systems (39.96%);
oCoordinate and facilitate the takedown of child sexual abuse material identified through hotlines (43.98%);
oMonitor the takedown of child sexual abuse material by different stakeholders (38.96%).
·In order to ensure transparency and accountability regarding actions of service providers to detect, report and remove child sexual abuse online in their services, the EU Centre should:
oEnsure that the tools employed are not misused for purposes other than the fight against child sexual abuse (59.53%);
oEnsure that the tools employed are sufficiently accurate (55.69%);
oEnsure that online service providers implement robust technical and procedural safeguards (44.15%);
oDraft model codes of conduct for service providers’ measures to detect, report and remove child sexual abuse online (37.46%);
oSanction service providers whose measures to detect, report and remove child sexual abuse online, including associated technical and procedural safeguards, do not meet legal requirements (30.6%);
o Receive complaints from users who feel that their content was mistakenly removed by a service provider (50%);
oPublish aggregated statistics regarding the number and types of reports of child sexual abuse online received (46.49%).
·The EU centre would support prevention efforts in the fight against child sexual abuse in the EU:
oSupport Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU (51%);
oServe as a hub for connecting, developing and disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers (54.85%);
oHelp develop state-of-the-art research and knowledge, including better prevention-related data (51.17%);
oProvide input to policy makers at national and EU level on prevention gaps and possible solutions to address them (49%).
·In addition, the respondents reflected on the possible functions of the Centre which would be relevant to support efforts to assist victims of child sexual abuse in the EU:
oSupport implementation of EU law in relation to assistance to child victims of sexual abuse (56.35%);
oSupport the exchange of best practices on protection measures for victims (58.03%);
oCarry out research and serve as a hub of expertise on assistance to victims of child sexual abuse (56.59%);
oSupport evidence-based policy on assistance and support to victims (58.03%);
oSupport victims in removing their images and videos to safeguard their privacy (57.36%);
o Ensure that the perspective of victims is taken into account in policymaking at EU and national level (54.18%).
·With regard to the most appropriate type of organisation for the possible Centre, 34.78% of the respondents would welcome the creation of an EU body. Smaller shares identified public-private partnerships (5.18%) and not-for-profit organisations (20.90%) as the most appropriate types of organisation.
·More than half of the respondents (53.51%) consider that the possible Centre should be funded directly from the Union budget, while others supported mandatory levies on industry (18.73%), voluntary contributions from industry (19.90%) or funding through not-for-profit organisations (22.74%) as the most appropriate types of funding.
Problem description [current gaps and possible outcomes]
The majority of the public survey respondents, all categories included, acknowledged the online grooming of children as the most concerning type of child sexual abuse online which needs to be tackled in priority.
Public authorities
Practitioners from law enforcement and other public authorities stressed that the new legislation should reduce the number of instances of online grooming of children and enable a swift takedown of child sexual abuse material after reporting. The respondents further expect the initiative to reduce the amount of unknown child sexual abuse material distributed in the open web or via messaging applications, as well as the amount of self-generated sexual material by children distributed online. According to 52.38%, the new legislation should aim to ensure that child sexual abuse material stays down (i.e. that it is not redistributed online). In addition, 71.43% of the respondents highlighted the need to improve prevention as one of the main goals of the new legislation. It should further provide legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations), and be future-proof. The new legislation could also serve to improve the transparency and accountability of the measures to fight child sexual abuse online (23.81% of the respondents).
Practitioners furthermore expressed concerns regarding the increased volume of child sexual abuse material detected online in the last decade and the insufficient human and technical resources to deal with it.
Companies
Online grooming is perceived as a challenge that should be tackled as a priority according to 56.25% of the public survey respondents representing companies, who also identified the need to enable swift takedown of child sexual abuse material after reporting. They further stressed that the new legislation should provide legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations) and be future-proof. Improving prevention and assistance to victims of child sexual abuse was also identified as a key concern. 18.75% stressed the need to enable a swift start and development of investigations, while 25% flagged that the legislation should also ensure a victim-centric approach in investigations, taking the best interests of the child as a primary consideration.
Non-governmental organisations
More than half of the respondents from non-governmental organisations stated that the current efforts to tackle child sexual abuse online place too much emphasis on the rights of all users and not enough emphasis on victims’ rights. 4.84% believe that the current efforts do not place enough emphasis on the rights of the users.
In their view, the new legislation should aim to reduce the number of instances of online grooming and to enable a swift takedown of child sexual abuse material after reporting, while ensuring that child sexual abuse material stays down (i.e. that it is not redistributed online) and reducing the amount of new child sexual abuse material uploaded in the open web. It should further provide legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations) and improve transparency and accountability of the measures to fight against child sexual abuse online. Legislation should not overlook the importance of prevention and assistance to victims.
General public
Nearly half of the individuals participating in the survey flagged online grooming of children as the most concerning type of child sexual abuse online, which needed to be tackled as a matter of priority. The distribution of known and new child sexual abuse material by uploading it to the open web (e.g. posting it in social media or other websites, uploading it to image lockers, etc.), and the distribution of new child sexual abuse material via darknets were next on their list.
Among the possible outcomes that the new legislation should aim to achieve, the general public referred to the need to enable swift takedown of child sexual abuse material after reporting and to reduce the number of instances of online grooming of children. The new legislation should further aim to reduce the amount of self-generated sexual material by children distributed online (23.27%). Two thirds of the respondents stated that the new legislation should aim to improve assistance to victims of child sexual abuse, while close to half flagged the need for a victim-centric approach in investigations, taking the best interests of the child as a primary consideration. Prevention efforts should further be improved.
Cooperation between stakeholders
Public authorities referred to the inefficiencies (such as lack of resources) in public-private cooperation between service providers and public authorities as one of the main challenges while investigating child sexual abuse cases. 33.33% of the respondents further expressed concerns regarding the lack of uniform reporting procedures, resulting in variable quality of reports from service providers.
Almost 50% of the civil society organisations taking part in the survey reported that their organisations cooperate with law enforcement authorities by forwarding reports of child sexual abuse online received from the public. 13 out of 62 forward reports from service providers to law enforcement authorities, while some of them provide technology or hash lists for the detection of child sexual abuse online (7 and 4 out of 62, respectively). They also cooperate with service providers in the fight against child sexual abuse online by advising them on policies to fight child sexual abuse online, and by sending notice-and-takedown requests to service providers. However, they saw room for improvement in the area of cooperation in the fight against child sexual abuse, both between civil society organisations and law enforcement authorities and between civil society organisations and service providers.
Legislative solutions
Voluntary measures
More than 75% of public authorities stated that social media, online gaming and video streaming should fall within the scope of legislation on voluntary measures to detect, remove and report child sexual abuse online.
50% of the participants representing companies were in favour of voluntary measures to detect, remove and report child sexual abuse online in social media, instant messaging, text-based chat (other than instant messaging) and message boards, among others. Concerning voluntary detection, removal and reporting of known and new (unknown) material, 25% of the replies to the open public consultation questionnaire suggested that these measures should be covered by the possible legislation. Online grooming and live-streaming of child sexual abuse should also be covered by rules on voluntary measures.
More than 55% of the representatives from non-governmental organisations suggested that social media, online gaming, and web and image hosting providers should be included in legislation which would explicitly allow voluntary detection, removal and reporting of child sexual abuse online. A smaller percentage (6.45%) held that no service provider should be legally enabled to take such voluntary measures. Some respondents called for legislation which would cover not only the voluntary detection and removal of known and new (unknown) child sexual abuse material but also voluntary measures to detect and remove online grooming and live-streaming of child sexual abuse.
Over 50% of the respondents from the general public stated that no service provider should be legally enabled to take voluntary measures to detect, remove and report child sexual abuse. Around 1 in 6 (15%) individuals suggested that the possible legislation should cover the voluntary detection and removal of known and new (unknown) child sexual abuse material, online grooming and live-streaming of child sexual abuse. With regard to voluntary reporting of all types of child sexual abuse online, around 1 in 10 (10%) of the respondents believe that it needs to be covered by the new legislation.
Mandatory detection and removal of known and unknown child sexual abuse material
Law enforcement and other public authorities, non-governmental organisations, academic and research institutions as well as other entities agreed that the new legislation should impose mandatory detection and removal of known and new (unknown) material, online grooming and live-streaming of child sexual abuse. One third of the replies coming from companies suggested the mandatory reporting of different types of child sexual abuse.
Public authorities
The majority of law enforcement and other public authorities considered that social media, online gaming, video streaming, and instant messaging should be subject to obligatory detection, removal and reporting of known child sexual abuse material. More than half of the respondents (57%) thought mandatory detection and removal should also extend to new (unknown) child sexual abuse material and live-streaming.
Companies
While some companies considered that mandatory detection, removal and reporting should encompass known and unknown child sexual abuse material as well as online grooming, a majority disagreed. 31.25% of respondents suggested that no service provider should be subject to a legal obligation to detect, remove and report child sexual abuse online. They were particularly concerned about the costs for small businesses.
Business associations, whose input has to be treated with particular caution given the very small sample size, overall identified a need for legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations). Two of three respondents thought that service providers should not be subject to a legal obligation to detect, remove and report child sexual abuse online. They proposed a more flexible reporting scheme for small and medium-sized enterprises and law enforcement authorities, always with respect to privacy efforts and principles.
Non-governmental organisations
The majority of non-governmental organisations' representatives suggested that online service providers should be subject to a legal obligation to perform those actions in their services, with a particular focus on social media, online gaming and video streaming, among others. On the other hand, 12.9% stressed that no service provider should be subject to such a legal obligation. More than 50% of the respondents gave priority to mandatory detection and removal of known material, while also highlighting the importance of mandatory detection and removal of new (unknown) material and live-streaming of child sexual abuse.
General public
The majority of the individuals participating in the open public consultation argued that no service provider should be subject to such a legal obligation. They also underlined that the legislation should not include the mandatory or voluntary detection, removal and reporting of any of the proposed types of child sexual abuse (known material, unknown material, online grooming, live-streaming).
Service providers located outside the EU
It was acknowledged that new legislation should apply to service providers that offer services within the EU, even when the providers themselves are located outside the EU. The idea was widely accepted by public authorities, companies and civil society organisations. On the other hand, more than 50% of the general public opposed the idea of legislation which would be applicable to service providers that offer services within the EU when the providers themselves are located outside the EU.
Encrypted environments
Opinions are divided on the question of whether online service providers who offer encrypted services should be obliged to detect, remove and report child sexual abuse online in their services. A large majority of the respondents representing public authorities would support it, as would a majority of the respondents representing NGOs. They highlighted the importance of ensuring that any action of detection, removal and reporting should be in line with applicable human rights and privacy laws.
47.62% of the respondents from public authorities identified the introduction of end-to-end encryption as a challenge in their investigative work, because it results in difficulties in accessing evidence of child sexual abuse. 80.95% also considered that relevant online service providers who offer encrypted services should be obliged to maintain a technical capability to proactively detect, remove and report child sexual abuse online in their services and platforms.
However, other stakeholders, such as civil society organisations dealing with privacy and digital rights, consumer organisations, telecommunication operators, and technology companies, raised concerns, flagging the need to preserve the balance between privacy and security; fundamental rights must be preserved, especially the right to privacy and digital privacy of correspondence. Privacy and digital rights organisations also underlined the need to preserve strong encryption.
Like other groups, business associations and individuals expressed their concerns in relation to privacy of communications. According to business associations, new legislation should put in place safeguards to limit the monitoring of private correspondence to known suspects and require judicial authorisation, rather than legally mandate it as the default position of online service providers.
Business associations further expressed concerns about the potential harm to marginalized groups and urged the need for effective encryption to ensure the online safety of groups at risk (including children, members of the LGBTQ+ community, and survivors of domestic abuse).
Service providers and the digital technology industry highlighted the need to distinguish services which host and serve public, user-generated content from private messaging services, and warned not to undermine, prohibit or weaken end-to-end encryption. The new legislation should take into account the key role of encryption in providing and ensuring private and secure communications to users, including children, and its integrity should be safeguarded and not weakened.
Individuals stressed that service providers should not be obliged to enforce such measures (detection, removal, reporting) in encrypted services. Searching encrypted communications, in their view, would require adding backdoors to encryption technology and would thus threaten to weaken the security of communications in general, which many citizens, businesses and governments rely on.
Safeguards
The actions to detect, remove and report child sexual abuse online may require safeguards to ensure the respect of fundamental rights of all users, prevent abuses, and ensure proportionality.
Public authorities
Public authorities agreed that the legislation should put into place safeguards to ensure the respect of fundamental rights of all users, prevent abuses and ensure proportionality. In particular, the tools used to detect, report and remove child sexual abuse online needed to comply with the data minimization principle and rely on anonymised data where this is possible. The tools should further comply with the purpose limitation principle, and use the data exclusively for the purpose of detecting, reporting and removing child sexual abuse online. Some respondents warned of challenges relating to the data retention period and the legislative compliance assessment of online service providers.
Companies
About half of company respondents also highlighted that the tools used to detect, report and remove child sexual abuse online should be the least privacy intrusive, comply with the data minimization principle and rely on anonymised data where possible. Close to half stated that the new legislation should also include safeguards to ensure that reports containing new material or grooming are systematically subject to human review before the reports are sent to law enforcement or organisations acting in the public interest against child sexual abuse. Data should be used exclusively for the purpose of detecting, reporting and removing child sexual abuse online and the tools used should comply with the storage limitation principle.
Non-governmental organisations
Service providers’ actions to detect, remove and report child sexual abuse online need to be proportionate and subject to safeguards, according to NGO respondents. Most of the respondents agreed on the need for a clear complaint mechanism for users. A significant majority stressed that effective remedies should be provided to users that have been erroneously affected by the actions of the service provider to detect, report and remove child sexual abuse online. Furthermore, most deemed essential that service providers would make clear in the Terms and Conditions that they are taking measures to detect, report and remove child sexual abuse online.
General public
Concerning safeguards, more than half of individual respondents flagged the need to ensure the availability of a clear complaint mechanism and effective remedies for users that have been erroneously affected. Slightly more than half also thought it was important that providers made clear in the Terms and Conditions that they are taking measures to detect, report and remove child sexual abuse online, as well as to ensure that the tools used to detect, report and remove child sexual abuse online are the least privacy intrusive.
Sanctions
50% of the respondents from companies and 60% of those from business associations stated that online service providers that erroneously detect, report or remove child sexual abuse online in good faith should not be subject to financial or criminal sanctions. 60% of the respondents from business associations also disagreed with imposing criminal sanctions on companies if they fail to meet the legal obligations related to detection, reporting and removal of child sexual abuse online. Detection and removal, in their view, were best placed as part of voluntary requirements to encourage innovation to further develop and deploy technology in this area, while it was also seen as crucial to support national law enforcement authorities responsible for pursuing and prosecuting crimes related to CSAM.
General public
Around 26% of the respondents suggested that companies should not be subject to any financial or criminal sanctions, while 19.92% and 15.72% believe that companies should be subject to financial and criminal sanctions, respectively.
Transparency reports and performance indicators
Three quarters of public authorities and non-governmental organisations underlined that transparency reports should be obligatory and standardized, in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online.
Public authorities
More than 80% of law enforcement and other public authorities expect transparency reports to include information on the number of reports of instances of child sexual abuse online reported, by type of service. They also highlighted that reports, as well as the number of perpetrators investigated and prosecuted as a result of a report, by company and type of service, should be taken into account in assessing the success of the possible legislation. According to 38% of the replies, the number and ratio of false positives (an online event is mistakenly flagged as child sexual abuse online) of the different technologies used should also be included.
Companies and business associations
Close to half of respondents thought that transparency reports should include information on whether data are shared with any third party and on which legal basis, as well as information related to the policies on retention of data processed for the detection, reporting and removal of child sexual abuse online and the data protection safeguards applied. The number and ratio of false positives (an online event is mistakenly flagged as child sexual abuse online) of the different technologies used should also be taken into account. The size of each organisation and enterprise should be taken into account to ensure that they have the necessary infrastructure in place to respond to any regulatory and/or supervisory requirements.
Non-governmental organisations
82.26% of the replies from non-governmental organisations flagged that reports should include information about the time required to take down child sexual abuse material after it has been flagged to/by the service provider, while the measures applied to remove online child sexual abuse material in line with the online service provider's policy (e.g. number of accounts blocked) were identified as an important element of a transparency report by 80.65% of the respondents.
General public
According to individuals, the success of the possible legislation should be monitored based on the number of victims identified and rescued and the number of perpetrators investigated and prosecuted as a result of a report, by company and type of service.
Academia
75% of academic and research institutions supported the idea of transparency reports which would be obligatory, and evaluated by an independent entity. They further stated that these reports need to be standardized in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online.
European centre to prevent and counter child sexual abuse
There is broad consensus among all respondents on the need for additional coordination and support at EU level in the fight against child sexual abuse online and offline. Stakeholders further emphasized the need to avoid duplication of efforts.
In the area of prevention, overall, respondents supported an EU initiative to create an EU Centre to stimulate the exchange of best practices and research and to cooperate with non-governmental organisations, law enforcement authorities, educational institutions and academia, and experts, with a view to facilitating the coordination of actions undertaken by competent authorities and relevant stakeholders.
The majority of the respondents, all categories included, reflected that a possible EU Centre would serve to support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU.
Public authorities
Law enforcement and other public authorities confirmed almost unanimously the need for additional coordination and support at EU level in the fight against child sexual abuse online and offline, to maximize efficiency and avoid duplication. A coordinated response at EU level (and beyond) could deal with challenges related to law enforcement, prevention and assistance to victims.
Among the most widely supported functions of the EU Centre to support law enforcement, respondents acknowledged the need to maintain a single EU database of known child sexual abuse material to facilitate its detection in companies' systems. The EU Centre would further help ensure the relevance of the received reports, determine jurisdiction(s), and forward them to law enforcement for action. In addition, the EU Centre would support law enforcement authorities to coordinate and facilitate the take down of child sexual abuse material identified through hotlines. Regarding the implementation of robust technical and procedural safeguards, respondents flagged that it is critical in order to ensure transparency and accountability as regards the actions of service providers. Coordinated actions on a global level, law enforcement cooperation, and exchange of best practices, as well as proper distribution of resources and support, were noted as key actions to stop the cycle of abuse.
Practitioners from law enforcement or other public authorities acknowledged the key role of the implementation of EU law in relation to assistance to victims of sexual abuse, while highlighting the importance of cooperation with different stakeholders in the area of victim protection, assistance and support. Identification of possible legislative gaps, research, victims' participation, awareness-raising campaigns, and proper education and training were further listed amongst the suggested measures and good practices. A majority of the respondents would welcome the creation of an EU body; 4.76% identified public-private partnerships and not-for-profit organisations as the most appropriate types of organisation for the possible Centre. The Centre should be funded directly from the Union budget (90.48% of the replies), or receive funding from voluntary contributions from industry or not-for-profit organisations (28.57% and 23.81% of the replies, respectively).
Companies
37.5% of the survey participants representing companies and business organisations confirmed the need for additional coordination and support at EU level in the fight against child sexual abuse online and offline, to maximize the efficient use of resources and to avoid duplication of efforts. Companies' and business organisations' representatives reflected that the Centre should serve as a hub for connecting, developing and disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers, to support prevention efforts. Furthermore, the role of the Centre would be relevant to support efforts to assist victims of child sexual abuse. The Centre could further support the exchange of best practices on protection measures for victims and support victims in removing their images and videos to safeguard their privacy. At the same time, it is crucial to ensure that the perspective of victims is taken into account in policymaking at EU and national level.
Like other groups, most of the respondents considered that the possible Centre should be funded directly from the Union budget, while 18.75% support voluntary contributions from industry or not-for-profit organisations as the most appropriate type of funding.
The idea of the creation of an EU Centre to prevent and counter child sexual abuse found broad support from business associations. The EU Centre can play a key role in the fight against child sexual abuse and exploitation if designed to complement and build upon the existing infrastructure. The EU Centre should remain in full harmony and cooperation with other bodies, to avoid duplication of efforts and conflicting reporting obligations that would affect the efficiency of the system. Additional coordination and support at EU level is needed to improve communication and the exchange of best practices between practitioners and researchers in the area of prevention. In parallel, it was seen as critical to publish aggregated statistics regarding the number and types of reports of child sexual abuse online received, in order to ensure transparency and accountability regarding actions of service providers.
Non-governmental organisations
The majority of respondents confirmed the need for additional coordination and support at EU level in the fight against CSA online and offline. Most of the participants from non-governmental organisations identified, as the main challenges in the fight against child sexual abuse that could benefit from additional support and coordination at EU level, the lack of evaluation of the effectiveness of prevention programmes as well as the insufficient communication and exchange of best practices between practitioners (e.g. public authorities in charge of prevention programmes, health professionals, NGOs) and researchers, both in the area of prevention and in relation to the assistance to victims.
Respondents from non-governmental organisations acknowledged, as the most relevant functions of the EU Centre to support law enforcement, the need to monitor the take down of child sexual abuse material by different stakeholders as well as to maintain a single EU database of known child sexual abuse material to facilitate its detection in companies' systems. In parallel, they agreed that it is critical, amongst others, to ensure that the tools employed are sufficiently accurate and are not misused for purposes other than the fight against child sexual abuse. Non-governmental organisations further acknowledged the key role of the implementation of EU law in relation to assistance to victims of sexual abuse, while highlighting the need to support the exchange of best practices on protection measures for victims and the importance of an evidence-based policy on assistance and support to victims. Supporting victims in removing their images and videos to safeguard their privacy and ensuring that the perspective of victims is taken into account in policymaking at EU and national level were also identified as key functions of the future Centre in the area of assistance to victims.
Among the respondents from non-governmental organisations, 22 welcomed the idea of an EU body as the most appropriate type of organisation for the possible Centre. This was followed by public-private partnership (11.29%) and not-for-profit organisation (12.9%). 79.03% welcomed the idea of an EU Centre which would receive EU funding. Mandatory levies on industry (33.87%), voluntary contributions from industry (20.97%) or contributions from not-for-profit organisations (17.74%) were also included in the list.
General public
Additional coordination and support at EU level could be beneficial in the context of prevention and assistance to victims, in particular to tackle the lack of evaluation of the effectiveness of prevention programmes in place as well as the effectiveness of programmes to assist victims. Individuals further identified the lack of an EU approach (i.e. based on EU rules and/or mechanisms) to detect child sexual abuse online, in particular the lack of a single EU database to detect known child sexual abuse material (24.11%), and the lack of an EU approach to determine relevant jurisdiction(s) of the instances of child sexual abuse online and to facilitate investigations (28.93%) as main challenges.
In order to ensure accountability and transparency regarding actions of service providers to detect, report and remove child sexual abuse online in their services, the Centre should ensure that the tools employed are not misused for purposes other than the fight against child sexual abuse. 42.77% of the individuals consider that the Centre could receive complaints from users who feel that their content was mistakenly removed by a service provider, and ensure that the tools employed are sufficiently accurate.
In the area of prevention, the Centre could serve as a hub for connecting, developing and disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers. The Centre could further carry out research and serve as a hub of expertise on assistance to victims of child sexual abuse, as well as support the exchange of best practices on protection measures for victims. Supporting victims in removing their images and videos to safeguard their privacy and ensuring that the perspective of victims is taken into account in policymaking at EU and national level were also identified as key functions of the future Centre in the area of assistance to victims. Almost 50% of the respondents agreed that the new Centre should receive direct funding from the Union budget. Voluntary contributions from not-for-profit organisations (24.11%) or from industry (19.71%) and mandatory levies on industry (17.61%) were next on the list.
Academia
Academics and researchers fully support the idea of the creation of an EU Centre to face the challenges in the area of prevention. The Centre could support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU. Providing help to develop state-of-the-art research and knowledge, including better prevention-related data to monitor the take down of child sexual abuse material by different stakeholders could also be a key function of the possible Centre. It could further serve as a hub for connecting, developing and disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers, and providing input to policy makers at national and EU level on prevention gaps and possible solutions to address them.
Practitioners from academic and research institutions further acknowledged the key role of the implementation of EU law in relation to assistance to victims of sexual abuse while highlighting the importance of cooperation with different stakeholders in the area of victim protection, assistance and support. All the respondents from academic and research institutions would welcome the creation of an EU body which should be directly funded from the Union budget.
Inception Impact Assessment
In total, 41 replies were submitted: 13 by non-governmental organisations, 11 by companies and business organisations, 2 by public authorities, 2 by EU citizens, 1 by academia/research institutions, 2 by business associations, and 10 by other entities (e.g. UNICEF, the Global Partnership to End Violence against Children, etc.). Interested stakeholders could provide feedback to the Inception Impact Assessment from 2 to 30 December 2020.
The Inception Impact Assessment aimed to inform citizens and stakeholders about the Commission's plans in order to allow them to provide feedback on the intended initiative and to participate effectively in future consultation activities.
The feedback gathered in reaction to the Inception Impact Assessment shows that, in summary, the initiative enjoys significant support, as stakeholders welcome the Commission's efforts to tackle child sexual abuse online. Providing legal clarity and certainty, as well as the holistic approach of the proposed Centre, are seen as the main positive attributes of the proposal. Some concerns regarding mandatory reporting, however, arise amongst different actors. Business representatives are primarily concerned about the duplication of reports and the disadvantageous impacts on SMEs. Furthermore, some believe the legislation should be future-proof, given the dynamic development of technology.
Table 1: Origin of valid feedback by category of respondent
Voluntary measures
Companies
Companies and business organisations call for an EU framework allowing the continuation of voluntary measures to detect, report and remove CSAM on their platforms. Many efforts undertaken by companies to tackle CSAM have already been successful on a voluntary basis, e.g. the development of tools such as PhotoDNA. Mandatory detection of known and new CSAM could have serious consequences. A legal requirement to apply such tools risks incentivizing companies to prioritize removal over accuracy, and could effectively amount to an obligation to screen all content. Taking into account the limited capability of small and medium-sized enterprises (SMEs), voluntary measures to detect CSAM online should be given preference. Reporting mechanisms should be flexible, to avoid burdensome requirements for SMEs and to avoid overburdening law enforcement authorities. A harmonized approach across the EU, including definitional clarity and the exchange of best practices, will increase the effectiveness of online platforms' voluntary efforts.
Legal certainty regarding the detection of child sexual abuse material is fundamental. Any new EU legal instrument needs to provide sufficient legal basis for online platforms to continue to operate their detection.
Other entities/stakeholders
Most of the contributions from business associations illustrated that any legislation should take into account the limited capability of small and medium-sized companies (SME). Thus, voluntary measures to detect CSAM online should be given preference. The different (technical and financial) capabilities of SMEs could not be taken into consideration within a legislative framework that imposes mandatory measures. Companies could be safeguarded by creating a legal framework allowing voluntary proactive measures under clear conditions securing compliance with fundamental rights.
Obligation to detect known CSAM
An obligation to detect known CSAM is expected to have a significant impact on SMEs in terms of capacity, resources and economics. SMEs in particular do not always have access to essential tools to detect CSAM, or to the resources needed to develop such tools. Using external tools or services can also be challenging for small operators, owing to understandable legal restrictions on the ability to access CSAM.
Companies
Some of the contributions from companies and business associations urge the Commission to take into consideration the potential financial and technical burden that would be placed on smaller companies as a result of the adoption of binding legislative measures. Data privacy and customer security issues were also highlighted as important among companies.
On the other hand, it was flagged that a legal framework which would create a binding obligation for relevant service providers to detect, report and remove known child sexual abuse material from their services could encourage improvement and provide legal certainty. Simple and streamlined reporting obligations that avoid duplication and confusion in a well-functioning system are essential. Participants further underlined the need for transparency reporting obligations to be reasonable, proportionate, and based on clear metrics.
Other entities/stakeholders
The detection, removal and reporting of child sexual abuse online is a necessary element in the broader fight against the exploitation of children and the protection of their fundamental rights. Any legal framework that is put in place in pursuit of these objectives will need to encompass binding obligations for relevant service providers, on a proportionate basis, and including necessary safeguards. It should ensure legal certainty, transparency and accountability.
Obligation to detect new and known CSAM
As already mentioned above, a legislative option to detect new and known CSAM would have a significant impact on SMEs. Any proposal to mandate the detection and removal of ‘new’ material must consider technical realities.
Companies
The responding companies and business associations said there is a need to formulate requirements in terms of best reasonable efforts at the current state of technology. In addition, obligations could be differentiated on the basis of the size and capability of small and medium-sized enterprises (SMEs) to avoid putting excessive burdens on them. It was further stated that a legal obligation for relevant service providers to detect, report and remove child sexual abuse from their services, applicable to both known and new material and to text-based threats such as grooming, would currently be in contravention of existing EU law (and the proposed DSA) regarding the prohibition of general monitoring obligations, and would also be more difficult and costly to implement, especially for the smallest platforms.
Participants further underlined the need for transparency reporting obligations to be reasonable and proportionate. Simple and streamlined reporting obligations that avoid duplication and confusion in a well-functioning system are essential.
Non-governmental organisations
Non-governmental organisations called for long-term legislation that makes the reporting and removal of child sexual abuse material and grooming on their platforms mandatory for service providers. Mandatory detection, reporting and removal requires a holistic approach with close cooperation between relevant service providers and stakeholders. As further flagged, it is vital that the objectives and obligations are consistent and compatible with the measures set out in the Digital Services Act, particularly around transparency and reporting mechanisms. Any policy and legislative options shall incorporate the strongest available safeguards and address the need for greater transparency and accountability within the industry. The Commission needs to provide legal clarity and certainty as well as to adopt a victim-centred approach. The new legislation must be flexible and future-proof.
Among others, it was stressed that voluntary measures do not meet the overall objectives of the initiative, which means that efforts to counteract child sexual abuse would continue to be fragmented and insufficient.
Other entities/stakeholders
The contributions recognised the importance of legal certainty, transparency and accountability. Any legal framework that is put in place in pursuit of these objectives (detection, removal and reporting of child sexual abuse online) will need to encompass binding obligations for relevant service providers, on a proportionate basis, and including necessary safeguards. In addition, any new initiative should take into account the best interest of the child as well as ensure that functional prevention measures and victim support services are in place.
Encryption
Public authorities
The great importance of balancing the protection of privacy and the confidentiality of communication with the legal interests concerned was specifically highlighted among public authorities.
Companies
Companies’ representatives urged for legal certainty for the processing of personal data for the purpose of detecting child sexual abuse material. They further stressed that end-to-end encryption must be preserved; any framework should not undermine, prohibit or weaken end-to-end encryption.
Several parties further advised against requirements to weaken and break encryption and recommended instead that appropriate measures be taken so that content can be detected at the endpoints of encrypted communications, whenever appropriate. It was considered of utmost importance that the legislative solution chosen remain proportionate to the very purpose of the fight against CSAM.
It was also stressed that any new EU framework should define adequate safeguards efficiently balancing the digital safety interests with users' privacy rights.
Non-governmental organisations
A few stakeholders shared views on encryption. Specifically, it was recommended that the regulation should include a requirement for providers of encrypted services to, at a minimum, facilitate the reporting of CSAM and CSE online, including self-generated material, and take prompt action to remove confirmed material upon request from hotlines and law enforcement authorities.
Respondents further underlined the need for clear legislative frameworks that allow online CSEA to be detected, removed and reported efficiently, in order to safeguard the rights of existing victims, prevent abuse from occurring in the first place, and protect the privacy of some of the most vulnerable users of online services. Appropriate and realistic rules should be adopted to ensure the roll-out of tools scanning text for potential CSE and CSA in line with the GDPR.
European centre to prevent and counter child sexual abuse
Public authorities
The possible creation of a European Centre would create a common front for the harmonization of European legislation in order to prevent child sexual abuse and protect children.
Companies
Overall, representatives from companies and business organisations recognised the importance of the role of an EU Centre to prevent and counter child sexual abuse. Among the objectives identified are: the role of the Centre as a hub providing information regarding programmes, services and legislation that could benefit exploited children; the development and dissemination of programmes and information on the prevention of child sexual abuse and exploitation and on internet safety, including tips for social media, to law enforcement agencies, non-governmental organisations, schools, local educational agencies, child-serving organisations, and the general public; and the provision of adequate assistance and support to victims (and their families) as well as specialised training to law enforcement authorities, civil society organisations and the general public.
Non-governmental organisations
Non-governmental organisations welcomed the idea of a European centre to prevent and counter child sexual abuse, which could play an important role in strengthening the global effort to combat child sexual abuse online. Participants pointed out that the existence of a European Centre would help to ensure continued and improved implementation of the European Directive on combating the sexual abuse and exploitation of children as well as to share and promote learning and best practice, and provide rigorous evaluation of existing responses to child sexual abuse.
Addressing early intervention and the prevention of predatory behaviour, as a complement to the detection and identification of perpetrators and child victims, is key.
They also flagged the need to enhance global and multi-stakeholder cooperation and enable a coherent approach to tackle child sexual abuse, online and offline. The Centre’s functions could include initiatives to improve victim support, law enforcement and prevention. This must be against a wider background of support for children’s rights. Legislation and regulations that may be overseen by the Centre have to prioritize these rights.
Other entities/stakeholders
Respondents noted that the proposed European centre to prevent and counter child sexual abuse may address some of the challenges relating to coordination and/or duplication of efforts among different stakeholders. The European centre to prevent and counter child sexual abuse and exploitation could also play a critical role to promote enhanced cross-sector collaboration and engagement modalities, particularly with industry players.
Focusing on the legal framework, a clear legal framework should be developed to empower and protect hotlines engaged in handling and accessing illegal material. For effective investigations and prosecutions, law enforcement authorities need adequate staffing and technical solutions. Currently, there seems to be a lack of resources, resulting in delays in analysing hard disks etc. after house searches and in identifying victims and offenders. In addition, it should be taken into account that citizens are often afraid or reluctant to report CSAM to law enforcement authorities directly.
There is an additional need to ensure that the new Regulation and the possible EU centre are fully aligned with relevant EU initiatives as well as legislation, policies and regulations addressing related matters, such as other forms of violence.
The EU Centre could further enable improved educational opportunities in schools within the framework of media literacy, for both children and parents. Increased attention to the prevention of offending and of the victimization of children was also highlighted as an important element in the fight against child sexual abuse and as the best approach to achieve sustainable results at scale and ultimately ensure that children are safe in digital environments. The views of children should be heard, and appropriate ways for meaningful child participation should be facilitated throughout the consultation, decision-making and implementation processes.
Academic / research institutions
Academic and research institutions welcome an effort to establish an EU centre to support the effective prevention of child sexual abuse and to help ensure coordinated post-abuse reporting, detection and intervention efforts.
Targeted survey 1 – Law enforcement authorities
The replies to Targeted Survey 1 revealed that:
·Origin of reports:
oFor most EU law enforcement authorities responding (61%), reports received from service providers, either through NCMEC or directly, constitute the single largest source of reports of child sexual abuse online.
oIn the case of 45% of EU law enforcement authorities responding, NCMEC reports amounted to more than half of all reports received.
Participants were asked several questions regarding the origin and quality of reports of child sexual abuse online received by their organisation. Participants were asked to provide data in respect of several possible sources of reports:
·NCMEC;
·Members of the public;
·The respondent’s own organisation (e.g., based upon a lead arising in another investigation);
·Other public authorities (including law enforcement authorities) in the same country;
·Public authorities (including law enforcement authorities) in another country;
·National hotlines in the same country;
·National hotlines in another country;
·Directly from service providers; and
·Other sources.
EU law enforcement authorities were invited to participate via EMPACT. Following the validation of data after the survey closed, there were responses from 49 law enforcement authorities in 16 Member States.
Origin of reports
Participants were asked to respond to the following survey question:
‘To understand the various sources of child sexual abuse reports that you receive, please estimate the percentage of reports from each of the sources (the total should be around 100%)’
For each of the possible sources, participants were required to select the percentage range corresponding to the approximate percentage of reports received from that source.
Quality of reports
Participants were asked to respond to the following survey question:
Question: ‘To understand the quality of the child sexual abuse reports that your organisation receives, please estimate the percentage of reports that are actionable (i.e. that can be used to start an investigation) for each of the different sources’
For each of the possible sources, participants were required to select the percentage range corresponding to the approximate percentage of reports from that source that are typically actionable.
Table 2 shows, for each source, the percentage of EU law enforcement authorities estimating that the share of reports received by their organisation from that source falls into each of the percentage ranges.
Table 2: Percentage of respondents answering that a given percentage of reports of CSA online is received from each source
Source | 0-10% | 11-20% | 21-30% | 31-40% | 41-50% | 51-60% | 61-70% | 71-80% | 81-90% | 91-100% | Cannot estimate / No answer
NCMEC | 6% | 8% | 12% | 14% | 10% | 6% | 6% | 10% | 20% | 2% | 4%
Public | 47% | 22% | 4% | 12% | 4% | 0% | 0% | 2% | 0% | 0% | 8%
Own organisation | 47% | 22% | 8% | 2% | 0% | 2% | 2% | 0% | 0% | 0% | 16%
Other public authorities (same country) | 37% | 22% | 16% | 4% | 0% | 6% | 0% | 0% | 0% | 0% | 14%
Other public authorities (different country) | 59% | 18% | 4% | 0% | 0% | 2% | 0% | 0% | 0% | 0% | 16%
Hotline (same country) | 67% | 8% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 24%
Hotline (different country) | 61% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 39%
Service providers (directly) | 51% | 4% | 2% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 43%
Other | 31% | 2% | 2% | 0% | 2% | 0% | 0% | 0% | 0% | 0% | 63%
Table 3: Percentage of respondents answering that more than 50% and 70% of reports received from a given source are actionable
Participants were also asked to respond to the following survey question:
‘What are the main reasons that make a report non-actionable?’
For each of the possible sources, participants were required to select the typical reasons which lead to a report from that source being non-actionable. There was no limit on the number of reasons that could be selected for each source. Reasons were to be selected from the following list, with the option for respondents to specify other reasons:
·Reported content is not illegal under national law;
·Insufficient information contained in report;
·Report relates to reappearance of known content;
·Insufficient resources;
·Investigation not promising;
·Other (please specify)
Use of reports (investigations)
Participants were asked to respond to the following survey question:
‘To understand how investigations of child sexual abuse typically start, please estimate the percentage of investigations that start with a lead from each of the sources below (the total should be around 100%)’
For each of the possible sources, participants were required to select the percentage range corresponding to the approximate percentage of investigations that start with a lead from that source.
Targeted survey 2 – Data regarding reports of child sexual abuse online received by law enforcement authorities
Time required to process reports
Participants were asked to estimate the average time taken to process a report. For the purposes of this survey, the time to process a report was interpreted as meaning the total number of hours of work required to prioritise an incoming report, to investigate the report, and to report back on the outcome of any resulting investigation.
Table 4 shows the average time required for each of these tasks.
Table 4: Time required for processing of reports of child sexual abuse online by law enforcement authorities
Task | Reports containing known CSAM: time per report (hours) | Reports containing new CSAM: time per report (hours) | Reports relating to grooming: time per report (hours)
Prioritisation of reports (time per report) | 0.47 | 0.47 | 0.47
Investigation | 57.75 | 102.27 | 89.82
Reporting on the outcome of the investigation | 0.32 | 0.32 | 0.32
Total | 58.54 | 103.06 | 90.61
Total (rounded to nearest 10 hours) | 60 | 100 | 90
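For reference, the totals in Table 4 follow directly from summing the three task durations and rounding to the nearest 10 hours. A minimal illustrative sketch of that arithmetic (in Python, using only the survey averages shown in the table) is:

# Illustrative check of the Table 4 totals (hours of work per report).
prioritisation = 0.47   # prioritisation of an incoming report
reporting_back = 0.32   # reporting on the outcome of the investigation
investigation = {"known CSAM": 57.75, "new CSAM": 102.27, "grooming": 89.82}
for report_type, hours in investigation.items():
    total = prioritisation + hours + reporting_back
    rounded = round(total / 10) * 10  # rounded to the nearest 10 hours
    print(f"{report_type}: {total:.2f} hours per report, rounded to {rounded} hours")
# known CSAM: 58.54 -> 60; new CSAM: 103.06 -> 100; grooming: 90.61 -> 90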
Information to be included in reports
In order to determine the information that a report should contain to make it actionable to law enforcement, participants were asked to indicate the importance of several types of information by categorising them under the following possible options:
·Critical – the report cannot be actioned without this information.
·Useful – the report can be actioned without this information, but it should be included if it is available.
·Not relevant – there is no need to include this information in a report.
Participants were also given the option to specify other relevant information.
Table 5 shows the percentage of respondents that categorised each type of information as critical, useful or not relevant (excluding participants who did not select an option for a given type of information).
Table 5: Percentage of respondents indicating that each type of information is critical, useful or not relevant in order to ensure that a report is actionable
Information to be included in report | Critical % | Useful % | Not Relevant %
Information relating to the provider making the report: | | |
Name of the provider | 81% | 19% | 0%
Point of contact in service provider | 33% | 57% | 10%
Jurisdiction in which the service provider is located | 25% | 50% | 25%
Other information (please specify) | 40% | 20% | 40%
General information relating to the report: | | |
Indication of whether the report is urgent (child in imminent danger of actual sexual abuse) or not | 62% | 38% | 0%
More detailed indication of level of urgency (please specify) | 35% | 41% | 24%
Nature of report (e.g., CSAM images/videos, grooming, live-streaming of abuse) | 48% | 52% | 0%
Copy of reported content | 95% | 5% | 0%
Additional relevant content data (please specify) | 46% | 38% | 15%
Type of service on which reported content was detected | 67% | 33% | 0%
Date/time the reported content was detected | 76% | 24% | 0%
Languages used in the reported content | 29% | 57% | 14%
Technology which detected the abuse | 14% | 62% | 24%
Traffic data | 60% | 40% | 0%
Other information (please specify) | 33% | 33% | 33%
Information relating to child victim(s) related to reported content: | | |
Actual age of child victim(s) | 48% | 48% | 5%
Estimated age of child victim(s) (if actual age unknown) | 20% | 75% | 5%
Name of child victim(s) | 48% | 43% | 10%
Contact information of child victim(s) | 43% | 52% | 5%
Jurisdiction(s) in which child victim(s) are located | 43% | 52% | 5%
Relationship between child victim and suspect | 33% | 67% | 0%
Injuries displayed by child | 24% | 76% | 0%
Psychological state of child
|
14%
|