

Document 32024R2916

Commission Implementing Regulation (EU) 2024/2916 of 25 November 2024 laying down a standard form for the data included in the report on the processing of personal data published and reported to the competent supervisory authority and to the Commission by service providers under Regulation (EU) 2021/1232 of the European Parliament and of the Council

C/2024/7994

OJ L, 2024/2916, 26.11.2024, ELI: http://data.europa.eu/eli/reg_impl/2024/2916/oj (BG, ES, CS, DA, DE, ET, EL, EN, FR, GA, HR, IT, LV, LT, HU, MT, NL, PL, PT, RO, SK, SL, FI, SV)

Legal status of the document: in force; date of effect: 16/12/2024.



Official Journal of the European Union (EN, L series)


2024/2916

26.11.2024

COMMISSION IMPLEMENTING REGULATION (EU) 2024/2916

of 25 November 2024

laying down a standard form for the data included in the report on the processing of personal data published and reported to the competent supervisory authority and to the Commission by service providers under Regulation (EU) 2021/1232 of the European Parliament and of the Council

(Text with EEA relevance)

THE EUROPEAN COMMISSION,

Having regard to the Treaty on the Functioning of the European Union,

Having regard to Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (1), and in particular Article 3(4) thereof,

After consulting the Committee established by Article 9a(1) of Regulation (EU) 2021/1232,

Whereas:

(1) The information to be included in the report on the processing of personal data published and reported to the competent supervisory authority and to the Commission is referred to in Article 3(1), point (g), subpoint (vii), of Regulation (EU) 2021/1232.

(2) In order to improve reporting and to ensure that data is collected in a uniform manner, providers of number-independent interpersonal communications services should use the standard form laid out in the Annex to this Regulation when complying with their reporting obligations pursuant to Regulation (EU) 2021/1232,

HAS ADOPTED THIS REGULATION:

Article 1

Standard form for the reports

Providers of number-independent interpersonal communications services shall use the standard form set out in the Annex to this Regulation when they publish and submit to the competent supervisory authority and to the Commission a report on the processing of personal data under Article 3(1), point (g), subpoint (vii), of Regulation (EU) 2021/1232.

Article 2

Entry into force

This Regulation shall enter into force on the twentieth day following that of its publication in the Official Journal of the European Union.

This Regulation shall be binding in its entirety and directly applicable in all Member States.

Done at Brussels, 25 November 2024.

For the Commission

The President

Ursula VON DER LEYEN


(1)   OJ L 274, 30.7.2021, p. 41, ELI: http://data.europa.eu/eli/reg/2021/1232/oj.


ANNEX

The form lists, for each category set out in Article 3(1), point (g), subpoint (vii), of Regulation (EU) 2021/1232, the subcategories to be completed and, where applicable, a description of the information requested.
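The Annex prescribes this form as a table, not as a machine-readable format. Purely as an illustration of how a provider might structure the category 1 data internally, here is a minimal Python sketch; every name in it (ContentVolumes, Category1Report and all field names) is a hypothetical assumption, not part of the Regulation.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ContentVolumes:
    """Category 1: volumes of content data processed, split EU / non-EU."""
    images_eu: int = 0
    images_non_eu: int = 0
    videos_eu: int = 0
    videos_non_eu: int = 0
    other_files_eu: int = 0
    other_files_non_eu: int = 0
    other_file_types: list = field(default_factory=list)  # e.g. ["pdf", "gif"]
    solicitation_text_bytes_eu: int = 0
    solicitation_text_bytes_non_eu: int = 0

@dataclass
class Category1Report:
    """Type and volumes of data processed (category 1 of the Annex)."""
    service_name: str                    # the specific service concerned
    metadata_processed: bool             # metadata: yes/no
    metadata_types: list = field(default_factory=list)  # e.g. ["user name", "IP address"]
    content: Optional[ContentVolumes] = None  # None if no content data is processed
```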

1) The type and volumes of data processed

- Specific number-independent interpersonal communications service concerned.
- Metadata related to the users who are parties to the online exchange, i.e. any data related to those users and their accounts that is not content data: yes/no. If yes, specify all types of data processed (e.g. user name, user identification number, Internet Protocol (IP) address, network port number, location, etc.) and the volume.
- Content data of the online exchange: yes/no. If yes, specify the following:
  - Number of images processed in relation to EU users
  - Number of images processed in relation to non-EU users
  - Number of videos processed in relation to EU users
  - Number of videos processed in relation to non-EU users
  - Number of other files processed in relation to EU users (if any, specify what type of files, e.g. PDFs, documents, GIFs, audio files)
  - Number of other files processed in relation to non-EU users (if any, specify what type of files, e.g. PDFs, documents, GIFs, audio files)
  - Number of bytes of text processed for detection of solicitation of children in relation to EU users
  - Number of bytes of text processed for detection of solicitation of children in relation to non-EU users
- Other information of relevance to the types and volumes of data processed.

 

2) The specific ground relied on for the processing pursuant to Regulation (EU) 2016/679

- Where the legal ground claimed is Article 6(1), point (c) or (e), GDPR, please indicate which Union or national law is relied upon pursuant to Article 6(3) GDPR.

3) The ground relied on for transfers of personal data outside the Union pursuant to Chapter V of Regulation (EU) 2016/679, where applicable

- Please specify in what type of cases a transfer outside the Union has taken place and for which purpose, and which legal grounds for transfer have been applied under Chapter V of the GDPR.

4) The number of cases of online child sexual abuse identified

Known CSAM (material confirmed as constituting online child sexual abuse material):
- Number of reports concerning known CSAM in relation to EU users
- Number of images of known CSAM reported in relation to EU users
- Number of videos of known CSAM reported in relation to EU users
- Number of other files of known CSAM reported in relation to EU users (if any, specify what type of files)
- Number of user accounts in the EU reported as sending at least one content item of known CSAM
- Number of user accounts in the EU reported as receiving at least one content item of known CSAM

Possible new CSAM (reported files other than known CSAM):
- Number of reports concerning possible new CSAM in relation to EU users
- Number of images of possible new CSAM reported in relation to EU users
- Number of videos of possible new CSAM reported in relation to EU users
- Number of other files of possible new CSAM reported in relation to EU users (if any, specify what type of files)
- Number of user accounts in the EU reported as sending at least one content item of possible new CSAM
- Number of user accounts in the EU reported as receiving at least one content item of possible new CSAM

Solicitation:
- Number of reports of possible solicitation of children in relation to EU users
- Number of user accounts in the EU reported as possibly soliciting a child
- Number of user accounts reported as possibly soliciting a child in the EU

 

5) The number of cases in which a user has lodged a complaint with the internal redress mechanism or with a judicial authority, and the outcome of such complaints

Known CSAM (content items):
- Number of content items removed as constituting known CSAM
- Number of complaints lodged with the internal mechanism against the removal of a content item for constituting known CSAM
- Reasons for the complaints lodged with the internal mechanism against the removal of a content item for constituting known CSAM (optional)
- Number of content items removed as constituting known CSAM that were restored after review following a complaint lodged with the internal mechanism
- Average time needed to decide whether to restore or uphold the removal of items initially removed as constituting known CSAM, following a complaint lodged with the internal mechanism (optional)
- Number of complaints lodged with the judicial authority against the removal of a content item for constituting known CSAM
- Reasons for the complaints lodged with the judicial authority against the removal of a content item for constituting known CSAM (optional)
- Number of content items removed as constituting known CSAM that were restored after review following a complaint lodged with the judicial authority
- Average time needed to comply with the judicial decision to restore or uphold the removal of items initially removed as constituting known CSAM, following a complaint lodged with the judicial authority (optional)

Known CSAM (user accounts in the EU):
- Number of user accounts in the EU suspended for having shared known CSAM
- Number of complaints lodged with the internal mechanism against the suspension of a user account in the EU for having shared known CSAM
- Reasons for the complaints lodged with the internal mechanism against the suspension of a user account in the EU for having shared known CSAM (optional)
- Number of user accounts in the EU suspended for having shared known CSAM that were restored after review following a complaint lodged with the internal mechanism
- Average time needed to decide whether to restore or uphold the suspension of user accounts in the EU initially suspended for having shared known CSAM, following a complaint lodged with the internal mechanism (optional)
- Number of complaints lodged with the judicial authority against the suspension of a user account in the EU for having shared known CSAM
- Reasons for the complaints lodged with the judicial authority against the suspension of a user account in the EU for having shared known CSAM (optional)
- Number of user accounts in the EU suspended for having shared known CSAM that were restored after review following a complaint lodged with the judicial authority
- Average time needed to comply with the judicial decision to restore or uphold the suspension of user accounts in the EU initially suspended for having shared known CSAM, following a complaint lodged with the judicial authority (optional)

Possible new CSAM (content items):
- Number of content items removed as constituting possible new CSAM
- Number of complaints lodged with the internal mechanism against the removal of a content item for constituting possible new CSAM
- Reasons for the complaints lodged with the internal mechanism against the removal of a content item for constituting possible new CSAM (optional)
- Number of content items removed as constituting possible new CSAM that were restored after review following a complaint lodged with the internal mechanism
- Average time needed to decide whether to restore or uphold the removal of items initially removed as constituting possible new CSAM, following a complaint lodged with the internal mechanism (optional)
- Number of complaints lodged with the judicial authority against the removal of a content item for constituting possible new CSAM
- Reasons for the complaints lodged with the judicial authority against the removal of a content item for constituting possible new CSAM (optional)
- Number of content items removed as constituting possible new CSAM that were restored after review following a complaint lodged with the judicial authority
- Average time needed to comply with the judicial decision to restore or uphold the removal of items initially removed as constituting possible new CSAM, following a complaint lodged with the judicial authority (optional)

Possible new CSAM (user accounts in the EU):
- Number of user accounts in the EU suspended for having shared possible new CSAM
- Number of complaints lodged with the internal mechanism against the suspension of a user account in the EU for having shared possible new CSAM
- Reasons for the complaints lodged with the internal mechanism against the suspension of a user account in the EU for having shared possible new CSAM (optional)
- Number of user accounts in the EU suspended for having shared possible new CSAM that were restored after review following a complaint lodged with the internal mechanism
- Average time needed to decide whether to restore or uphold the suspension of user accounts in the EU initially suspended for having shared possible new CSAM, following a complaint lodged with the internal mechanism (optional)
- Number of complaints lodged with the judicial authority against the suspension of a user account in the EU for having shared possible new CSAM
- Reasons for the complaints lodged with the judicial authority against the suspension of a user account in the EU for having shared possible new CSAM (optional)
- Number of user accounts in the EU suspended for having shared possible new CSAM that were restored after review following a complaint lodged with the judicial authority
- Average time needed to comply with the judicial decision to restore or uphold the suspension of user accounts in the EU initially suspended for having shared possible new CSAM, following a complaint lodged with the judicial authority (optional)

Solicitation (user accounts in the EU):
- Number of user accounts in the EU suspended for having solicited a child
- Number of complaints lodged with the internal mechanism against the suspension of a user account in the EU for having solicited a child
- Reasons for the complaints lodged with the internal mechanism against the suspension of a user account in the EU for having solicited a child (optional)
- Number of user accounts in the EU suspended for having solicited a child that were restored after review following a complaint lodged with the internal mechanism
- Average time needed to decide whether to restore or uphold the suspension of a user account in the EU initially suspended for having solicited a child, following a complaint lodged with the internal mechanism (optional)
- Number of complaints lodged with the judicial authority against the suspension of a user account in the EU for having solicited a child
- Reasons for the complaints lodged with the judicial authority against the suspension of a user account in the EU for having solicited a child (optional)
- Number of user accounts in the EU suspended for having solicited a child that were restored after review following a complaint lodged with the judicial authority
- Average time needed to comply with the judicial decision to restore or uphold the suspension of a user account in the EU initially suspended for having solicited a child, following a complaint lodged with the judicial authority (optional)

6) The numbers and ratios of errors (false positives) of the different technologies used

Error rate: the number of pieces of content flagged automatically as possible online CSA which are not online CSA upon human review, divided by the number of pieces of content flagged automatically as possible online CSA. Please specify for each of the different technologies used (a worked sketch of the calculation follows this category).

Known CSAM:
- A1: Number of content items automatically flagged as constituting known CSAM
- B1: Number of content items automatically flagged as constituting known CSAM which are not known CSAM upon human review
- Error rate: B1/A1 (%)
- C1: Number of content items automatically flagged as constituting known CSAM that are subject to human review
- Other relevant findings

New CSAM:
- A2: Number of content items automatically flagged as constituting new CSAM
- B2: Number of content items automatically flagged as constituting new CSAM which are not CSAM upon human review
- Error rate: B2/A2 (%)
- C2: Number of content items automatically flagged as constituting new CSAM that are subject to human review
- Other relevant findings

Solicitation:
- A3: Number of user accounts in the EU automatically flagged as having solicited a child or as a child being solicited
- B3: Number of user accounts in the EU automatically flagged as having solicited a child or as a child being solicited which were not involved in solicitation upon human review
- Error rate: B3/A3 (%)
- C3: Number of user accounts in the EU automatically flagged as having solicited a child or as a child being solicited that are subject to human review
- Other relevant findings
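A minimal worked sketch of the error-rate calculation defined above. The B/A ratio is the Regulation's; the figures and function name are invented for illustration only.

```python
def error_rate_pct(flagged: int, false_positives: int) -> float:
    """Category 6 error rate: B divided by A, expressed as a percentage."""
    if flagged == 0:
        return 0.0  # nothing was flagged, so no measurable error rate
    return 100.0 * false_positives / flagged

# Hypothetical figures for the known-CSAM row (A1 and B1 are invented):
a1 = 12_500   # A1: items automatically flagged as known CSAM
b1 = 375      # B1: flagged items found not to be known CSAM upon human review
print(f"Error rate B1/A1: {error_rate_pct(a1, b1):.2f}%")  # -> 3.00%
```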

 

7) The measures applied to limit the error rate and the error rate achieved

Known CSAM

Indicators (e.g. hashes):
- Measures to check the quality of existing hashes (please specify)
- Measures to vet hashes before they are added to the database (please specify)
- Other measures to limit the error rate through actions on the hashes (please specify)

Implementation of the detection technology:
- Implementation plan vetted by an independent third party
- Other measures to limit the error rate through actions on the deployment of the detection technology (please specify)

Human review (an illustrative sketch of a flag-then-review pipeline follows this category):
- Systematic human review of every content item flagged as known CSAM prior to reporting
- Human review of sample content items flagged as known CSAM prior to reporting
- Training policies for human reviewers (please specify, e.g. the type and duration of training before starting work, the periodicity and type of refresher training, etc.)
- Measures to ensure periodic quality-control assessments of human reviewers and of the verdicts they apply (please specify)

Other measures:
- Measures to ensure feedback from NCMEC and/or from other organisations acting in the public interest against child sexual abuse (please specify)
- Measures to ensure feedback from law enforcement (please specify)
- Measures to ensure feedback from the outcome of a complaint in the context of the internal redress mechanism or a complaint lodged with a judicial authority (please specify)
- Other measures to limit the error rate (please specify)

Error rate achieved:
- Error rate achieved following the implementation of the measures to limit the error rate (if applicable, i.e. if new measures have been introduced)

New CSAM

Indicators (e.g. AI classifiers):
- Measures to check the quality of existing AI classifiers (please specify)
- Measures to vet AI classifiers before they are added to the database (please specify)
- Other measures to limit the error rate through actions on the AI classifiers (please specify)

Implementation of the detection technology:
- Implementation plan vetted by an independent third party
- Other measures to limit the error rate through actions on the deployment of the detection technology (please specify)

Human review:
- Systematic human review of every content item flagged as possible new CSAM prior to reporting
- Human review of sample content items flagged as possible new CSAM prior to reporting
- Training policies for human reviewers (please specify, e.g. the type and duration of training before starting work, the periodicity and type of refresher training, etc.)
- Measures to ensure periodic quality-control assessments of human reviewers and of the verdicts they apply (please specify)

Other measures:
- Measures to ensure feedback from NCMEC and/or from other organisations acting in the public interest against child sexual abuse (please specify)
- Measures to ensure feedback from law enforcement (please specify)
- Measures to ensure feedback from the outcome of a complaint in the context of the internal redress mechanism or a complaint lodged with a judicial authority (please specify)
- Other measures to limit the error rate (please specify)

Error rate achieved:
- Error rate achieved following the implementation of the measures to limit the error rate (if applicable, i.e. if new measures have been introduced)

Solicitation

Indicators (e.g. AI classifiers):
- Measures to check the quality of existing AI classifiers (please specify)
- Measures to vet AI classifiers before they are added to the database (please specify)
- Other measures to limit the error rate through actions on the AI classifiers (please specify)

Implementation of the detection technology:
- Implementation plan vetted by an independent third party
- Other measures to limit the error rate through actions on the deployment of the detection technology (please specify)

Human review:
- Systematic human review of every content item flagged as possible solicitation prior to reporting
- Human review of sample content items flagged as possible solicitation prior to reporting
- Training policies for human reviewers (please specify, e.g. the type and duration of training before starting work, the periodicity and type of refresher training, etc.)
- Measures to ensure periodic quality-control assessments of human reviewers and of the verdicts they apply (please specify)

Other measures:
- Measures to ensure feedback from NCMEC and/or from other organisations acting in the public interest against child sexual abuse (please specify)
- Measures to ensure feedback from law enforcement (please specify)
- Measures to ensure feedback from the outcome of a complaint in the context of the internal redress mechanism or a complaint lodged with a judicial authority (please specify)
- Other measures to limit the error rate (please specify)

Error rate achieved:
- Error rate achieved following the implementation of the measures to limit the error rate (if applicable, i.e. if new measures have been introduced)
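A purely illustrative sketch tying together two of the measures listed above for known CSAM: matching against a database of vetted hashes and systematic human review of every flagged item before reporting. Real deployments typically rely on perceptual hashing rather than the exact-match SHA-256 shown here, and VETTED_HASHES and the human_review callback are hypothetical names, not anything named by the Regulation.

```python
import hashlib

# Hashes vetted before being added to the database (one category 7 measure).
VETTED_HASHES: set = set()

def flag_item(item: bytes) -> bool:
    """Flag an item if its hash matches the vetted database (exact match only)."""
    return hashlib.sha256(item).hexdigest() in VETTED_HASHES

def maybe_report(item: bytes, human_review) -> bool:
    """Report only items that are both flagged and confirmed by a human reviewer."""
    if not flag_item(item):
        return False
    # Systematic human review of every flagged item prior to reporting.
    return bool(human_review(item))
```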

8) The retention policy and the data protection safeguards applied pursuant to Regulation (EU) 2016/679

Retention policies (please specify for each category the length of the relevant retention periods):
- Retention policy for content items identified as online CSA (please specify)
- Retention policy for non-content data related to reports of online CSA, including after a possible deactivation of the account by the user (please specify)
- Retention policy for data related to complaints and policy violations
- Other relevant retention policies (please specify)

Data protection safeguards:
- Use of de-identification or pseudonymisation techniques and anonymisation of data: specify the techniques and in which instances they are deployed (an illustrative sketch follows this category)
- Use of industry-standard encryption (algorithms and protocols) for data in transit between privately owned infrastructure and public networks
- Implementation of data governance strategies/comprehensive privacy programmes (please specify), e.g. internal data access restrictions, use of access control lists, confidentiality obligations for those with access, etc.
- Procedures to review anonymisation and data governance strategies (please specify)
- Procedures to maintain security incident response plans for monitoring, detecting and handling any possible security vulnerabilities and incidents across the infrastructure (please specify)
- Other technical and organisational measures to ensure the security of the data (please specify)
- Internal redress mechanism: please specify whether (i) you inform the individual of the facts relevant to your decision, (ii) procedures are in place to assess users' enquiries regarding your decision, (iii) a dedicated communication channel with users exists, (iv) users are informed about the completion of the assessment, and (v) additional information is sent to the recipients referred to in Article 3(1), point (h)(i), of Regulation (EU) 2021/1232 based on the outcome of the assessment of the user's complaint
- Right of users to access their data: please specify whether, how and when users are given access to their data where their accounts are temporarily or permanently suspended or deleted as a result of identified CSAM or solicitation
- Other safeguards (please specify)
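An illustrative sketch of one common pseudonymisation technique of the kind the form asks about: replacing a direct user identifier with a keyed hash (HMAC). The Regulation only asks providers to specify the techniques they deploy; the key handling shown here is an assumption.

```python
import hashlib
import hmac

def pseudonymise(user_id: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a stable, non-reversible token.

    Unlike a bare hash, an HMAC cannot be recomputed from a list of known
    user IDs without the key, which should be stored separately from the data.
    """
    return hmac.new(secret_key, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: the same user always maps to the same token under the same key.
token = pseudonymise("user-12345", secret_key=b"example-key-kept-in-a-kms")
```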

 

9) The names of the organisations acting in the public interest against child sexual abuse with which data has been shared pursuant to this Regulation

- National Center for Missing & Exploited Children (NCMEC)
- EU Centre to prevent and combat child sexual abuse (only applicable upon establishment of the EU Centre)
- Other (please specify)

 


ELI: http://data.europa.eu/eli/reg_impl/2024/2916/oj

ISSN 1977-0677 (electronic edition)

