
OJ C 404, 6.10.2021, p. 31–52 (BG, ES, CS, DA, DE, ET, EL, EN, FR, HR, IT, LV, LT, HU, MT, NL, PL, PT, RO, SK, SL, FI, SV)



P9_TA(2020)0273

Digital Services Act: adapting commercial and civil law rules for commercial entities operating online

European Parliament resolution of 20 October 2020 with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online (2020/2019(INL))

(2021/C 404/02)

The European Parliament,

having regard to Article 225 of the Treaty on the Functioning of the European Union,

having regard to Article 11 of the Charter of Fundamental Rights of the European Union and Article 10 of the European Convention on Human Rights,

having regard to Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (1),

having regard to Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (2),

having regard to Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (3) (hereinafter referred to as the ‘General Data Protection Regulation’),

having regard to Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (4),

having regard to Directive 2008/52/EC of the European Parliament and of the Council of 21 May 2008 on certain aspects of mediation in civil and commercial matters (5),

having regard to the proposal for a Regulation of the European Parliament and of the Council of 6 June 2018 establishing the Digital Europe Programme for the period 2021-2027 (COM(2018)0434),

having regard to the Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (6),

having regard to the Convention on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (7) and the Convention on the Recognition and Enforcement of Foreign Arbitral Awards, signed on 10 June 1958 in New York,

having regard to its resolution of 3 October 2018 on distributed ledger technologies and blockchains: building trust with disintermediation (8),

having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 February 2020 on A European strategy for data (COM(2020)0066),

having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 February 2020 on Shaping Europe’s digital future (COM(2020)0067),

having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 May 2016 on Online Platforms and the Digital Single Market — Opportunities and Challenges for Europe (COM(2016)0288),

having regard to the European added value assessment study carried out by the European Parliamentary Research Service, entitled ‘Digital Services Act: European added value assessment’ (9),

having regard to Rules 47 and 54 of its Rules of Procedure,

having regard to the opinions of the Committee on the Internal Market and Consumer Protection and of the Committee on Culture and Education,

having regard to the report of the Committee on Legal Affairs (A9-0177/2020),

A.

whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that guarantees fundamental rights and other rights of citizens while supporting development and economic progress and the digital environment, and fostering trust online, taking into account the interests of users and all market participants, including SMEs and start-ups;

B.

whereas, while some rules regarding online content-sharing providers and audiovisual media services have recently been updated, notably by Directive (EU) 2018/1808 and Directive (EU) 2019/790, a number of key civil and commercial law aspects have not been addressed satisfactorily in Union or national law, and whereas the importance of this issue has been accentuated by rapid and accelerating development over the last decades in the field of digital services, in particular the emergence of new business models, technologies and social realities; whereas, in this context, a comprehensive updating of the essential provisions of civil and commercial law applicable to online commercial entities is required;

C.

whereas some businesses offering digital services enjoy, due to strong data-driven network effects, significant market power that enables them to impose their business practices on users and makes it increasingly difficult for other players, especially start-ups and SMEs, to compete and for new businesses to even enter the market;

D.

whereas ex-post competition law enforcement alone cannot effectively address the impact of the market power of certain online platforms, including on fair competition in the Digital Single Market;

E.

whereas content hosting platforms evolved from involving the mere display of content into sophisticated organisms and market players, in particular social networks that harvest and exploit usage data; whereas users have legitimate grounds to expect fair terms with respect to access, transparency, pricing and conflict resolution for the usage of such platforms and for the use that platforms make of the users’ data; whereas transparency can contribute to significantly increasing trust in digital services;

F.

whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that protects public interests and respects the fundamental rights and civil law rights of users, in particular the right to freedom of expression and information;

G.

whereas upholding the law in the digital world involves not only the effective enforcement of fundamental rights, in particular freedom of expression and information, privacy, safety and security, non-discrimination, respect for property and intellectual property rights, but also access to justice and due process; whereas delegating decisions regarding the legality of content, or law enforcement powers, to private companies undermines transparency and due process, leading to a fragmented approach; whereas a fast-track legal procedure with adequate guarantees is therefore required to ensure that effective remedies exist;

H.

whereas automated tools are currently unable to reliably differentiate illegal content from content that is legal in a given context, and whereas mechanisms for the automatic detection and removal of content can therefore raise legitimate legal concerns, in particular as regards possible restrictions of freedom of expression and information, protected under Article 11 of the Charter of Fundamental Rights of the European Union; whereas the use of automated mechanisms should, therefore, be proportionate, covering only justified cases and following transparent procedures;

I.

whereas Article 11 of the Charter of Fundamental Rights of the European Union also protects the freedom and pluralism of the media, which are increasingly dependent on online platforms to reach their audiences;

J.

whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the Union leading to significant fragmentation on the market and consequently legal uncertainty for European users and services operating across borders; whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at Union and national level with notable differences in the obligations imposed and in the enforcement mechanisms of the various civil law regimes deployed; whereas this situation has led to a fragmented set of rules for the Digital Single Market, which requires a response at Union level;

K.

whereas the current business model of certain content hosting platforms is to promote content that is likely to attract the attention of users and therefore generate more profiling data in order to offer more effective targeted advertisements and thereby increase profit; whereas this profiling coupled with targeted advertisement can lead to the amplification of content geared towards exploiting emotions, often encouraging and facilitating sensationalism in news feed and recommendation systems, resulting in the possible manipulation of users;

L.

whereas offering users contextual advertisements requires less user data than targeted behavioural advertising and is thus less intrusive;

M.

whereas the choice of algorithmic logic behind recommendation systems, comparison services, content curation or advertisement placements remains at the discretion of the content hosting platforms with little possibility for public oversight, which raises accountability and transparency concerns;

N.

whereas content hosting platforms with significant market power make it possible for their users to use their profiles to log into third-party websites, thereby allowing those platforms to track users’ activities even outside their own platform environment, which constitutes a competitive advantage in access to data for content curation algorithms;

O.

whereas so-called smart contracts, which are based on distributed ledger technologies, including blockchains, that enable decentralised and fully traceable record-keeping and self-execution to occur, are being used in a number of areas without a proper legal framework; whereas there is uncertainty concerning the legality of such contracts and their enforceability in cross-border situations;

P.

whereas the non-negotiable terms and conditions of platforms often indicate both applicable law and competent courts outside the Union, which may impede access to justice; whereas Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (10) lays down rules on jurisdiction; whereas the General Data Protection Regulation clarifies the data subject’s right to private enforcement action directly against the controller or processor, regardless of whether the processing takes place in the Union or not and regardless of whether the controller is established in the Union or not; whereas Article 79 of the General Data Protection Regulation stipulates that proceedings shall be brought before the courts of the Member State where the controller or processor has an establishment or, alternatively, where the data subject has his or her habitual residence;

Q.

whereas access to and mining of non-personal data is an important factor in the growth of the digital economy; whereas appropriate legal standards and data protection safeguards regarding the interoperability of data can, by removing lock-in effects, play an important part in ensuring fair market conditions;

R.

whereas it is important to assess the possibility of tasking a European entity with the responsibility of ensuring a harmonised approach to the implementation of the Digital Services Act across the Union, facilitating coordination at national level as well as addressing the new opportunities and challenges, in particular those of a cross-border nature, arising from ongoing technological developments;

Digital Services Act

1.

Requests that the Commission submit without undue delay a set of legislative proposals constituting a Digital Services Act with an adequate material, personal and territorial scope, defining key concepts and including the recommendations as set out in the Annex to this resolution; is of the view that, without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be the legal basis;

2.

Proposes that the Digital Services Act include a regulation that establishes contractual rights as regards content management, lays down transparent, fair, binding and uniform standards and procedures for content moderation, and guarantees accessible and independent recourse to judicial redress; stresses that legislative proposals should be evidence-based and seek to remove current and prevent potentially new unjustified barriers in the supply of digital services by online platforms while enhancing the protection of consumers and citizens; believes that the legislative proposals should aim at achieving sustainable and smart growth, address technological challenges, and ensure that the Digital Single Market is fair and safe for everyone;

3.

Further suggests that the measures proposed for content moderation only apply to illegal content rather than content that is merely harmful; suggests, to this end, that the regulation include universal criteria to determine the market power of platforms in order to provide a clear definition of what constitutes a platform with significant market power and thereby determine whether certain content hosting platforms that do not hold significant market power can be exempted from certain provisions; underlines that the framework established by the Digital Services Act should be manageable for small businesses, SMEs and start-ups and should therefore include proportionate obligations for all sectors;
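
By way of illustration only, the following Python sketch shows how such a set of market-power indicators could be expressed as a simple decision rule. The indicator names echo those listed in Part I of the Annex and in Article 3(2) of the annexed proposal; the numeric thresholds are invented for the example and are not taken from this resolution.

```python
from dataclasses import dataclass

@dataclass
class PlatformIndicators:
    """Indicators echoing Part I of the Annex and Article 3(2) of the annexed proposal."""
    active_users: int                          # size of the network
    annual_global_turnover_eur: float          # financial strength
    indispensable_data_access: bool            # data hard for competitors to access or replicate
    gatekeeper_or_vertical_integration: bool   # gatekeeper role / leverage into adjacent markets
    lock_in_effects: bool                      # users locked in by network effects

# Hypothetical thresholds for the two quantitative indicators; the resolution fixes no numbers.
USER_THRESHOLD = 10_000_000
TURNOVER_THRESHOLD_EUR = 1_000_000_000

def has_significant_market_power(p: PlatformIndicators) -> bool:
    """Apply the 'at least two of the following characteristics' rule of Article 3(2)."""
    criteria = [
        p.active_users >= USER_THRESHOLD,
        p.annual_global_turnover_eur >= TURNOVER_THRESHOLD_EUR,
        p.indispensable_data_access,
        p.gatekeeper_or_vertical_integration,
        p.lock_in_effects,
    ]
    return sum(criteria) >= 2

# A small platform meeting none of the criteria could be exempted from certain provisions.
print(has_significant_market_power(PlatformIndicators(50_000, 2_000_000.0, False, False, False)))  # False
```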

4.

Proposes that the Digital Services Act impose an obligation on digital service providers who are established outside the Union to designate a legal representative for the interest of users within the Union, to whom requests could be addressed in order, for example, to allow for consumer redress in the case of false or misleading advertisements, and to make the contact information of that representative visible and accessible on the website of the digital service provider;

Rights as regards content moderation

5.

Stresses that the responsibility for enforcing the law must rest with public authorities; considers that the final decision on the legality of user-generated content must be made by an independent judiciary and not a private commercial entity;

6.

Insists that the regulation must prohibit content moderation practices that are discriminatory or entail exploitation and exclusion, especially towards the most vulnerable, and must always respect the fundamental rights and freedoms of users, in particular their freedom of expression;

7.

Stresses the necessity to better protect consumers by providing reliable and transparent information on examples of malpractice, such as the making of misleading claims and scams;

8.

Recommends that the application of the regulation should be closely monitored by a European entity tasked with ensuring compliance by content hosting platforms with the provisions of the regulation, in particular by monitoring compliance with the standards laid down for content management on the basis of transparency reports and monitoring algorithms employed by content hosting platforms for the purpose of content management; calls on the Commission to assess the options of appointing an existing or new European Agency or European body or of coordinating itself a network of national authorities to carry out these tasks (hereinafter referred to as ‘the European entity’);

9.

Suggests that content hosting platforms regularly submit comprehensive transparency reports based on a consistent methodology and assessed on the basis of relevant performance indicators, including on their content policies and the compliance of their terms and conditions with the provisions of the Digital Services Act, to the European entity; further suggests that content hosting platforms publish and make available in an easy and accessible manner those reports as well as their content management policies on a publicly accessible database;

10.

Calls for content hosting platforms with significant market power to evaluate the risk that their content management policies for legal content pose to society, in particular with regard to their impact on fundamental rights, and to engage in a biannual dialogue with the European entity and the relevant national authorities on the basis of a presentation of transparency reports;

11.

Recommends that the Member States provide for independent dispute settlement bodies, tasked with settling disputes regarding content moderation; takes the view that in order to protect anonymous publications and the general interest, not only the user who uploaded the content that is the subject of a dispute but also a third party, such as an ombudsperson, with a legitimate interest in acting should be able to challenge content moderation decisions; affirms the right of users to further recourse to justice;

12.

Takes the firm position that the Digital Services Act must not oblige content hosting platforms to employ any form of fully automated ex-ante controls of content unless otherwise specified in existing Union law, and considers that mechanisms voluntarily employed by platforms must not lead to ex-ante control measures based on automated tools or upload-filtering of content and must be subject to audits by the European entity to ensure that there is compliance with the Digital Services Act;

13.

Stresses that content hosting platforms must be transparent in the processing of algorithms and of the data used to train them;

Rights as regards content curation, data and online advertisements

14.

Considers that the user-targeted amplification of content based on the views or positions presented in such content is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements; is concerned that such practices rely on pervasive tracking and data mining; calls on the Commission to analyse the impact of such practices and take appropriate legislative measures;

15.

Is of the view that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require any tracking of user interaction with content and that being shown behavioural advertising should be conditional on users’ freely given, specific, informed and unambiguous consent;

16.

Notes the existing provisions addressing targeted advertising in the General Data Protection Regulation and Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (11);

17.

Recommends, therefore, that the Digital Services Act set clear boundaries and introduce transparency rules as regards the terms for the accumulation of data for the purpose of offering targeted advertisements, as well as regarding the functioning and accountability of such targeted advertising, especially when data are tracked on third-party websites; maintains that new measures establishing a framework for platform-to-consumer relations are needed as regards transparency provisions on advertising, digital nudging and preferential treatment; invites the Commission to assess options for regulating targeted advertising, including a phase-out leading to a prohibition;

18.

Stresses that, in line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, the Digital Services Act should provide for the right to use digital services anonymously wherever technically possible; calls on the Commission to require content hosting platforms to verify the identity of those advertisers with which they have a commercial relationship, so as to ensure the accountability of advertisers in the event that promoted content is found to be illegal; recommends, therefore, that the Digital Services Act include legal provisions preventing platforms from commercially exploiting third-party data in situations of competition with those third parties;

19.

Regrets the existing information asymmetry between content hosting platforms and public authorities and calls for a streamlined exchange of necessary information; stresses that in line with the case law on communications metadata, public authorities must be given access to a user’s metadata only to investigate suspects of serious crime and with prior judicial authorisation;

20.

Recommends that providers which support a single sign-on service with significant market power should be required to also support at least one open and decentralised identity system based on a non-proprietary framework; asks the Commission to propose common Union standards for national systems provided by Member States, especially as regards data protection standards and cross-border interoperability;

21.

Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing and increase transparency, with the aim of addressing imbalances in market power; suggests, to this end, exploring options to facilitate the interoperability, interconnectivity and portability of data; points out that data sharing should be accompanied by adequate and appropriate safeguards, including the effective anonymisation of personal data;

22.

Recommends that the Digital Services Act require platforms with significant market power to provide an application programming interface, through which third-party platforms and their users can interoperate with the main functionalities and users of the platform providing the application programming interface, including third-party services designed to enhance and customise the user experience, especially through services that customise privacy settings as well as content curation preferences; suggests that platforms publicly document all application programming interfaces they make available for the purpose of allowing for the interoperability and interconnectivity of services;
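
Purely as a sketch of what such an interface might cover (the resolution does not prescribe any particular API design, and every method name below is invented), the functionality described in paragraphs 22 to 25 could be summarised as follows:

```python
from abc import ABC, abstractmethod
from typing import Iterable, Optional

class InteroperabilityAPI(ABC):
    """Hypothetical API surface of a platform with significant market power.
    Data received from third-party services must not be shared, retained,
    monetised or otherwise used by the platform (see paragraph 23)."""

    @abstractmethod
    def fetch_content(self, since_id: Optional[str] = None) -> Iterable[dict]:
        """Let third-party platforms and their users read the main content feed."""

    @abstractmethod
    def publish_content(self, user_token: str, payload: dict) -> str:
        """Let users of interconnected services post to the platform; returns a content id."""

    @abstractmethod
    def set_curation_preferences(self, user_token: str, preferences: dict) -> None:
        """Allow third-party services to customise privacy settings and content curation."""

    @abstractmethod
    def export_user_data(self, user_token: str) -> bytes:
        """Machine-readable export supporting the data portability of Article 20(2) GDPR."""

    @abstractmethod
    def describe(self) -> dict:
        """Public documentation of the API, as required for interoperability and interconnectivity."""
```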

23.

Is strongly of the view, on the other hand, that platforms with significant market power providing an application programming interface must not share, retain, monetise or use any of the data they receive from third-party services;

24.

Stresses that interoperability and interconnectivity obligations must not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue suspension of the application programming interface providing interoperability and interconnectivity;

25.

Recalls that the provisions on interoperability and interconnectivity must respect all relevant data protection laws; recommends, in this respect, that platforms be required by the Digital Services Act to ensure the technical feasibility of the data portability provisions laid down in Article 20(2) of the General Data Protection Regulation;

26.

Calls for content hosting platforms to give users a real choice as to whether or not to give prior consent to being shown targeted advertising based on the user’s prior interaction with content on the same content hosting platform or on third-party websites; underlines that this choice must be presented in a clear and understandable way and that its refusal must not lead to access to the functionalities of the platform being disabled; stresses that consent to targeted advertising must not be considered as freely given and valid if access to the service is made conditional on data processing; reconfirms that Directive 2002/58/EC makes targeted advertising subject to an opt-in decision and that it is otherwise prohibited; notes that, since the online activities of an individual allow for deep insights into their behaviour and make it possible to manipulate them, the general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy; confirms that users have a right not to be subject to pervasive tracking when using digital services;
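
A minimal sketch, assuming nothing beyond what this paragraph states, of how a platform could encode these consent conditions; the field and function names are illustrative only:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Consent:
    freely_given: bool    # must not be tied to access to the service
    specific: bool
    informed: bool
    unambiguous: bool
    withdrawn: bool = False

def targeted_ads_allowed(consent: Optional[Consent], access_conditional_on_consent: bool) -> bool:
    """Targeted advertising is opt-in; consent obtained by making access conditional
    on data processing is not freely given and therefore not valid."""
    if access_conditional_on_consent or consent is None or consent.withdrawn:
        return False
    return consent.freely_given and consent.specific and consent.informed and consent.unambiguous

def select_advertising(consent: Optional[Consent], page_context: str) -> str:
    # Refusal must not disable platform functionality: fall back to contextual advertising.
    if targeted_ads_allowed(consent, access_conditional_on_consent=False):
        return "targeted"
    return f"contextual:{page_context}"

print(select_advertising(None, "sports"))  # contextual:sports
```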

27.

Asks the Commission to ensure that, in the same spirit, consumers can still use a connected device for all its functions, even if consumers withdraw or do not give their consent to share non-operational data with the device manufacturer or third parties; reiterates the need for transparency in contract terms and conditions regarding the possibility and scope of data sharing with third parties;

28.

Further calls for users to be guaranteed an appropriate degree of transparency and influence over the criteria according to which content is curated and made visible for them; affirms that this should also include the option to opt out from any content curation other than chronological order; points out that application programming interfaces provided by platforms should allow users to have content curated by software or services of their choice;
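
As an illustration of the options described in this paragraph (opting out into a purely chronological feed, or curation by software of the user’s own choice via the platform’s API), a minimal sketch follows; the function and field names are invented for the example:

```python
from typing import Callable, Iterable, List, Optional

Post = dict  # e.g. {"id": "...", "created_at": "2020-10-20T12:00:00Z", "score": 0.7}

def build_feed(
    posts: Iterable[Post],
    opt_out_of_curation: bool,
    user_chosen_ranker: Optional[Callable[[List[Post]], List[Post]]] = None,
) -> List[Post]:
    items = list(posts)
    if opt_out_of_curation:
        # Plain reverse-chronological timeline: no profiling-based ranking at all.
        return sorted(items, key=lambda p: p["created_at"], reverse=True)
    if user_chosen_ranker is not None:
        # Content curated by software or a service selected by the user through the API.
        return user_chosen_ranker(items)
    # Otherwise the platform's own recommender applies, subject to transparency duties.
    return sorted(items, key=lambda p: p.get("score", 0.0), reverse=True)
```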

29.

Underlines the importance for the Digital Services Act to provide legally sound and effective protection of children in the online environment, whilst refraining from imposing general monitoring or filtering obligations and ensuring full coordination and avoiding duplication with the General Data Protection Regulation and with the Audiovisual Media Services Directive;

30.

Recalls that paid advertisements or paid placement of sponsored content should be identified in a clear, concise and intelligible manner; suggests that platforms should disclose the origin of paid advertisements and sponsored content; suggests, to this end, that content hosting platforms publish all sponsored content and advertisements and make them clearly visible to their users in an advertising archive that is publicly accessible, indicating who has paid for them and, if applicable, on behalf of whom; stresses that this includes both direct and indirect payments or any other remuneration received by service providers;

31.

Believes that, if relevant data show a significant gap in misleading advertising practices and enforcement between platforms based in the Union and platforms based in third countries, it is reasonable to consider further options to ensure compliance with the laws in force within the Union; stresses the need for a level playing field between advertisers from the Union and advertisers from third countries;

Provisions regarding terms and conditions, smart contracts and blockchains, and private international law

32.

Notes the rise of so-called smart contracts such as those based on distributed ledger technologies without a clear legal framework;

33.

Calls on the Commission to assess the development and use of distributed ledger technologies, including blockchain and, in particular, smart contracts, to provide guidance to ensure legal certainty for businesses and consumers, in particular regarding questions of legality, the enforcement of smart contracts in cross-border situations and notarisation requirements where applicable, and to make proposals for the appropriate legal framework;

34.

Underlines that the fairness of the terms and conditions imposed by intermediaries on the users of their services, and their compliance with fundamental rights standards, must be subject to judicial review; stresses that terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, should not be binding;

35.

Requests that the Commission examine modalities to ensure appropriate balance and equality between the parties to smart contracts by taking into account the private concerns of the weaker party or public concerns such as those related to cartel agreements; emphasises the need to ensure that the rights of creditors in insolvency and restructuring procedures are respected; strongly recommends that smart contracts include mechanisms that can halt and reverse their execution and related payments;
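
The safeguard requested here can be pictured, in an off-chain simulation rather than actual smart-contract code, as a contract object carrying an explicit halt-and-reverse path controlled by a designated arbiter; all names below are illustrative assumptions, not part of the resolution:

```python
from enum import Enum, auto

class State(Enum):
    ACTIVE = auto()
    HALTED = auto()
    REVERSED = auto()

class EscrowContract:
    """Simulated smart contract whose execution and related payments can be halted
    and reversed, e.g. on the order of a court or an agreed arbiter protecting the
    weaker party or creditors in insolvency and restructuring procedures."""

    def __init__(self, payer: str, payee: str, amount: float, arbiter: str):
        self.payer, self.payee, self.amount, self.arbiter = payer, payee, amount, arbiter
        self.state = State.ACTIVE
        self.ledger = []  # list of executed transfers

    def execute_payment(self) -> None:
        if self.state is not State.ACTIVE:
            raise RuntimeError("contract is halted or reversed")
        self.ledger.append((self.payer, self.payee, self.amount))

    def halt(self, caller: str) -> None:
        """Kill switch: only the designated arbiter may pause further execution."""
        if caller != self.arbiter:
            raise PermissionError("only the arbiter may halt the contract")
        self.state = State.HALTED

    def reverse(self, caller: str) -> None:
        """Unwind payments already made and stop the contract permanently."""
        if caller != self.arbiter:
            raise PermissionError("only the arbiter may reverse the contract")
        self.ledger.append((self.payee, self.payer, self.amount))  # compensating transfer
        self.state = State.REVERSED
```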

36.

Requests the Commission to, in particular, update its existing guidance document on Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights (12) in order to clarify whether it considers smart contracts to fall within the exemption in point (l) of Article 3(3) of that Directive, and, if so, under which circumstances, and to clarify the issue of the right of withdrawal;

37.

Stresses the need for blockchain technologies, and smart contracts in particular, to be utilised in accordance with antitrust rules and requirements, including those prohibiting cartel agreements or concerted practices;

38.

Considers that standard terms and conditions should not prevent effective access to justice in Union courts or disenfranchise Union citizens or businesses; calls on the Commission to assess whether the protection of access rights to data under private international law is uncertain and leads to disadvantages for Union citizens and businesses;

39.

Emphasises the importance of ensuring that the use of digital services in the Union is fully governed by Union law under the jurisdiction of Union courts;

40.

Concludes further that legislative solutions to these issues ought to be found at Union level if action at the international level does not seem feasible, or if there is a risk of such action taking too long to come to fruition;

41.

Stresses that service providers established in the Union must not be required to remove or disable access to information that is legal in their country of origin;

o

o o

42.

Instructs its President to forward this resolution and the accompanying detailed recommendations to the Commission and the Council.

(1)  OJ L 186, 11.7.2019, p. 57.

(2)  OJ L 130, 17.5.2019, p. 92.

(3)  OJ L 119, 4.5.2016, p. 1.

(4)  OJ L 95, 15.4.2010, p. 1.

(5)  OJ L 136, 24.5.2008, p. 3.

(6)  OJ L 63, 6.3.2018, p. 50.

(7)  OJ L 339, 21.12.2007, p. 3.

(8)  OJ C 11, 13.1.2020, p. 7.

(9)  https://www.europarl.europa.eu/RegData/etudes/STUD/2020/654180/EPRS_STU(2020)654180_EN.pdf

(10)  OJ L 351, 20.12.2012, p. 1.

(11)  OJ L 201, 31.7.2002, p. 37.

(12)  OJ L 304, 22.11.2011, p. 64.


ANNEX TO THE RESOLUTION:

DETAILED RECOMMENDATIONS AS TO THE CONTENT OF THE PROPOSAL REQUESTED

A.   PRINCIPLES AND AIMS OF THE PROPOSAL REQUESTED

THE KEY PRINCIPLES AND AIMS OF THE PROPOSAL:

The proposal sets out both acts that should be included in the Digital Services Act and acts that are ancillary to the Digital Services Act.

The proposal aims to strengthen civil and commercial law rules applicable to commercial entities operating online with respect to digital services.

The proposal aims to strengthen and bring clarity on the contractual rights of users in relation to content moderation and curation.

The proposal aims to further address inadmissible and unfair terms and conditions used for the purpose of digital services.

The proposal addresses the issue of aspects of data collection being in contravention of fair contractual rights of users as well as data protection and online confidentiality rules.

The proposal addresses the importance of fair implementation of the rights of users as regards interoperability and portability.

The proposal raises the importance of private international law rules that provide legal clarity on the non-negotiable terms and conditions used by online platforms, as well as of ensuring the right to access data and guaranteeing access to justice.

The proposal does not address aspects related to the regulation of online marketplaces, which should nevertheless be considered by the Digital Services Act Package to be proposed by the Commission.

The proposal raises the need for assessment of the necessity of proper regulation of civil and commercial law aspects in the field of distributed ledger technologies, including blockchains and, in particular, addresses the necessity of the proper regulation of civil and commercial law aspects of smart contracts.

I.   PROPOSALS TO BE INCLUDED IN THE DIGITAL SERVICES ACT

The key elements of the proposals to be included in the Digital Services Act should be:

A regulation on contractual rights as regards content management, which contains the following elements:

It should apply to content management, including content moderation and curation, with regard to content accessible in the Union.

It should provide proportionate principles for content moderation.

It should provide formal and procedural standards for a notice and action mechanism, which are proportionate to the platform and the nature and impact of the harm, effective and future-proof.

It should provide for an independent dispute settlement mechanism in the Member States without limiting access to judicial redress.

It should indicate a set of clear indicators to define the market power of content hosting platforms, in order to determine whether certain content hosting platforms that do not hold significant market power can be exempted from certain provisions. Such indicators could include the size of the platform’s network (number of users), its financial strength, access to data, the degree of vertical integration, or the presence of lock-in effects.

It should provide rules regarding the responsibility of content hosting platforms for goods sold or advertised on them, taking into account supporting activities for SMEs in order to minimise their burden when adapting to this responsibility.

It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options. In this regard, any measure in the Digital Services Act should concern only illegal content as defined in Union and national law.

It should be based upon established principles as regards determining the law applicable to compliance with administrative law, and should, in light of the increasing convergence of user rights, clearly state that all aspects within its scope are governed by those principles.

It should fully respect the Charter of Fundamental Rights of the European Union and Union rules protecting users and their safety, privacy and personal data, as well as other fundamental rights.

It should provide for a dialogue between content hosting platforms with significant market power and the European entity on the risk management of content management of legal content.

The Commission should consider options for a European entity tasked with ensuring compliance with the provisions of the proposal through the following measures:

regular monitoring of the algorithms employed by content hosting platforms for the purpose of content management;

regular review of the compliance of content hosting platforms with the provisions of the regulation, on the basis of transparency reports provided by the content-hosting platforms and the public database of decisions on removal of content to be established by the Digital Services Act;

working with content hosting platforms on best practices to meet the transparency and accountability requirements for terms and conditions, as well as best practices in content moderation and implementing notice-and-action procedures;

cooperating and coordinating with the national authorities of Member States as regards the implementation of the Digital Services Act;

managing a dedicated fund to assist the Member States in financing the operating costs of the independent dispute settlement bodies described in the regulation, funded by fines imposed on content hosting platforms for non-compliance with the provisions of the Digital Services Act as well as a contribution by content hosting platforms with significant market power;

imposing fines for non-compliance with the Digital Services Act. The fines should contribute to the special dedicated fund intended to assist the Member States in financing the operating costs of the dispute settlement bodies described in the regulation. Instances of non-compliance should include:

failure to implement the provisions of the regulation;

failure to provide transparent, accessible, fair and non-discriminatory terms and conditions;

failure to provide the European entity with access to content management algorithms for review;

failure to submit transparency reports to the European entity;

publishing biannual reports on all of its activities and reporting to Union institutions.

Transparency reports regarding content management should be established as follows:

The Digital Services Act should contain provisions requiring content hosting platforms to regularly publish and provide transparency reports to the European entity. Such reports should be comprehensive, following a consistent methodology, and should include in particular (an illustrative report schema is sketched after this list):

information on notices processed by the content hosting platform, including the following:

the total number of notices received, for which types of content, and the action taken accordingly;

the number of notices received per category of submitting entity, such as private individuals, public authorities or private undertakings;

the total number of removal requests complied with and the total number of referrals of content to competent authorities;

the total number of counter-notices or appeals received, as well as information on how they were resolved;

the average lapse of time between publication, notice, counter-notice and action;

information on the number of staff employed for content moderation, their location, education and language skills, as well as any algorithms used to take decisions;

information on requests for information by public authorities, such as those responsible for law enforcement, including the numbers of fully complied with requests and requests that were not or only partially complied with;

information on the enforcement of terms and conditions and information on the court decisions ordering the annulment and/or modification of terms and conditions considered illegal by a Member State.
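
An illustrative schema for such a report, grouping the items above into plain data classes, could look as follows; the field names are invented for the example, since the resolution prescribes the content of the reports, not their format:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class NoticeStatistics:
    total_notices_by_content_type: Dict[str, int]
    action_taken_by_content_type: Dict[str, str]
    notices_by_submitter_category: Dict[str, int]   # private individuals, public authorities, undertakings
    removal_requests_complied_with: int
    referrals_to_competent_authorities: int
    counter_notices_or_appeals: int
    appeal_outcomes: Dict[str, int]
    average_days_publication_to_notice: float
    average_days_notice_to_action: float

@dataclass
class TransparencyReport:
    platform: str
    reporting_period: str
    notices: NoticeStatistics
    moderation_staff_count: int
    staff_locations_education_languages: List[str]
    algorithms_used_for_decisions: List[str]
    authority_information_requests: Dict[str, int]   # fully / partially / not complied with
    terms_enforcement_notes: List[str] = field(default_factory=list)
    court_decisions_on_terms: List[str] = field(default_factory=list)
```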

Content hosting platforms should, in addition, publish their decisions on content removal on a publicly accessible database to increase transparency for users.

The independent dispute settlement bodies to be established by the regulation should issue reports on the number of referrals brought before them, including the number of referrals given heed to.

II.   PROPOSALS ANCILLARY TO THE DIGITAL SERVICES ACT

Measures regarding content curation, data and online advertisements in breach of fair contractual rights of users should include:

Measures to minimise the data collected by content hosting platforms, based on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements and by requiring freely given, specific, informed and unambiguous prior consent of the user. Consent to targeted advertising shall not be considered as freely given and valid if access to the service is made conditional on data processing.

Users of content hosting platforms shall be informed if they are subject to targeted advertising, given access to their profile built by content hosting platforms and the possibility to modify it, and given the choice to opt in or out and withdraw their consent to be subject to targeted advertisements.

Content hosting platforms should make available an archive of sponsored content and advertisements that were shown to their users (an illustrative record layout is sketched after this list), including the following:

whether the sponsored content or sponsorship is currently active or inactive;

the timespan during which the sponsored content or advertisement was active;

the name and contact details of the sponsor or advertiser, and, if different, on behalf of whom the sponsored content or advertisement was placed;

the total number of users reached;

information on the group of users targeted.
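
A minimal sketch of one record in such an archive, on the assumption that its fields simply mirror the items listed above; the names are illustrative, not prescribed by the resolution:

```python
from dataclasses import dataclass
from datetime import date
from typing import Dict, Optional

@dataclass
class SponsoredContentRecord:
    content_id: str
    currently_active: bool            # whether the sponsored content or sponsorship is active or inactive
    active_from: date
    active_until: Optional[date]      # timespan during which it was shown
    sponsor_name: str
    sponsor_contact: str
    on_behalf_of: Optional[str]       # if placed on behalf of a different principal
    total_users_reached: int
    targeted_group: Dict[str, str]    # information on the group of users targeted
```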

The path to fair implementation of the rights of users as regards interoperability, interconnectivity and portability should include:

an assessment of the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power, in particular through the interoperability, interconnectivity and portability of data;

a requirement for platforms with significant market power to provide an application programming interface, through which third-party platforms and their users can interoperate with the main functionalities and users of the platform providing the application programming interface, including third-party services designed to enhance and customise the user experience, especially through services that customise privacy settings as well as content curation preferences;

provisions ensuring that platforms with significant market power providing an application programming interface may not share, retain, monetise or use any of the data they receive from third-party services;

provisions ensuring that the interoperability and interconnectivity obligations may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue suspension of the application programming interface providing interoperability and interconnectivity;

provisions ensuring that platforms be required by the Digital Services Act to ensure the technical feasibility of the data portability provisions laid down in Article 20(2) of the General Data Protection Regulation;

provisions ensuring that content hosting platforms with significant market power publicly document all application programming interfaces they make available for the purpose of allowing for the interoperability and interconnectivity of services.

The path to the proper regulation of civil and commercial law aspects of distributed ledger technologies, including blockchains and, in particular, smart contracts should comprise:

measures ensuring that the proper legislative framework is in place for the development and deployment of digital services including distributed ledger technologies, such as blockchains and smart contracts;

measures ensuring that smart contracts are fitted with mechanisms that can halt and reverse their execution, in particular given private concerns of the weaker party or public concerns such as those related to cartel agreements and in respect for the rights of creditors in insolvency and restructuring procedures;

measures to ensure appropriate balance and equality between the parties to smart contracts, taking into account, in particular, the interest of small businesses and SMEs, for which the Commission should examine possible modalities;

an update of the existing guidance document on Directive 2011/83/EU in order to clarify whether smart contracts fall within the exemption in point (l) of Article 3(3) of that Directive, as well as issues related to cross-border transactions, notarisation requirements and the right of withdrawal.

The path to equitable private international law rules that do not deprive users of access to justice should:

ensure that standard terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice, in particular through the effective enforcement of existing measures in this regard;

include measures clarifying private international law rules concerning the activities of platforms regarding data, so that they are not detrimental to Union subjects;

build on multilateralism and, if possible, be agreed in the appropriate international fora.

Only where it proves impossible to achieve a solution based on multilateralism within a reasonable time should measures applied within the Union be proposed, in order to ensure that the use of digital services in the Union is fully governed by Union law under the jurisdiction of Union courts.

B.   TEXT OF THE LEGISLATIVE PROPOSAL REQUESTED

Proposal for a

REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL

on contractual rights as regards content management

THE EUROPEAN PARLIAMENT AND THE COUNCIL OF THE EUROPEAN UNION,

Having regard to the Treaty on the Functioning of the European Union, and in particular Article 114 thereof,

Having regard to the proposal from the European Commission,

After transmission of the draft legislative act to the national parliaments,

Having regard to the opinion of the European Economic and Social Committee,

Acting in accordance with the ordinary legislative procedure,

Whereas:

(1)

The terms and conditions that digital service providers apply in relations with users are often non-negotiable and can be unilaterally amended by those providers. Action at a legislative level is needed to put in place minimum standards for such terms and conditions, in particular as regards procedural standards for content management.

(2)

The civil law regimes governing the practices of content hosting platforms as regards content moderation are based on certain sector-specific provisions at Union level as well as on laws passed by Member States at national level, and there are notable differences in the obligations imposed by those civil law regimes on content hosting platforms and in their enforcement mechanisms.

(3)

The resulting fragmentation of civil law regimes governing content moderation by content hosting platforms not only creates legal uncertainties, which might lead such platforms to adopt stricter practices than necessary in order to minimise the risks brought about by the use of their service, but also leads to a fragmentation of the Digital Single Market, which hinders growth and innovation and the development of European businesses in the Digital Single Market.

(4)

Given the detrimental effects of the fragmentation of the Digital Single Market, and the resulting legal uncertainty for businesses and consumers, the international character of content hosting, the vast amount of content requiring moderation, and the significant market power of a few content hosting platforms located outside the Union, the various issues that arise in respect of content hosting need to be regulated in a manner that entails full harmonisation and therefore by means of a regulation.

(5)

Concerning relations with users, this Regulation should lay down minimum standards for the fairness, transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should be clear, accessible, intelligible and unambiguous and include fair, transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress and comply with fundamental rights.

(6)

User-targeted amplification of content based on the views or positions presented in such content is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements.

(7)

Algorithms that decide on the ranking of search results influence individual and social communications and interactions and can be opinion-forming, especially in the case of media content.

(8)

In order to ensure, inter alia, that users can assert their rights, they should be given an appropriate degree of transparency and influence over the curation of content made visible to them, including the possibility to opt out of any content curation other than chronological order altogether. In particular, users should not be subject to curation without freely given, specific, informed and unambiguous prior consent. Consent to targeted advertising should not be considered as freely given and valid if access to the service is made conditional on data processing.

(9)

Consent given in a general manner by a user to the terms and conditions of content hosting platforms or to any other general description of the rules relating to content management by content hosting platforms should not be taken as sufficient consent for the display of automatically curated content to the user.

(10)

This Regulation does not oblige content hosting platforms to employ any form of automated ex-ante control of content, unless otherwise specified in existing Union law, and provides that content moderation procedures used voluntarily by platforms are not to lead to ex-ante control measures based on automated tools or upload-filtering of content.

(11)

This Regulation should also include provisions against discriminatory content moderation practices, exploitation or exclusion, for the purposes of content moderation, especially when user-created content is removed based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.

(12)

The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application.

(13)

After a notice has been issued, the uploader should be informed thereof by the content hosting platform and in particular about the reason for the notice and for the action to be taken, and should be provided information about the procedure, including about appeal and referral to independent dispute settlement bodies, and about available remedies in the event of false notices. Such information should, however, not be given if the content hosting platform has been informed by public authorities about ongoing law enforcement investigations. In such case, it should be for the relevant authorities to inform the uploader about the issue of a notice, in accordance with applicable rules.

(14)

All concerned parties should be informed about a decision as regards a notice. The information provided to concerned parties should also include, apart from the outcome of the decision, at least the reason for the decision and whether the decision was made solely by a human, as well as relevant information regarding review or redress.

(15)

Content should be considered as manifestly illegal if it is, unmistakably and without requiring in-depth examination, in breach of legal provisions regulating the legality of content on the internet.

(16)

Given the immediate nature of content hosting and the often ephemeral purpose of content uploading, it is necessary to provide independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse. Such bodies should be competent to adjudicate disputes concerning the legality of user-uploaded content and the correct application of terms and conditions. However, that process should not prevent the user from having the right of access to justice and further judicial redress.

(17)

The establishment of independent dispute settlement bodies could relieve the burden on courts by providing a fast resolution of disputes over content management decisions, without prejudice to the right to judicial redress before a court. Given that content hosting platforms which enjoy significant market power can particularly gain from the introduction of independent dispute settlement bodies, it is appropriate that they contribute to the financing of such bodies through a dedicated fund. That fund should be independently managed by the European entity in order to assist the Member States in financing the running costs of the independent dispute settlement bodies. Member States should ensure that such bodies are provided with adequate resources to ensure their competence and independence.

(18)

Users should have the right of referral to a fair and independent dispute settlement body, as an alternative dispute settlement mechanism, to contest a decision taken by a content hosting platform following a notice concerning content they uploaded. Notifiers should have that right if they would have legal standing in a civil procedure regarding the content in question.

(19)

As regards jurisdiction, the competent independent dispute settlement body should be that located in the Member State in which the content forming the subject of the dispute has been uploaded. It should always be possible for natural persons to bring complaints to the independent dispute settlement body of their Member State of residence.

(20)

Whistleblowing helps to prevent breaches of law and detect threats or harm to the general interest that would otherwise remain undetected. Providing protection for whistleblowers plays an important role in protecting freedom of expression, media freedom and the public’s right to access information. Directive (EU) 2019/1937 of the European Parliament and of the Council (1) should therefore apply to the relevant breaches of this Regulation. Accordingly, that Directive should be amended.

(21)

This Regulation should include obligations to report on its implementation and to review it within a reasonable time. For this purpose, the independent dispute settlement bodies provided for by Member States under this Regulation should submit reports on the number of referrals brought before them, the decisions taken — anonymising personal data as appropriate — including the number of referrals dealt with, data on systemic problems, trends and the identification of platforms not complying with decisions of independent dispute settlement bodies.

(22)

Since the objective of this Regulation, namely to establish a regulatory framework for contractual rights as regards content management in the Union, cannot be sufficiently achieved by the Member States but can rather, by reason of its scale and effects, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective.

(23)

Action at Union level as set out in this Regulation would be substantially enhanced by a European entity tasked with appropriate monitoring and ensuring compliance by content hosting platforms with the provisions of this Regulation. For this purpose, the Commission should consider the options of appointing an existing or new European Agency or European body or coordinating a network of national authorities, in order to review compliance with the standards laid down for content management on the basis of transparency reports and the monitoring of algorithms employed by content hosting platforms for the purpose of content management (hereinafter referred to as ‘the European entity’).

(24)

In order to ensure that the risks presented by content amplification are evaluated, a biannual dialogue on the impact of content management policies of legal content on fundamental rights should be established between content hosting platforms with significant market power and the European entity together with relevant national authorities.

(25)

This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter of Fundamental Rights of the European Union as enshrined in the Treaties, in particular the freedom of expression and information, and the right to an effective remedy and to a fair trial,

HAVE ADOPTED THIS REGULATION:

Article 1

Purpose

The purpose of this Regulation is to contribute to the proper functioning of the internal market by laying down rules to ensure that fair contractual rights exist as regards content management and to provide independent dispute settlement mechanisms for disputes regarding content management.

Article 2

Scope of application

1.   This Regulation applies to content hosting platforms that host and manage content that is accessible to the public on websites or through applications in the Union, irrespective of the place of establishment or registration, or principal place of business of the content hosting platform.

2.   This Regulation does not apply to content hosting platforms that:

(a)

are of a non-commercial nature; or

(b)

have fewer than [100 000] (2) users.

Article 3

Definitions

For the purposes of this Regulation, the following definitions apply:

(1)

‘content hosting platform’ means an information society service within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council (3) of which the main or one of the main purposes is to allow signed-up or non-signed-up users to upload content for display on a publicly accessible website or application;

(2)

‘content hosting platform with significant market power’ means a content hosting platform with at least two of the following characteristics:

(a)

the capacity to develop or preserve its user base because of network effects which lock in a significant part of its users, or because its positioning in the downstream market allows it to create economic dependency;

(b)

being of a considerable size in the market, measured either by the number of active users or by the annual global turnover of the platform;

(c)

being integrated into a business or network environment controlled by its group or parent company, which allows market power to be leveraged from one market into an adjacent market;

(d)

having a gatekeeper role for a whole category of content or information;

(e)

having access to large amounts of high-quality personal data, either provided by users or inferred about users from monitoring their online behaviour, where such data are indispensable for providing and improving a similar service and are difficult for potential competitors to access or replicate;

(3)

‘content’ means any concept, idea, form of expression or information in any format such as text, images, audio and video;

(4)

‘illegal content’ means any content which is not in compliance with Union law or the law of a Member State in which it is hosted;

(5)

‘content management’ means the moderation and curation of content on content hosting platforms;

(6)

‘content moderation’ means the practice of monitoring and applying a pre-determined set of rules and guidelines to content generated, published or shared by users, in order to ensure that the content complies with legal and regulatory requirements, community guidelines and terms and conditions, as well as any resulting measure taken by the platform, such as removal of content or the deletion or suspension of the user’s account, be it through automated means or human operators;

(7)

‘content curation’ means the practice of selecting, optimising, prioritising and recommending content based on individual user profiles for the purpose of its display on a website or application;

(8)

‘terms and conditions’ means all terms, conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the content hosting platform and its users and which are unilaterally determined by the content hosting platform;

(9)

‘user’ means a natural or legal person that uses the services provided by a content hosting platform or interacts with content hosted on such a platform;

(10)

‘uploader’ means a natural or legal person that adds content to a content hosting platform irrespective of its visibility to other users;

(11)

‘notice’ means a formalised notification contesting the compliance of content with legal and regulatory requirements, community guidelines and terms and conditions.

Article 4

Principles for content management

1.   Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, proportionate to the type and volume of content, relevant and limited to what is necessary in relation to the purposes for which the content is managed. Content hosting platforms shall be accountable for ensuring that their content management practices are fair, transparent and proportionate.

2.   Users shall not, for the purposes of content moderation by content hosting platforms, be subjected to discriminatory practices, exploitation or exclusion, such as the removal of user-generated content on grounds of appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or the upbringing of children, language or social class.

3.   Content hosting platforms shall provide the users with sufficient information on their content curation profiles and the individual criteria according to which content hosting platforms curate content for them, including information as to whether algorithms are used and their objectives.

4.   Content hosting platforms shall provide users with an appropriate degree of influence over the curation of content made visible to them, including the choice of opting out of content curation altogether. In particular, users shall not be subject to content curation without their freely given, specific, informed and unambiguous prior consent.

Article 5

Structured risk dialogue on content management

As part of a structured risk dialogue with the European entity, together with the relevant national authorities, content hosting platforms with significant market power shall present a biannual report to the European entity on the fundamental rights impact of their content management policies, on their management of the related risks and on how they mitigate those risks.

Article 6

Transparency obligation

1.   Digital service providers shall take the measures necessary to enable the disclosure of the funding of any interest groups with which the users of the providers’ digital services are associated, and of details of the nature of the relationship between such interest groups and users. Such disclosure shall enable the person who is legally responsible to be identified.

2.   Commercial digital service providers established outside the Union shall designate a legal representative for the purposes of user interests within the Union and make the contact information of that representative visible and accessible on their online platforms.

Article 7

Eligibility for issuing notices

1.   Any natural or legal person or public body to which content is provided through a website, application or other form of software shall have the right to issue a notice pursuant to this Regulation.

2.   Member States shall provide for penalties where a person acting for purposes relating to their trade, business, craft or profession systematically and repeatedly submits wrongful notices. Such penalties shall be effective, proportionate and dissuasive.

Article 8

Notice procedures

Content hosting platforms shall include in their terms and conditions clear, accessible, intelligible and unambiguous information regarding notice procedures, in particular:

(a)

the maximum period within which the uploader of the content in question is to be informed about a notice procedure;

(b)

the period within which the uploader can launch an appeal;

(c)

the deadline for the content hosting platform to treat a notice expeditiously and take a decision;

(d)

the deadline for the content hosting platform to inform both parties about the outcome of the decision including a justification for the action taken.

Article 9

Content of notices

1.   A notice regarding content shall include at least the following information:

(a)

a link to the content in question and, where appropriate, such as regarding video content, a timestamp;

(b)

the reason for the notice;

(c)

evidence supporting the claim made in the notice;

(d)

a declaration of good faith from the notifier; and

(e)

in the event of a violation of personality rights or intellectual property rights, the identity of the notifier.

2.   In the event of violations referred to in point (e) of paragraph 1, the notifier shall be the person concerned by the violation of personality rights, or the holder of the intellectual property rights that were violated, or someone acting on behalf of that person.

Article 10

Information to the uploader

1.   Upon a notice being issued, and before any decision on the content has been made, the uploader of the content in question shall receive the following information:

(a)

the reason for the notice and for the action the content hosting platform might take;

(b)

sufficient information about the procedure to follow;

(c)

information on the right of reply laid down in paragraph 3; and

(d)

information on the available remedies in relation to false notices.

2.   The information required under paragraph 1 shall not be provided if the content hosting platform has been informed by public authorities about ongoing law enforcement investigations.

3.   The uploader shall have the right to reply to the content hosting platform in the form of a counter-notice. The content hosting platform shall consider the uploader’s reply when taking a decision on the action to be taken.

Article 11

Decisions on notices

1.   Content hosting platforms shall ensure that decisions on notices are taken by qualified staff without undue delay, following the necessary investigations.

2.   Following a notice, content hosting platforms shall, without delay, decide whether to remove, take down or disable access to content that was the subject of a notice, if such content does not comply with legal requirements. Without prejudice to Article 14(2), the fact that a content hosting platform has deemed specific content to be non-compliant shall in no case automatically lead to content by another user being removed, taken down or being made inaccessible.

Article 12

Information about decisions

Once a content hosting platform has taken a decision, it shall inform all parties involved in the notice procedure about the outcome of the decision, providing the following information in a clear and simple manner:

(a)

the reasons for the decision taken;

(b)

whether the decision was made solely by a human or supported by an algorithm;

(c)

information about the possibility for review as referred to in Article 13 and judicial redress for either party.

Article 13

Review of decisions

1.   Content hosting platforms may provide a mechanism allowing users to request a review of the decisions those platforms take.

2.   Content hosting platforms with significant market power shall provide the review mechanism referred to in paragraph 1.

3.   In all cases, the final decision on the review shall be taken by a human.

Article 14

Removal of content

1.   Without prejudice to judicial or administrative orders regarding content online, content that has been the subject of a notice shall remain visible while the assessment of its legality is still pending.

2.   Content hosting platforms shall act expeditiously to make unavailable or remove content which is manifestly illegal.

Article 15

Independent dispute settlement

1.   Member States shall provide independent dispute settlement bodies for the purpose of offering quick and efficient extra-judicial recourse where decisions on content moderation are appealed against.

2.   The independent dispute settlement bodies shall be composed of independent legal experts with the mandate to adjudicate disputes between content hosting platforms and users concerning the compliance of the content in question with legal and regulatory requirements, community guidelines and terms and conditions.

3.   The referral of a dispute regarding content moderation to an independent dispute settlement body shall not preclude a user from being able to have further recourse in the courts unless the dispute has been settled by common agreement.

4.   Content hosting platforms with significant market power shall contribute financially to the operating costs of the independent dispute settlement bodies through a dedicated fund managed by the European entity, in order to assist the Member States in financing those bodies. Member States shall ensure the independent dispute settlement bodies are provided with adequate resources to ensure their competence and independence.

Article 16

Procedural rules for independent dispute settlement

1.   The uploader, as well as a third party with a legitimate interest in acting, such as an ombudsperson, shall have the right to refer a case of content moderation to the competent independent dispute settlement body in the event that a content hosting platform has decided to remove, take down or disable access to content, or otherwise to act in a manner that is contrary to the action preferred and expressed by the uploader, or that constitutes an infringement of fundamental rights.

2.   Where the content hosting platform has decided not to take down content that is the subject of a notification, the notifier shall have a right to refer the matter to the competent independent dispute settlement body, provided that the notifier would have legal standing in a civil procedure regarding the content in question.

3.   As regards jurisdiction, the competent independent dispute settlement body shall be that located in the Member State in which the content that is the subject of the dispute has been uploaded. Natural persons shall be allowed in all cases to bring complaints to the independent dispute settlement body of their Member State of residence.

4.   Where the notifier has the right to refer a case of content moderation to an independent dispute settlement body in accordance with paragraph 2, the notifier may refer the case to the independent dispute settlement body located in the Member State of habitual residence of the notifier or the uploader, if the latter is using the service for non-commercial purposes.

5.   Where a case of content moderation relating to the same question is the subject of a referral to another independent dispute settlement body, the independent dispute settlement body may suspend the procedure as regards the referral. Where a question of content moderation has been the subject of recommendations by an independent dispute settlement body, the independent dispute settlement body may decline to treat a referral.

6.   The Member States shall lay down all other necessary rules and procedures for the independent dispute settlement bodies within their jurisdiction.

Article 17

Personal data

Any processing of personal data carried out pursuant to this Regulation shall be carried out in accordance with Regulation (EU) 2016/679 of the European Parliament and of the Council (4) and Directive 2002/58/EC of the European Parliament and of the Council (5).

Article 18

Reporting of breaches and protection of reporting persons

Directive (EU) 2019/1937 shall apply to the reporting of breaches of this Regulation and to the persons reporting such breaches.

Article 19

Amendments to Directive (EU) 2019/1937

Directive (EU) 2019/1937 is amended as follows:

(1)

in point (a) of Article 2(1), the following point is added:

‘(xi)

online content management;’;

(2)

in Part I of the Annex, the following point is added:

‘K.

Point (a)(xi) of Article 2(1) — online content management.

Regulation [XXX] of the European Parliament and of the Council on contractual rights as regards content management.’.

Article 20

Reporting, evaluation and review

1.   Member States shall provide the Commission with all relevant information regarding the implementation and application of this Regulation. On the basis of the information provided and of public consultation, the Commission shall, by … [three years after entry into force of this Regulation], submit a report to the European Parliament and to the Council on the implementation and application of this Regulation and consider the need for additional measures, including, where appropriate, amendments to this Regulation.

2.   Without prejudice to reporting obligations laid down in other Union legal acts, Member States shall, on an annual basis, submit the following statistics to the Commission:

(a)

the number of disputes referred to independent dispute settlement bodies and the types of content that were the subject of disputes;

(b)

the number of cases settled by the independent dispute settlement bodies, categorised according to outcome.

Article 21

Entry into force

This Regulation shall enter into force on the twentieth day following that of its publication in the Official Journal of the European Union.

It shall apply from XX.

This Regulation shall be binding in its entirety and directly applicable in all Member States.

Done at …,

For the European Parliament

The President

For the Council

The President


(1)  Directive (EU) 2019/1937 of the European Parliament and of the Council of 23 October 2019 on the protection of persons who report breaches of Union law (OJ L 305, 26.11.2019, p. 17).

(2)  When determining the number of users, the Commission should take into account the situation of SMEs and start-ups.

(3)  Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).

(4)  Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).

(5)  Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).

