Brussels, 15.12.2020

SWD(2020) 348 final

COMMISSION STAFF WORKING DOCUMENT

IMPACT ASSESSMENT

Accompanying the document

PROPOSAL FOR A REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL

on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC

{COM(2020) 825 final} - {SEC(2020) 432 final} - {SWD(2020) 349 final}


Table of contents

1. Introduction: political and legal context

2. Problem definition

2.1. Context and scope

2.2. What are the problems?

2.3. What are the problem drivers?

2.4. How will the problem evolve?

2.5. Problem tree

3. Why should the EU act?

3.1. Legal basis

3.2. Subsidiarity: Necessity of EU action

3.3. Subsidiarity: Added value of EU action

4. Objectives: What is to be achieved?

4.1. General objectives

4.2. Specific objectives

4.3. Intervention logic

5. What are the available policy options?

5.1. What is the baseline from which options are assessed?

5.2. Description of the policy options

5.3. Options discarded at an early stage

6. What are the impacts of the policy options?

6.1. Economic impacts

6.2. Social impacts

6.3. Impacts on fundamental rights

6.4. Environmental impacts

7. How do the options compare?

7.1. Criteria for comparison

7.2. Summary of the comparison

8. Preferred option

9. REFIT (simplification and improved efficiency)

10. How will actual impacts be monitored and evaluated?

Glossary

Term or acronym

Meaning or definition

Collaborative economy platform

an online platform providing an open marketplace for the temporary use of goods or services, often supplied by private individuals. Examples include temporary accommodation platforms and ride-hailing or ride-sharing services.

Competent authorities

the competent authorities designated by the Member States in accordance with their national law to carry out tasks which include tackling illegal content online, including law enforcement authorities and administrative authorities charged with enforcing law, irrespective of the nature or specific subject matter of that law, applicable in certain particular fields.

Content provider

a user who has submitted information that is, or that has been, stored at his or her request by a hosting service provider.

CSAM

Child Sexual Abuse Material, for the purposes of this IA refers to any material defined as ‘child pornography’ and ‘pornographic performance’ in Directive 2011/93/EU

Digital service

used here as a synonym for an information society service – see definition below

Erroneous removal

the removal of content, goods or services offered online where such removal was not justified by the illegal nature of the content, goods, or services, or the terms and conditions of the online service, or any other reason justifying the removal of content, goods or services. 

FTE

Full time equivalent

Harmful behaviours/activities online

while some behaviours are prohibited by law at EU or national level (see definitions of illegal content and illegal goods), other behaviours can result in diverse types of harm without being illegal as such. A case in point is coordinated disinformation campaigns, which may lead to societal impact or individual harm under certain conditions. Some content can also be particularly damaging for vulnerable categories of users, such as children, but not for the general public. Such notions remain, to a certain extent, subjective.

Hosting service provider

a provider of information society services consisting of the storage of information provided by the recipient of the service at their request; examples include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services

Illegal content

any information which is not in compliance with Union law or the law of a Member State concerned;

Illegal activity

any activity which is not in compliance with Union law or the law of a Member State concerned;

Illegal goods or services

refer to the illegal sale of goods or services, as defined in EU or national law. Examples include the sale of counterfeit or pirated goods, of dangerous or non-compliant products (i.e. food or non-food products which do not comply with the health, safety, environmental and other requirements laid down in European or national law), of products which are illegally marketed, of endangered species.

Illegal hate speech

the following serious manifestations of racism and xenophobia, which must constitute an offence in all EU countries:

(a) public incitement to violence or hatred in respect of a group of persons or a member of such a group defined by reference to colour, race, religion or national or ethnic origin;

(b) public condoning, for a racist or xenophobic purpose, of crimes against humanity and human rights violations;

(c) public denial of the crimes defined in Article 6 of the Charter of the International Military Tribunal appended to the London Agreement of 8 April 1945 insofar as it includes behaviour which is contemptuous of, or degrading to, a group of persons defined by reference to colour, race, religion or national or ethnic origin;

(d) public dissemination or distribution of tracts, pictures or other material containing expressions of racism and xenophobia;

(e) participation in the activities of groups, organisations or associations which involve discrimination, violence, or racial, ethnic or religious hatred.

Information Society Service

a service ‘normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services’, as defined in Directive (EU) 2015/1535. The definition covers a very large category of services, from simple websites, to online intermediaries such as online platforms, or internet access providers.

Very large online platforms

online platforms with a significant societal and economic impact, reaching at least 10% of the EU population (approximately 45 million users) among their monthly users.

Law enforcement authorities

the competent authorities designated by the Member States in accordance with their national law to carry out law enforcement tasks for the purposes of the prevention, investigation, detection or prosecution of criminal offences, including in connection to illegal content online;

Notice

any communication to a hosting service provider that gives the latter knowledge of a particular item of illegal content that it transmits or stores and therefore creates an obligation for it to act expeditiously by removing the illegal content or disabling/blocking access to it. Such an obligation only arises if the notice provides the internet hosting service provider with actual awareness or knowledge of illegal content.

Online platforms

a variety of ‘hosting service providers’ such as social networks, content-sharing platforms, app stores, online marketplaces, ride-hailing services, and online travel and accommodation platforms. Such services are generally characterised by their intermediation role between different sides of the market – such as sellers and buyers, accommodation service providers, or content providers – and often intermediate access to user-generated content.

Online intermediary service

a digital service that consists of transmitting or storing content provided by a third party. The E-Commerce Directive distinguishes three types of intermediary services: mere conduit (transmission of data, e.g. by an internet access provider), caching (i.e. automatically making temporary copies of web data to speed up technical processes) and hosting

Recommender systems

the algorithmic systems used by online platforms to give prominence to content or offers, facilitating their discovery by users. Recommender systems follow a variety of criteria and designs, sometimes personalised for users based on their navigation history, profiles, etc., other times based purely on content similarity or ratings.

Trusted flagger/third party

an individual or entity which is considered by a hosting service provider to have particular expertise and responsibilities for the purposes of tackling illegal content online;

Users

Refers, throughout the report, to any natural or legal person who is the recipient of a digital service 

1. Introduction: political and legal context

1. The President of the Commission announced a new Digital Services Act as one of her political priorities 1 and as a key measure in her agenda for shaping Europe’s digital future 2 , to establish a fair and competitive digital economy and to build an open, democratic and sustainable society. The Digital Services Act, together with the Digital Markets Act, is intended as a comprehensive package of measures for the provision of digital services in the European Union, and seeks to address in particular the challenges posed by online platforms.

2. In the Digital Services Act, which is underpinned by this impact assessment report, the intervention focuses on deepening the single market for digital services and establishing clear responsibilities for online platforms and other intermediary services to protect users from the risks these services pose, such as illegal activities online and risks to their fundamental rights. The Digital Markets Act complements these provisions and focuses on the gatekeeper role of, and unfair practices by, a prominent category of online platforms.

3. Digital services have become an important backbone of the digital economy and have deeply contributed to societal transformations in the EU and across the world. At the same time, they also raise significant new challenges. It is for this reason that updating the regulatory framework for digital services has become a priority, not only in the European Union, but also around the globe.

4. In the Communication ‘Shaping Europe’s Digital Future’ 3 , the Commission made a commitment to update the horizontal rules that define the responsibilities and obligations of providers of digital services, and online platforms in particular.

5. Both the European Parliament and the Council of the European Union share the sense of urgency to establish at EU level a renewed and visionary framework for digital services. The European Parliament proposed three own-initiative reports, focusing on specific aspects of the provision of digital services: considerations for the single market, responsibilities of online platforms for tackling illegal content, and protection of fundamental rights online. 4 The Council’s Conclusions 5 welcomed the Commission’s announcement of a Digital Services Act, emphasised ‘the need for clear and harmonised evidence-based rules on responsibilities and accountability for digital services that would guarantee internet intermediaries an appropriate level of legal certainty’, and stressed ‘the need to enhance European capabilities and the cooperation of national authorities, preserving and reinforcing the fundamental principles of the Single Market and the need to enhance citizens’ safety and to protect their rights in the digital sphere across the Single Market’. The call was reiterated in the Council’s Conclusions of 2 October 2020 6 .

6. Governments and legislators are not the only ones to have expressed the need to respond to the changes in the digital landscape. The nearly 3000 contributions received in response to the most recent open public consultation on this initiative highlight the significant public interest in re-imagining how digital services influence our daily lives.

7. The challenge of addressing the changed and increasingly complex ecosystem of digital services is not only an EU endeavour, but is also prominent at international level. It is discussed at the UN, the Council of Europe, the OSCE, the WTO and the OECD, and is regularly on the agenda of G7/G20 meetings. It is also high on the agenda of many third-country jurisdictions across the world.

8. The EU has a wide range of trade commitments in sectors covering digital services. This initiative will be in full compliance with the EU’s international obligations, notably in the multilateral agreements in the World Trade Organisation and in its regional trade agreements.

9. Besides the Treaty provisions, the basic framework regulating the provision of digital services in the internal market is defined in the E-Commerce Directive dating from 2000. The goal of that directive is to allow borderless access to digital services across the EU and to harmonise the core aspects for such services, including information requirements and online advertising rules, as well as setting the framework for the liability regime of intermediary services – categorised as ‘mere conduits’, ‘caching services’, and ‘hosting services’ – for third party content.

10. Since then, the nature, scale and importance of digital services for the economy and society have changed dramatically. Business models which emerged with large online platforms, such as social networks or marketplaces, have changed the landscape of digital services in the EU. These services are now used by a majority of EU citizens on a daily basis, and are based on multi-sided business models underpinned by strong network effects.

11. In response to the evolving digital landscape, several service-specific and sector-specific legal acts have complemented the E-Commerce Directive by regulating different issues concerning the provision of digital services, such as revised data protection rules, copyright rules, and rules concerning audiovisual services or the consumer acquis.

12. The Court of Justice of the EU has contributed to the uniform interpretation and application of the E-Commerce Directive, by interpreting and reaffirming its core principles in the context of new digital services and technologies.

13. More recently, the Commission has also taken a series of targeted measures, both legislative 7 and self-regulatory 8 , as well as coordinated enforcement actions in the framework of the Consumer Protection Cooperation Regulation (CPC) 9 , to address the spread of certain types of illegal activities online, such as infringements of copyright-protected content, practices infringing EU consumer law, dangerous goods, illegal hate speech, terrorist content, or child sexual abuse material. These targeted measures do not, however, address the systemic risks posed by the provision and use of digital services, nor the re-fragmentation of the single market and the competition imbalances brought about by the emergence of very large digital service providers on a global scale.

14. This impact assessment explores the changed nature, scale and influence of digital services, in particular online platforms. The assessment tracks key drivers which have led to societal and economic challenges posed by the digital services ecosystem and outlines the options to address them and improve the functioning of the digital single market. This Impact Assessment builds on the evaluation 10 of the E-Commerce Directive 11 , annexed to the report.

2. Problem definition

2.1. Context and scope

15. Digital services 12 have been defined as ‘services normally provided against remuneration, at a distance, by electronic means and at the individual request of a recipient of services’. This definition covers in principle a wide scope of very diverse services, including:

- apps, online shops, e-games, online versions of traditional media (newspapers, music stores), Internet-of-Things applications, some smart cities’ services, online encyclopaedias, payment services, online travel agents, etc., but also

- services provided by ‘online intermediaries’, ranging from the very backbone of the internet infrastructure, with internet service providers, cloud infrastructure services, content distribution networks, to messaging services, online forums, online platforms (such as app stores, e-commerce marketplaces, video-sharing and media-sharing platforms, social networks, collaborative economy platforms etc.) or ads intermediaries.

16. All of these services have evolved considerably over the past 20 years as many new ones have appeared. The landscape of digital services continues to develop and change rapidly along with technological transformations and the increasing availability of innovations 13 .

17. For e-commerce alone (for services and goods sold online), the increase has been steady over the past 20 years. Today around 20% of European businesses are involved in e-commerce. Out of those who sell goods online, 40% are using online marketplaces to reach their customers. 14 Whereas in 2002, shortly after the entry into force of the E-Commerce Directive, only 9% of Europeans were buying goods online, over 70% shop online today. 15

18. A study conducted for the European Parliament 16 emphasises the strategic importance of e-commerce and digital services in boosting the opportunities for SMEs to access new markets and new consumer segments, accelerating their growth, affording lower prices for consumers (2% to 10% advantage compared to offline sales), and enhancing territorial cohesion in the Union, blurring geographic dependencies between markets. The study estimates overall welfare gains from e-commerce to be between 0.3 and 1.7% of EU-27 GDP.

19. While some online platforms did exist at the end of the 1990s, their scale, reach and business models were in no way comparable to their current influence on the market and the functioning of our societies. In 2018, 76% of Europeans said 17 that they were regular users of video-sharing or music streaming platforms, 72% shopped online and 70% used social networks. With the advent of online platforms, many more economic activities were opened to online consumption, such as transport services and short-term accommodation rental, but also media production and consumption, and important innovations were brought about by user-generated content.

20. Online advertising services are an area of particular evolution over the past 20 years: whereas online commercial communications started with simple email distribution lists, they are now an enormous industry 18 , with several types of intermediaries involved in the placement of ads.

21. The evolution is not limited to consumer-facing digital services, far from it. In particular, as regards the online intermediaries providing the technical infrastructure of the internet, technological developments and improvements in capabilities have been staggering. The core internet infrastructure provided by internet access services and DNS operators is now also supported by other types of technical services, such as content delivery networks (CDNs) or cloud infrastructure services. They are all fundamental for any other web application to exist, and their actions have a major impact on core access to internet services and information. The resilience, stability and security of core services such as the DNS are a precondition for digital services to be effectively delivered to and accessed by internet users.

22. While online platforms present particular opportunities and concerns and are most prominently referred to, all these intermediary services have a strategic importance for the development of virtually all sectors in the European economy and, increasingly so, in social interactions and societal transformations. There are several key observations to note:

23. First, digital services are inherently cross-border services. The ability to provide and access any digital service from anywhere in the Union is increasingly a feature citizens expect, while also expecting to be well protected from illegal content and activities. This raises the stakes when barriers arise for the provision of digital services, in particular to maintain a rich, diverse, and competitive landscape of digital services that can thrive in the EU.

24. Second, online intermediary services form the vital backbone of the internet (e.g. internet access services, cloud infrastructure, DNS) and are agile innovators and early adopters of new technologies (from the internet of things to artificial intelligence). They are a strategic sector for the European economy, and a core engine of the digital transformation.

25. Third, the particular business model of online platforms has emerged over the last two decades, connecting users with suppliers of goods, content or services. These online platforms are often characterised as multi-sided markets benefiting from very strong network effects: the value of the platform service increases rapidly as the number of users increases.

26. Fourth, while such platforms are traditionally major innovators in terms of services and products, they have now become a source of new risks and challenges for their users and society at large.

27. Fifth, while there are approximately 10,000 19 micro, small or medium-sized online platforms, millions of users concentrate around a small number of very large online platforms, be it in e-commerce, social networks, video-sharing platforms, etc. This transforms such very large platforms into de facto public spaces: for businesses to find consumers, for authorities, civil society or politicians to connect with citizens, and for individuals to receive and impart information.

28. Such large platforms have come to play a particularly important role in our society and our economy, different in scale and scope from that of other similar services with lower reach. The way they organise their services has a significant impact, e.g. on the offer of illegal goods and content online, as well as in defining ‘choice architecture’ that determines the options that users have in accessing goods, content, or services online.

29. Finally, societal trends related to how we use technology, work, learn or shop are changing rapidly. While these trends were already unfolding before the COVID-19 outbreak, we are seeing an acceleration of digitalisation that is likely to lead to a ‘new normal’ after the COVID-19 crisis and an even more important role for digital services in our daily lives in the future. Online sales of basic goods alone have grown by 50% on average in Europe 20 since the onset of the pandemic. At the same time, the crisis has exposed the weaknesses of our reliance on digitalisation, as we have seen an important growth in platform-enabled crime, such as COVID-19-related scams and the exchange of child sexual abuse material 21 .

30. Against this background, the Impact Assessment covers all types of online intermediaries, with a particular focus on online platforms and the risks and harms they may represent, and the challenges they are facing in the Single Market.

2.2. What are the problems?

31. This Impact Assessment analyses the three core issues related to the governance of digital services in the European single market, as follows:

Table 1 Summary of main problems and scope

Problem 1: Serious societal and economic risks and harms of online intermediaries: illegal activities online, insufficient protection of fundamental rights, and other emerging risks

- Main types of digital services concerned: for illegal activities and risks to fundamental rights, all types of online intermediaries, with particular impacts where online platforms are concerned; other emerging risks relate primarily to online platforms

- Other stakeholders primarily affected: citizens and consumers; businesses prejudiced by illegal activities; law enforcement

Problem 2: Ineffective supervision of services and insufficient administrative cooperation, creating hurdles for services and weakening the single market

- Main types of digital services concerned: mostly the supervision of online platforms, with particular challenges where platforms cover a large part of the single market

- Other stakeholders primarily affected: citizens; national authorities

Problem 3: Legal barriers for services, preventing smaller companies from scaling up and creating advantages for large platforms equipped to bear the costs

- Main types of digital services concerned: in particular online platforms, as primarily targeted by the legal fragmentation, but also other online intermediaries

- Other stakeholders primarily affected: businesses depending on online intermediaries

32. The Impact Assessment builds on the evaluation of the E-Commerce Directive in Annex 5. This evaluation reaches the following main conclusions.

Box 1: Main conclusions and issues emerging from the Evaluation Report

First, the evaluation concludes that the core principles of the E-Commerce Directive regulating the functioning of the internal market for digital services remain very much valid today. The evaluation shows that the directive enabled growth and accessibility of digital services cross-border in the internal market. This concerns all layers of the internet and the web and has enabled successful entry and growth of many EU companies in different segments of the market.

At the same time, the evaluation points to clear evidence of legal fragmentation and differentiated application of the existing rules by Member States, and ultimately by national courts. There is also an increased tendency of Member States to adopt legislation with extraterritorial effects and to enforce it against service providers not established in their territory. Such enforcement consequently reduces trust between competent authorities and undermines the well-functioning of the internal market as well as the existing cooperation mechanisms.

In this context, the evaluation also shows that Member States make little use of the cooperation mechanism provided for in the E-Commerce Directive. The existing mechanism, while still considered very relevant and important overall for the functioning of the single market for digital services, requires a more effective set-up to ensure trust between Member States and the effective supervision and sanctioning of digital services posing particular challenges, such as online platforms.

Second, the evaluation concludes that the liability regime for online intermediaries continues to constitute the key regulatory pillar enabling the conditions for the existence and growth of intermediary services, as well as for a fair balance in the protection of fundamental rights online. If, in 1996, the Commission signalled that the objective when discussing the liability and responsibilities of intermediaries in respect of stored user content was to design a legal regime that assists ‘host service providers, whose primary business is to provide a service to customers, to steer a path between accusations of censorship and exposure to liability’ 22 , that objective remains equally valid today.

The evaluation shows that the liability regime for online intermediaries provided the necessary minimum of legal certainty for online intermediaries, as initially pursued. However, conflicting interpretations in national court cases (sometimes even within the same Member State) have introduced a significant level of uncertainty; in addition, an increasing fragmentation of the single market raises barriers for EU scale-ups to emerge. Furthermore, the evaluation also shows that the relevant provisions have only partially achieved the balancing objective of protecting fundamental rights. They provide stronger incentives for the removal of content than for the protection of legal content, and also lack appropriate oversight and due process mechanisms, especially in situations where the subsequent action is taken by private sector entities rather than public sector authorities.

In addition, the existing categories defining online intermediaries are somewhat outdated, in particular in light of the evolution of services and the underlying technology. Some providers exercise a clear influence over the hosted content, confusing users as to the identity or origin of the goods or services they view – blurring the line of what is expected from an intermediary. Finally, without prejudice to the exemption from liability, the current framework lacks the necessary due diligence obligations as regards third-party content to ensure that the risks brought by the dissemination of illegal content, goods or services online are appropriately addressed.

Third, the evaluation shows that a series of transparency and consumer-facing provisions 23 included in the Directive are still relevant. These provisions set the minimum conditions for consumer trust and the provision of digital services, and have been largely complemented – but not overwritten – by a rich corpus of further rules and harmonisation measures in areas such as consumer protection and the conclusion of contracts at a distance, including by online means. This is not to say there are no challenges: several enforcement actions by the Consumer Protection Cooperation (CPC) Network show that some provisions, such as basic information requirements, suffer from a patchy and diverging application in practice. Furthermore, the fundamental changes in the variety and scale of information society services, as well as in the technologies deployed and in online behaviour, have led to the emergence of new challenges, not least in terms of the transparency of online advertising and of the algorithmic decision-making to which consumers and businesses are subject.

33. The following sub-sections present in more detail the problems identified and their causes, as well as the expected evolution of the problems.

Serious risks and harms brought by digital services

European citizens are exposed to increasing risks and harms online – from the spread of illegal activities to infringements of fundamental rights and other societal harms. These issues are widespread across the online ecosystem, but they are most impactful where very large online platforms are concerned, given their wide reach and audiences. Such platforms today play a systemic role in amplifying and shaping information flows online. Their design choices have a strong influence on user safety online, the shaping of public opinion and discourse, and online trade. Such design choices can cause societal concerns, but are generally optimised to benefit the often advertising-driven business models of platforms. In the absence of effective regulation and enforcement, platforms set the rules of the game, without effectively mitigating the risks and the societal and economic harm they cause.

a) Illegal activities online

34. The use of digital services, and the opportunities these services provide for electronic commerce and information sharing, is now present throughout society and the economy. Correspondingly, the misuse of these services for illegal activities has also expanded significantly. This includes illegal activities, as defined at both European and national level, such as:

- the sale of illegal goods, such as dangerous goods, unsafe toys, illegal medicines, counterfeits, scams and other consumer protection infringing practices, or even wildlife trafficking, illegal sale of protected species, etc.;

- the dissemination of illegal content such as child sexual abuse material, terrorist content, illegal hate speech and illegal ads targeting individuals, IPR infringing content, etc.;

- the provision of illegal services such as non-compliant accommodation services on short-term rental platforms, illegal marketing services, services infringing consumer protection provisions, or non-respect of extended producer responsibility obligations.

Scale of the spread of illegal content and activities

35. The scale of the spread of illegal activities varies, and the data available for accurately measuring these phenomena is scarce. Quantitative indications are generally only available as approximations, usually based on detected crimes. As a result, the actual occurrence of illegal activities online is expected to be higher than the reported indicators suggest, as many activities are likely to go unreported. At the same time, large online platforms in particular regularly release reports including content removal figures. Even though such removals are usually based on private community standards, covering not only illegal content but also harmful content and other content breaching the terms of service, the reported numbers can give upper-bound indications.

Box 2: Scale of illegal activities: some examples

It is estimated that total imports of counterfeit goods in Europe amounted to EUR 121 billion in 2016 24 , and 80% of products detected by customs authorities involved small parcels 25 , assumed to have been bought online internationally through online marketplaces or sellers’ direct websites. Consumers increasingly buy from producers based outside of Europe (from 14% in 2014 to 27% in 2019). 26

For dangerous products, the Rapid Alert System for dangerous non-food products (Safety Gate/RAPEX) registers between 1850 and 2250 notifications from Member States per year 27 . In 2019, around 10% were confirmed to be also related to online listings, while the availability of such products online is very likely higher. Consumer organisations reported on investigations in which known non-compliant goods were made available via online marketplaces without any checks, detection, or hindrance 28 . In this regard, the COVID-19 crisis has also cast a spotlight on the proliferation of illegal goods online, breaching EU safety and protection requirements or even bearing false certificates of conformity 29 , especially coming from third countries. The coordinated action of the CPC authorities targeting scams related to COVID-19 obliged online platforms to remove millions of misleading offers aimed at EU consumers 30 .

When it comes to categories of illegal content online, for child sexual abuse material the US hotline which processes the largest number of reports, the National Centre for Missing and Exploited Children, has seen significant growth in reports globally, reaching 16.9 million in 2019, double the 8.2 million received in 2016 31 . This trend is confirmed by the EU network of hotlines, INHOPE, which indicates that the number of images processed between 2017 and 2019 almost doubled 32 . It is important to note that reports can contain multiple images and that illegality is subject to verification by the clearing houses; INHOPE statistics show that upwards of 70% of images reported are illegal.

For illegal hate speech, it is particularly difficult to estimate the volumes and spread of content, not least since most of the information available refers to platforms’ own definitions of hate speech and not to legal definitions, such as the EU-level reference 33 . As an example, Facebook reported 34 to have taken action in April-June 2019 against 4.4 million pieces of content considered hate speech according to the definition of its community standards 35 and, comparatively, 22.5 million in the same period in 2020. Further, even where minimum standards were set for reporting hate speech under national legislation, such as the NetzDG 36 in Germany, individual companies’ implementation renders the data non-comparable: where Twitter reported nearly 130,000 reports per million users, Facebook recorded only 17 reports per million users, a clear indication that the numbers do not adequately reflect the scale of the issue 37

36To better contextualise the online component of such illegal activities, the Commission ran a Flash Eurobarometer survey 38 among a random sample of over 30,000 internet users in all Member States, testing user perception of the frequency and scale of illegal activities or information online. 60% of respondents thought they had seen some sort of illegal content online at least once. 41% had experienced scams, frauds or other illegal commercial practices. 30% thought they had seen hate speech (according to their personal understanding of the term), 27% had seen counterfeit products and 26% had seen pirated content. These categories are consistently the highest in all Member States, with some variations.

 

Figure 2: Most frequently seen types of illegal content on online platforms. Flash Eurobarometer on illegal content online, 2018 (N= 32,000 respondents)

Services concerned in the spread of illegal activities are diverse in nature and size

37There are several ways through which digital services contribute to illegal activities online. First, digital service providers (e.g. websites of online shops, content apps, gambling services, online games) can infringe the law themselves, frequently by misleading and scamming consumers, or by selling illegal products. This remains a persistent problem and is, to a large extent, an issue of enforcement: almost 80% of all notifications and assistance requests sent by Member States for cross-border issues concern infringements by such online services 39 .

38Second, with the increased use of online platforms, more opportunities for disseminating and amplifying the dissemination of illegal content, goods or services have emerged. Perpetrators use these services, from hosting content on file sharing services, to disseminating hyperlinks through the most used social network platforms where the widest audiences can be reached 40 . Further, such services are themselves built for optimising access to content or commercial offers, respectively, and their systems can be manipulated and abused to drive users more easily towards illegal goods, content or services. This is even more acutely the case where very large online platforms are concerned, where the highest numbers of users can be reached and where the amplification of illegal content and activity is consequently most impactful. These very large online platforms lack the necessary incentives and oversight to guarantee users’ safety and privacy and to prevent deceptive and fraudulent practices.

39Challenges in addressing the scale and the spread of illegal goods, services and content are further amplified by the accessibility in the Union of services offered by providers established in third countries, which are currently not bound by the E-Commerce Directive. 41

Box 3: Examples of misuse of online intermediary services for disseminating illegal content

According to INHOPE 42 , 84% of child sexual abuse material (CSAM) is shared through image hosting websites, 7% through file hosts, 5% on other websites and 4% through other services, including social networking sites, forums or banner sites. NCMEC data shows that the highest shares of reported content come from Facebook and its subsidiaries, including its private messaging services, largely due to the fact that Facebook takes active steps to find CSAM. It is expected that large volumes of CSAM are also shared on a variety of other services of different sizes.

For terrorist content, the 2018 Impact Assessment accompanying a proposal for a Regulation on Terrorist Content 43 contained some relevant data. Out of some 150 companies to which Europol had sent referrals, almost half offered file hosting and sharing services (mostly micro-enterprises), and the rest were mainstream social media, web hosting services, as well as online media sharing platforms (both big and medium-sized enterprises). Overall, one out of ten companies was a medium or large enterprise, whereas the rest were small and micro enterprises. In terms of the volume of content, 68% of Europol referrals were addressed to micro, small and unregistered companies in 2017.

b)Emerging systemic societal risks posed by online platforms

40Online platforms pose particular risks, different in their nature and scale from other digital services. With the staggering volumes of information and commercial offers available online, platforms have become important players in the ‘attention economy’ 44 . Very large platforms now have a systemic role in amplifying and shaping information flows online for the largest part of EU citizens, businesses and other organisations. This is at the core of the platform business model: matching users with, presumably, the most relevant information for them, and optimising the design to maximise the company’s profits (through advertising or transactions, depending on the type of platform).

41At the same time, their design choices have a strong influence on user safety online, the shaping of public opinion and discourse, as well as on online trade. Illegal content shared through such platforms can be amplified to reach wide audiences. 45 Particular challenges emerge where content is disseminated at significant speed and scale across platforms, as was the case with the terrorist attack in Christchurch 46 , with the potential to incite further violence and cause severe damage to the victims and their families.

42Risks, however, go beyond the spread of illegal activities. Negative effects also stem from the manipulation of platforms’ systems to amplify, oftentimes through coordinated attacks and inauthentic behaviour, certain messages or behaviours online. Such practices lead to a deliberate misuse of the platforms’ systems for incitement to violence or self-harm (harmful in particular to children and in the context of gender-based online violence), conspiracy theories 47 , disinformation related to core health issues (such as the COVID-19 pandemic or vaccination), political disinformation, etc. Certain practices may also have negative impacts on users’ freedom to make informed political decisions and on authorities’ capacity to ensure open political processes. Similar amplification tools, whether through algorithmic recommendations or design ‘dark patterns’, can also tilt consumer choice on marketplaces and have an impact on sellers’ ability to reach consumers 48 .

43This amplification happens through the design choices in platforms’ ranking systems, embedded search functions and recommender systems, and through more or less complex advertising placement services, including micro-targeting.

44Such issues stem from at least two potential sources:

45First, structurally, the optimisation choices made by platforms in designing their systems and choosing the content amplified and matched with their users could, in themselves, lead to negative consequences. There is, for instance, debated evidence for the creation of ‘filter bubbles’ on social networks, where users are only exposed to certain types of content and of views, affecting the plurality of information they receive 49 .

46Micro-targeting with political advertising, for example, is also alleged to have similar effects, in particular in electoral periods 50 , but evidence of actual impact on voter behaviour is not consistently conclusive 51 . Advertising can also be served in a discriminatory way, in particular where vulnerable groups are deprived of sensitive ads, such as those related to access to goods or employment 52 .

47Second, as systems are dynamically adapting to signals they pick up from their users, they are vulnerable to manipulation by ill-intended individuals or organised groups. For example, bot farms are used to artificially increase traffic to certain types of content, either to drive ad revenue, or to fake the popularity of the content and trick the amplification algorithm into systematically ranking it higher. The behavioural aspects leading to abuse, as is the case in disinformation campaigns, go beyond the systemic issues analysed in this impact assessment. 53

48It is clear that the dynamics of online interactions have an impact on real world behaviours. However, extensive academic research 54 , replies to the open public consultation from civil society, academics, some business associations and regulators pointed to significant shortcomings in the understanding and detection of risks and harms stemming from the amplification of information flows through recommender systems, ranking or advertising.

49First, users lack meaningful information about how these systems function and have very little agency in their interactions with these systems. They are limited in understanding the source of the information, as well as its relative prominence. Direct information to consumers is also an issue for consumer choices, as illustrated by the EU Market Monitoring Survey 2019, which shows that in the market for holiday accommodations 62% of EU 27 consumers consider ranking of products in search results very or fairly important and 72% consider online reviews and comments very or fairly important for choosing goods and services 55 .

50Second, there are very few ways of researching and testing the effects of such systems. Much of the evidence and information about harms relies on the investigations and willingness to cooperate of online platforms themselves 56 . Some research projects and civil society experiments attempt to observe platforms’ algorithmic systems and their effects 57 , but they require significant efforts to collect data, sometimes against the terms of service set by the platforms. They naturally focus on specific platforms. Such research and experiments fail to meaningfully observe and account for the iterative interactions between the learning systems, the online behaviour of users, and the governance set by the platforms, and cannot offer the continuous monitoring necessary to understand the systems. 58  

c)Fundamental rights are not appropriately protected

51There is, however, an important balance to be struck between measures taken to remove illegal content and the protection of fundamental rights, especially the freedom of expression and the freedom to conduct a business. When platforms remove users’ content, services or goods offered for sale, de-rank them or otherwise limit access, or suspend user accounts, this can have severe consequences for the rights and freedoms of their users. This affects in particular their freedom of expression and limits access to information, but also businesses’ freedom to conduct business and their ability to reach customers. These decisions are often not based on an assessment of the legality of the content, nor are they accompanied by appropriate safeguards, such as justifications for the removal or access to complaint mechanisms; rather, they are governed solely by the discretionary powers of the platform according to the terms of service that form part of its contractual terms.

52In some cases, content can also be removed erroneously, even if it is not illegal, nor in violation of the terms of service. Such cases can stem, for instance, from erroneous reporting by other users 59 and abusive notices, as well as from platforms’ own detection systems, not least when automated tools are used. Notorious examples include takedown of historical footage used for educational purposes 60 or documented evidence from war zones 61 .

53Some regulatory initiatives, such as the Platform to Business Regulation 62 , oblige online platforms to inform their business users of different aspects of their commercial relationship and provide those users with an effective complaint mechanism, as well as an out of court dispute settlement mechanism. Following the adoption of the revised Audiovisual Media Services Directive (‘AVMSD’) 63 , for the specific area of audiovisual content on video-sharing platforms, other users will also have access to redress mechanisms to be set up by video sharing services, and to alternative out-of-court redress mechanisms to be set up by Member States.

54However, other users depend entirely 64 on the platforms’ discretionary decisions as to whether they can have access to a complaint and redress mechanism. During the recovery from the COVID-19 pandemic, in particular, the effects of erroneous restrictions on businesses can be significant. Furthermore, the effectiveness of such systems varies greatly from one service to another or from one type of content to another. There is consistently a lack of redress and of transparency of decisions unilaterally taken by the platforms. Very few online platforms subject their enforcement policies to systematic independent oversight (15 out of 61 service providers responding to the open public consultation).

55It is virtually impossible to estimate in quantitative terms the scale of erroneous removals, blocks or restrictions. Very few online platforms report on the complaints they receive from users. Even fewer report on whether content is reinstated after such complaints are acted upon. Even then, such reports are not comparable, since they do not follow the same standards and criteria.

56In the Eurobarometer survey run by the Commission in 2018 65 , 5% of citizens responding said their content was erroneously removed, reaching 10% of respondents from Poland, 8% in Denmark, and 7% in Greece, Cyprus and Malta. 22% of the respondents concerned by the removals were not informed in any way about the reasons why their content was removed and 47% said they took no action to resolve the situation.

57On many occasions, erroneous removals can have a chilling effect on users’ freedom of expression online beyond the specific content removed: in a survey 66 presenting theoretical takedown scenarios, 75% of respondents said they would be less likely to speak about certain topics online after their content is removed from a platform. More recent empirical research 67 has confirmed these behavioural changes on social networks. When marketplaces (which intermediate e.g. the sale of products of any type, accommodation services or transport services) take corrective measures against their sellers for alleged illegal activities, errors or ill-intended notices can have a substantial adverse impact on individual businesses and traders, in particular when these are largely dependent on such marketplaces and online channels for reaching their customers. On the other hand, a lack of effective procedures that may result in illegal content not being taken down can also have a considerable negative impact on fundamental rights, for example in the case of child sexual abuse material where known content re-surfaces and the harm is perpetuated.

58Takedowns are potentially even more impactful when such measures are taken by services lower in the Internet stack, such as those providing the cloud infrastructure, web hosting, or content distribution network services, for instance. Actions taken in these cases can effectively disable access to entire services, blocking IP addresses, taking down full websites or rendering them inaccessible and/or vulnerable to Denial-of-Service attacks.

59At the same time, fundamental rights are also at risk when service providers take no action, leaving untouched content that severely violates the interests of others, including in cases where users have reported such content (see section a) above).

Who is affected and how?

60The overall impacts of these problems are very broad, deeply connected to the various illegal activities themselves, and more broadly affect behavioural patterns and the functioning of online participation. It is outside the scope of this impact assessment report to present them in detail. As an illustration, the most commonly reported issues referred to in the replies to the open public consultation are presented in the paragraphs below.

61Illegal activities online have a serious impact on safety and security online, which can also lead to offline consequences. CSAM and material inciting terrorist acts or racist, xenophobic or gender-based violence affect important fundamental rights, including the right to life and human dignity and the rights of the child. Other illegal activities can have an impact on consumers, who are affected by scams and misleading practices, or purchase dangerous and non-compliant goods. They can also affect legitimate businesses, either scammed themselves online, or as manufacturers, brands, content creators and other IPR owners losing important revenues due to substitution of their offerings with illicit ones, as well as potentially suffering reputational damage. Illegal activities online also represent a competitive disadvantage for compliant businesses.

62The proliferation of illegal content online can also have the effect of silencing speech, in particular where vulnerable groups are concerned. At the same time, erroneous removal of content can have important consequences on citizens’ freedom of expression, as well as on businesses’ ability to reach consumers and their freedom to conduct business.

63Where online intermediaries, such as online platforms, are concerned, the presence of illegal activities conducted by their users has ambivalent effects. In the public consultation, some respondents, in particular holders of IPR, flagged that illegal activities bring significant income to online platforms. At the same time, platforms and other intermediaries stated that when illegal activities harm their users, they may suffer reputational damage and loss of revenue, as well as incur legal risks from the service they provide. Recent developments, such as advertisers’ walk-outs from certain platforms, point to the complexity of the repercussions also on the intermediary’s business practices and interests.

Stakeholders’ views

In the open public consultation, a majority of respondents, all categories included, indicated that they have encountered illegal content, goods or services online, and specifically noted a spike during the COVID-19 pandemic. More specifically, 46% of the respondents to the relevant question indicated that they had encountered illegal goods, and 67% of the respondents stated that they had encountered illegal content online. Citizens and consumer organisations pointed to defective goods, counterfeits, fake event tickets, as well as significant issues related to hate speech, political disinformation and fake news. Business organisations and business associations raised the issue of online scams, as well as losses incurred due to intellectual property infringements. A large share of respondents who said they had notified illegal content or goods to platforms expressed their dissatisfaction with the platforms’ response and the ineffectiveness of reporting mechanisms after the exposure took place. More specifically, 54% of the respondents, all categories included, were not satisfied with the procedure following the reporting, were not aware of any action taken by the platform as a follow-up to their reporting, and considered that there is a lack of transparency following a notification. In addition, citizens pointed out that notice-and-action procedures differ greatly from one platform to another, making the reporting of illegal content, goods or services even more difficult and uncertain. Moreover, users and civil society organisations especially perceived a mismatch between platforms’ official policies and their concrete actions, and called for harmonised rules for digital service providers. Civil society organisations highlighted the significant information asymmetries between users and platforms, and academic institutions warned against the negative effects that amplification systems can have on the dissemination of illegal activities. 
With regard to the use of automated tools in content moderation, several respondents, especially business associations and online platforms, pointed to both the usefulness and the limitations of such tools. Civil society organisations defending digital rights strongly called for caution with regard to obligations to use such tools, owing to the risks of over-removal of legal content. Publishers, companies that sell products or services online, the general public, as well as digital users’ and consumers’ associations expressed concerns about the lack of transparency and accountability, especially in the context of targeted advertising and how algorithmic systems shape online content. Furthermore, the limited disclosure of ad content and the lack of enforcement of ad targeting policies were flagged.

Moreover, whilst there is a strong call for action, many categories of stakeholders, including citizens, online intermediaries, civil society organisations, academic institutions and national authorities, emphasised that any new measure to tackle illegal content, goods or services online should not lead to unintentional, unjustified limitations on citizens’ freedom of expression or fundamental rights to personal data and privacy. Citizens, civil society organisations and consumer organisations pointed out the need for platforms to have a clear and transparent redress mechanism. Digital users’ associations highlighted that users have no way to appeal to anyone independent or neutral.

Ineffective supervision of digital services and lack of trust between authorities

In particular where online platforms are concerned, the supervisory system in the EU is to a large extent uncoordinated and ineffective, despite the strategic importance of such services. The E-Commerce Directive sets the internal market principle according to which the supervision of digital services is organised, but remains broad on the general principles for cooperation and information sharing across Member States. The perceived limitations in day-to-day cooperation fuel a lack of trust across Member States when it comes to supervising online platforms in the interest of all EU citizens. In turn, this mistrust leads to an uneven protection of European citizens, and to uncertainties and a lack of clarity for service providers.

64Whereas online platforms and, to a certain extent, online intermediaries at large are misused for the harms presented above, manifesting in different Member States, the current supervision arrangements across the single market are not effective and are insufficient in mitigating the evolving risks. There are some sector-specific cooperation mechanisms which benefit from further specified procedures, such as in the area of consumer protection. Overall, however, several components fuel this situation:

65First, a core failure in the supervision of digital services stems from the lack of trust and cooperation among authorities in cross-border issues. Online platforms are naturally borderless, in particular where they reach a critical mass of users for a competitive service. The core principle of the single market aims at establishing the most effective supervision in order to safeguard the interests of all European citizens. The country of establishment is best placed to take corrective measures against a service provider, while accommodating cooperation with and assistance to authorities from other Member States 68 . For a smooth functioning of the system, Member States need to trust that digital services are effectively supervised at the source of the activity. To that end, it is necessary to ensure that the competent authority provides such protection not only for the citizens of its own country but for all citizens in the EU. At the same time, in the case of services such as online platforms, the underuse of the cross-border mechanism designed in the E-Commerce Directive causes deficits in the supervision of online platforms and has eroded trust between Member States (see driver 2.3.6).

66In both the open public consultation and the targeted consultation with Member States, the majority of authorities pointed to the increased importance of the cooperation across the single market. At the same time, they deplored the very limited use of existing channels, slow processes and response from other authorities, as well as the lack of clarity as to which cooperation mechanism should be used for specific and general issues. Some authorities emphasised the lack of a stable forum and incentives for Member States to share real progress and information. Further, they flagged the ever-increasing complexity of issues supervised at regional, national and European level, all sharing cross-cutting digital challenges, and the need to ensure the cooperation and transmission of information across these levels within and across Member States.

67Absent an effective cooperation mechanism and as the risks have escalated with the scale and impact of online platforms, Member States have started to re-fragment the single market and legislate unilaterally to tackle these issues (see driver 2.3.1).

68Second, authorities lack information and technical capability for inspecting technically complex digital services. This concerns both the supervision of the digital service and, in the case of online platforms in particular, the increasing challenges of supervising the underlying services they intermediate, such as accommodation or transport services, or websites conducting illegal activities online.

69Third, authorities have very few levers for supervising services established outside of the Union, while such services are easily used e.g. for selling illegal goods or scamming consumers in the Union. Several authorities responding to the consultations launched by the Commission emphasised this grey area of regulatory supervision, where important services established outside the Union bear no legal obligations, whereas they reach a large number of Europeans.

Stakeholders’ views

Several stakeholder groups, including public authorities, as well as different Member States, pointed out that cooperation between authorities and enforcement is inadequate both cross-border and within each Member State. Member States, public authorities, civil society organisations and brand associations emphasised the need for the current system to be strengthened, and pointed to the knowledge gap, the inadequacy of existing cross-border mechanisms, and the lack of oversight and cooperation between all actors involved in the ecosystem as key hindrances to effective oversight. Some national authorities considered that the country where a service is accessed does not have sufficient levers for enforcing its laws online. Businesses and business associations bemoaned that regulatory oversight is neither clear nor foreseeable, and especially highlighted the regulatory burdens of complex, slow and bureaucratic procedures. Civil society groups noted that the current governance approach is not broad enough and highlighted the need to cooperate with civil society organisations and academic as well as research institutions for specific inquiries and oversight, in particular where online platforms are concerned. Some civil society organisations flagged the absence of robust and effective enforcement mechanisms for regulatory oversight, in particular when it comes to fostering coordination between national authorities and to addressing issues concerning the lack of transparency and inconsistencies within procedures. Similarly, the majority of respondents from academia pointed to the fact that platforms cannot credibly be held accountable without strong enforcement mechanisms. Finally, a majority of categories of stakeholders considered that, in order to effectively supervise online platforms, rules should be applicable to third-country players that provide their services to European users. 
Online intermediaries stressed that any regulatory oversight mechanism should be proportionate, increase legal certainty, and follow a single market logic in ensuring the free provision of services.

Legal barriers for digital services, prohibitive for smaller companies to scale in the European single market

To address the challenges presented above, Member States have started regulating online platforms and online intermediaries at national level to supervise them and reduce harms. The resulting legal burdens create new barriers in the internal market and lead to high direct and opportunity costs, notably for SMEs, including innovative start-ups and scale-ups. This leads to a competitive advantage for the established very large platforms and digital services, which can more easily absorb higher regulatory compliance costs, and further limits the ability of newcomers to challenge these large digital platforms.

70Some Member States are increasingly legislating to protect their citizens from risks generated by online platforms established in a different Member State. When companies want to provide their services cross-border in the single market, they face a series of regulatory burdens: legal fragmentation across Member States and legal uncertainty.

71The impact on online platforms is asymmetric and disproportionately affects small providers. While larger online platforms are also subject to more costly obligations, those costs are still comparatively modest for them. In contrast, they can be prohibitive for start-ups and scale-ups attempting to provide services in several Member States and develop in the single market.

Cost of non-Europe

72In a direct cost model 69 , company-level costs stemming from the legal fragmentation range from EUR 31,000 to EUR 15 million per year for a small-sized enterprise (depending on the Member States where the company provides its services, as well as the overall volumes of content notified to them). For larger companies, which also receive larger volumes of notices (from 200 to 3,000 per day) and require a more robust infrastructure for processing them, costs can range from EUR 1.3 million to EUR 225 million per company. Simulating the effects of the ascending trend of legal fragmentation, all of these costs could double should Member States continue to legislate in diverging ways. This model reflects only the direct costs of the evolving legal fragmentation, accounting for the different rules on ‘notice and action’ obligations for online platforms, including the notification system, the processing of notices and, where required, the availability of a counter-notice system, transparency requirements and the obligation to appoint a legal representative in different Member States.

73 With the evolving fragmentation, these costs can affect the over 10,000 potentially high-growth platforms established in the EU 70 , of which around 96% are SMEs, more than half of them micro-enterprises. For micro and small enterprises, the current costs are clearly prohibitive for covering the entire single market. This is particularly concerning for digital services, which typically need to draw on economies of scale and grow fast in order to secure their place on the market.

74Across all sectors, the current state of the legal fragmentation is estimated 71 to represent a loss of 1% to 1.8% in online trade (i.e. modelled as cross-border use of online platforms, based on cross-border users for 31,084 web domains).

Legal uncertainty

75Other legal burdens stem from the uncertainties linked to the liability regime for online intermediaries (see driver 2.3.5). This leads to risk-avoidance behaviour, in particular from small, emerging service providers, and decreases the quality of their services and their potential for a competitive edge, as attested by several service providers, e.g. in their responses to the public consultation.

76Consequently, direct costs from legal fragmentation are also accompanied by potential opportunity costs and missed potential for business innovation. In comparison to other countries, such as the US or China, the level of investment in European marketplaces is significantly lower. However, scenarios based on data from venture capital investments show that there is growth potential for online platforms (a 16% increase in investment in 2019), in particular where platforms offer services linked to food, transport, fintech, travel, fashion, home, or enterprise software. 72 With increased compliance costs due to the growing legal fragmentation, the legal risks for start-ups and scale-ups have a chilling effect on investment and can dissuade businesses from expanding and growing in the single market.

Stakeholders’ views

There is a convergence of views amongst business associations, companies and Member States that the current state of legal fragmentation of the Digital Single Market has created burdens for European businesses. These stakeholder groups see the trend of Member States enacting different legislation and rules around illegal content, goods and services as limiting most businesses, but especially SMEs and start-ups, from scaling up. More specifically, business associations pointed out that SMEs and start-ups face a competitive disadvantage, since they are affected in a disproportionate manner compared to larger companies. Start-ups and SMEs pointed to the business risks of having to adapt their services to potentially 27 different sets of Member State-specific rules, which inhibits their growth not just within EU borders but also globally. Some business associations further explained that new digital services are often reluctant to expand into different European markets as a consequence of the diverging national legislation. More generally, 64% of respondents, all categories included, that replied to the relevant question in the public consultation considered the different processes and obligations imposed by the different Member States for notifying, detecting and removing illegal content, goods or services as very burdensome, and 72% of respondents considered the different procedures and points of contact for obligations to cooperate with authorities as very burdensome. This issue is also recognised by national authorities, which support a horizontal harmonised framework to tackle fragmentation stemming from national and EU legislation.

Some intermediaries, national authorities, research institutes and civil society organisations consider that the current liability regime creates disincentives to act, and call for the removal of disincentives for voluntary measures in order to limit the risks of liability for intermediaries that voluntarily implement preventative measures to detect illegal content. Business associations and companies agreed that the liability framework should be further developed in an innovation-friendly and uniform manner throughout Europe. Online platforms echoed the need for clear but proportionate rules and responsibilities that do not disincentivise their voluntary actions to limit the distribution of illegal activities online. Some digital users’ associations, trade associations and representatives of the creative industry fear that such clarifications could weaken the responsibilities of intermediaries, absent positive obligations.

2.3.What are the problem drivers?

Private companies make fundamental decisions with significant impact on users and their rights

77Beyond responding to calls to remove content that is illegal, online platforms generally apply their terms of service and community standards, both for specifying what types of content and behaviours they allow, and by setting up the process for reporting and detecting non-compliant behaviours.

78There is no oversight and generally an absence of checks and balances provided by law. This concerns equally decisions taken by service providers on the basis of their terms of service and measures put in place to tackle illegal activities, whether following flags from third parties or through proactive detection. This leaves citizens’ rights vulnerable. The opacity of this system also weakens the ability of authorities and law enforcement to supervise and pursue online crimes.

79The very large online platforms generally have in place a system for notifying content, goods or services they intermediate, but the actions triggered are not always consistent. Other, smaller players do not support any notification system at all (two small service providers, out of the 60 online intermediaries responding to the open public consultation, did not have any such system). The rigour in analysing the reported or detected content varies. Some studies 73 have shown that, when content is notified to platforms and the claim appears delicate or uncertain, they will likely remove the content to avoid risks of liability. This concerns in particular smaller online platforms, where the business incentive to keep illegal content off their service is outweighed by the need to avoid legal risks. Conversely, parts of civil society, brand owners and authorities also complain that platforms are not systematically responsive to notifications about illegal content.

80Further, platforms’ own actions and tools for content moderation are not consistently accurate and there are very few possibilities to inspect the reliability of their systems. The largest platforms use both in-house and outsourced content moderation, including a range of technologies: from metadata and keyword trackers to infer illegal goods sold on their platforms, to fingerprint-based filters to detect previously identified illegal images, to machine-learning classifiers claimed to identify certain types of content automatically. The use of such tools, while promising for processing large volumes of content very fast, brings a set of challenges, in particular with regard to more context-sensitive content. As concluded in a study commissioned by the European Parliament, ‘such measures present significant drawbacks, including a lack of transparency concerning how the technologies work, a lack of adequate procedural safeguards and a risk of over-enforcement, with online providers being more likely to apply an algorithm that takes down too much, rather than too little, content’. 74  

81Key components to safeguard users’ rights, such as meaningful information to the user whose content was removed and to those that filed a notice, or an appropriate complaint mechanism, are also not consistently applied (three of the online intermediaries responding to the open public consultation), and are not equally reliable across services. 17% of respondents to a Eurobarometer survey, 75 whose content was erroneously removed by online platforms, also said that they were never informed by the platforms about the reason for the removal.

82A quarter of the online intermediaries responding to the open public consultation said they did not have policies or identification measures for their business users established outside of the Union. Such measures are considered best practice 76 for dissuading illicit sellers and enabling the enforcement of sanctions against them. On-boarding processes for traders differ for each online marketplace: whereas some ask for detailed information on the identity of traders, others require a mere email address. Consumer protection authorities have also often reported difficulties in enforcing the law against rogue traders online due to the lack of information on the identity of such traders, especially when they are not established in the EU.

83The large online platforms release regular transparency reports, a practice that has increased since the Commission’s Recommendation of 2018 77 . While it is important for such information to be released, not least as concerns requests from government authorities and content detected through user notices or proactively identified by the platform, these reports remain limited in scope and detail, making it difficult to understand to what extent illegal content, goods and services are appropriately identified and removed. They are not standardised, use different definitions and different metrics for the data reported, and can hardly be compared across services.

84 Some online platforms 78 are starting to set up additional structures in their decision-making on content moderation, with an oversight board formed of external experts to adjudicate the most difficult user complaints against removal. Such structures have been praised as a sign of inclusiveness in making decisions with a societal impact, while at the same time criticised for the limited prerogatives given to the boards. 79  

Very large platforms can play the role of ‘public spaces’

85With the scale of some online platforms and their presence in ever more facets of our daily lives, they can sometimes be compared to public spaces for expression and economic transactions. They are key actors in facilitating the exchange of information and the exercise of freedom of expression on the internet, and consequently they present the highest societal and economic risks. With over half of the population in the EU using social media, reaching nearly 90% for those aged 16-24 80 , the design and standards of these platforms have a wide-reaching societal and economic impact. Over 50% of businesses in Europe use social media; in some countries this rises to nearly three quarters of all established companies 81 . The main marketplaces attract millions of sellers, who depend on them for reaching their customers. 82 Product updates, product tests or errors can make or break entire revenue streams of companies whose traffic and visibility depend to a substantial extent on these platforms.

86The overwhelming majority of users are today concentrated in a small number of online platforms. While precise data on the number of users is not available, the available data shows the staggering differences of scale between the reach of the few largest online platforms and the long tail of other services. 83  

87The business model of platforms is predominantly based on capturing the attention of users amid an increasing volume of information, goods and services. These services benefit from strong network effects, economies of scale, and unmatchable access to user data. Reaching a certain number of users has enabled self-fuelling exponential growth for a relatively small number of platforms, leading to an extreme concentration of users and market power. Spending on digital advertising has grown more than tenfold since 2006, with 12.3% growth in 2019 alone, for a total of EUR 64.8 billion in Europe 84 . The revenue disparities based on online advertising streams are also staggering: search advertising is consistently the biggest category of digital advertising, whereas video, social and mobile advertising are growing very fast.

88The tools and mechanisms used to optimise engagement play a major role in shaping the information and goods we see, bringing with them a variety of risks and harms exacerbated by the scale at which they operate. The recommender algorithms and tools developed for businesses and platforms to capture the attention of users and consumers can have design flaws leading to unintended consequences with serious social impact 85 . For example, studies have shown that algorithms on advertising-funded platforms prioritised disinformation, in part because of its engagement rate and consequent attractiveness to advertisers 86 . At the same time, these systems can be ‘gamed’ by malicious actors to propagate illegal content and goods, as well as to spread false narratives by way of computational propaganda: micro-targeting, bots, astroturfing 87 and search engine optimisation. 88

89The societal risk of exposure to illegal content and activities is particularly high on large online platforms reaching a very wide audience. The strategies of the largest platforms have enormous impacts on the safety of citizens and the fairness of businesses’ commercial activities online. At the same time, tackling illegal content and the related harm on these large platforms is challenging because they have become public spaces for the exchange of information, and thereby for freedom of expression, in an ever-more digital society, without being bound by any public interest considerations.

90Given these network effects and unmatched access to data, there is significant information asymmetry between large platforms, small businesses, citizens and public authorities. There is insufficient transparency and accountability around how design decisions of platforms have societal and economic impacts.

Legal fragmentation

91The E-Commerce Directive sets the general framework for digital services established in the single market. The Directive harmonises the basic information requirements for digital services and the liability exemption for online intermediaries. It does not prescribe procedures or obligations for service providers regarding the notification and removal of illegal content, but already flagged the need to explore such measures (Article 21(2)).

92Since the adoption of the Directive, digital services have evolved significantly, as has the scale of their use. Online platforms in particular pose increasing risks and challenges. To address this, in the absence of common rules, Member States are legislating unilaterally, fragmenting the single market and making the supervision and sanctioning of digital service providers inefficient and ineffective.

93The largest source of fragmentation comes from the rules established at national level for procedural obligations for online platforms to address illegal information and activities conducted by their users, as follows:

94Nine Member States (Finland, France, Germany, Greece, Hungary, Italy, Lithuania, Spain and Sweden) have implemented a notice-and-action procedure in their legislative frameworks. For five of them, this only applies to infringements of copyright and related rights. In some (e.g. Germany), more recent laws apply specifically to certain categories of hate speech.

95In several Member States (Finland, France, Hungary, Lithuania), minimum requirements for the notice are defined by law, to ensure that it is sufficiently motivated.

96In Member States without statutory requirements for notices, the case law has provided indications concerning the content of the notice and the mechanism.

97The precise requirements of these laws diverge to a large extent on several points: the minimum content of the notice, the possibility to issue a counter-notice, the timeframe to react to a notice, potential mandatory measures against abusive notices or the possibility to submit contentious cases to an independent third party. Consequently, the service providers concerned can be subject to a range of legal requirements, which diverge as to their content and scope.

98In thirteen Member States, some form of opportunity to dispute the allegation exists. However, the situations in which counter-notices are possible differ greatly amongst Member States. For example, a counter-notice in Estonia is only possible when the removal is ordered by a government agency. In Finland, Greece, Hungary, Ireland, Italy and Spain, counter-notices are only possible in the context of copyright; and in Luxembourg, it is only possible during the procedure on the merits.

99In eight Member States (Bulgaria, Estonia, France, Germany, Greece, Lithuania, Portugal and Sweden), some sort of alternative dispute settlement mechanism exists. For example, in Portugal, an out-of-court preliminary dispute settlement is possible where the illegality of the case is not obvious; in Estonia, a specific alternative dispute regime exists for copyright infringements, in which a dedicated committee can resolve disputes.

100In addition, several more recent laws were adopted or proposed that include a burdensome requirement for a service provider to appoint a legal representative in the respective Member State, even if already established elsewhere in the Union. This is the case in the German NetzDG, the French Hate Speech Law 89 , the recently notified Austrian draft law to combat hate speech online 90 , the German draft law on the protection of minors 91 and the Italian “Airbnb” law 92 .

101Additional sources of fragmentation stem from the need for authorities, in particular at local level, to supervise and collect data related to accommodation services offered through online intermediaries (see Annex 6).

Regulatory gap: systemic issues are not appropriately addressed

102In recent years, in response to various sector-specific challenges, legislation has been adopted or proposed at EU level. For certain types of illegal or suspicious activities, recent EU legal acts include a series of targeted obligations on online intermediaries. They define specific legal obligations for issues such as copyrighted content 93 , the sale of explosive precursor chemicals 94 , and other types of illegal products subject to market surveillance 95 .

103While all sector-specific legislative initiatives fulfil their aim of tackling specific issues, important gaps remain at a horizontal level. None of these instruments provides fully-fledged rules on the procedural obligations related to all types of illegal content, and the accountability and oversight mechanisms are by default limited to the sector they regulate. In terms of scope, they are limited in two respects. First, these interventions address a small subset of issues (e.g. copyright infringements, terrorist content, child sexual abuse material or illegal hate speech, some illegal products). Second, they only cover the dissemination of such content on certain types of services (e.g. a sub-set of online platforms for copyright infringements; only video-sharing platforms, and only as regards audiovisual terrorist content or hate speech, in the AVMSD).

104With regard to online advertising services, for example, the E-Commerce Directive sets a series of transparency and disclosure obligations on distinguishing the ad from other content and on the identity of the advertiser, complemented by similar provisions in consumer law. 96 However, these provisions are limited to commercial communications, and the online advertising landscape has changed dramatically since the Directive was adopted.

105For some categories of illegal content and activities, such as illegal hate speech, dangerous products or counterfeits, the Commission has facilitated self-regulatory mechanisms, including cooperation with national authorities and/or trusted third parties (e.g. the Code of conduct on hate speech 97 , the Product Safety Pledge 98 , the Memorandum of Understanding on the sale of counterfeit goods on the internet 99 , the Memorandum of Understanding on online advertising and intellectual property rights 100 ). These voluntary measures have been effective to some extent in achieving removals and in fostering collaboration between Member States, platforms and civil society. They have, however, structural limitations in scope and scale: they are limited to the signatories of the measures, and compliance with the agreed objectives cannot be appropriately supervised or sanctioned, given their voluntary nature.

106The Recommendation of 2018 fleshed out procedural requirements that the sectoral legislation had not fully addressed. It included horizontal procedures for notice and action mechanisms, safeguards for users’ rights and transparency.

107These non-binding measures were only selectively applied by some services. For instance, several respondents to the open public consultation noted that reporting illegal goods is not easy for the majority of users, both in terms of finding the avenue for reporting and the procedure of submitting a report. Several hosting service providers responding to the consultation said that they did not maintain a system for users or third parties to flag illegal activities conducted on their service. Further, users have often reported that these mechanisms differ greatly from one platform to another: the procedure can vary from a simple email to a complex portal, removal times vary, and follow-up actions are not always provided. Furthermore, the Recommendation has not had a harmonising effect: Member States proposed legislation with diverging measures in the national legal drafts analysed so far.

108As such, systemic elements remain unaddressed by the regulatory framework and the self-regulatory initiatives. There are no comprehensive rules across the single market, neither in national law (see driver 2.3.1) nor at EU level, specifying the responsibilities of digital services, including online platforms.

Legal uncertainties and contradictory incentives

109There are several sources of legal uncertainty for online intermediaries, as their business models or the underlying technologies have developed since the entry into force of the E-Commerce Directive.

Uncertainties

110Over the years, an important area of legal uncertainty for digital service providers has been the scope of the definition of information society services. Especially in the area of the collaborative economy, but also in the area of online sales of goods, the line between online services, offered at a distance, and the underlying services, usually offered offline, has not always been clear. The consequences of separating these services are significant, given that online services may fall within the scope of the E-Commerce Directive while the underlying services fall within sector-specific rules or horizontal EU legal acts, such as the Services Directive 101 . Operators have often mentioned such legal uncertainty as a source of concern for their growth. The relevant provisions of the E-Commerce Directive have recently been interpreted by the Court of Justice of the EU 102 .

Liability regime for online intermediaries

111The liability regime set out in the E-Commerce Directive for online intermediaries is considered a cornerstone, both for allowing online intermediaries to emerge in the 2000s and for establishing the right incentives so that service providers are not driven to interfere disproportionately with their users’ freedom of expression or with the freedom of their business users.

112First, the Court has interpreted the condition for hosting services to play ‘a passive and neutral role’, as referred to in Recital 42 of the E-Commerce Directive for mere conduits and caching services 103 – and national courts have later applied this case-law in contradictory ways. In this context, some national courts have equated an ‘active role (of such a kind as to give it knowledge or control)’ with a sort of ‘appropriation’ of the content (‘zu eigen machen’), to the extent that a reasonably informed user could conclude that the platform is the author of, or responsible for, such content. A similar interpretation has also been proposed very recently by Advocate General Saugmandsgaard in a case currently pending before the Court 104 . When applied to hosting service providers, it is important to create legal certainty and to ensure that this requirement cannot imply that automatic, algorithmic ordering, displaying, tagging or indexing of the content a provider stores – activities that are today necessary to make such content findable at all – amounts to an active role.

113Third, and related to this, the current regime entails some legal uncertainty, in particular for small players that might want to take measures to keep their users safe but avoid doing so in order to escape legal risks. The current legal framework under the E-Commerce Directive could be interpreted as creating contradictory incentives for service providers: proactive measures taken to detect illegal activities (even by automatic means) could be used as an argument that the service provider plays an ‘active role of such a kind as to give it knowledge of, or control over, the data’ uploaded by its users, and therefore cannot be considered to fall within the scope of the conditional liability exemption. This places small players, who cannot afford the legal risk, at a net disadvantage compared to large online players, which do apply content moderation processes of varying degrees of quality.

114The transposition of the liability regime into national law has also generated some areas of fragmentation, as clarified in driver 2.3.3.

Limited cooperation among Member States and lack of trust

115To ensure that under specific circumstances, Member States are able to adopt measures in respect of a given information society service, even if these would not be established within their territory but in the territory of another Member State, the E-Commerce Directive provides for a specific cooperation mechanism between Member States’ authorities. 105  

116The number of notifications sent by host Member States to trigger assistance from authorities in the Member State of establishment of a service provider is very low, benchmarked against the surge of cross-border online activities during the last decades. Since the transposition of the E-Commerce Directive there have been 141 notifications submitted through the cooperation mechanism (approximately 30 in the first 9 years after the entry into force of the E-Commerce Directive and 111 notifications between November 2013 and July 2020, through the Internal Market Information System (IMI system) provided by the Commission for the electronic submission of requests from Member States) 106 . Only 18 of these concern online platforms, mostly for consumer protection concerns.

117In several surveys over the last years 107 , Member States have expressed dissatisfaction with several aspects of the existing cooperation mechanism. These include the average time taken to respond to Member States’ requests, the quality of the feedback received, and the lack of clarity in the use of other cooperation and notification systems, such as the CPC. All of this leads to a lack of trust between Member States when addressing concerns about providers offering digital services cross-border, in particular where online platforms are concerned.

118Further, the lack of trust fuels the tendency of Member States to regulate unilaterally. A plethora of national laws (see driver 2.3.3) regulating digital services are coming into force, which leads to fragmentation of the single market and a limitation of the freedom to provide services, in particular when such laws have extraterritorial effect.

119However, the complex set of issues posed by the socio-technical systems of online platforms and other digital services cannot be adequately and thoroughly addressed at national level, given the cross-border reach of platforms and the relatively limited technical resources of national competent authorities. Member States shared that, in their experience, the existing knowledge gaps, the inadequacy of existing cross-border mechanisms, and the lack of cooperation between all actors involved in the supervision ecosystem are a key hindrance to effective oversight of online platforms. These challenges were also pointed out in the European Parliament’s resolutions 108 and, in the targeted and open consultations organised, some Member States pointed to the opportunity for further mutual assistance and EU-level governance.

2.4.How will the problem evolve?

120All the aforementioned problems can only be expected to become increasingly acute. The use of (some) digital services will only increase over time, and so will the risks of abuse and manipulation of these digital environments.

121Illegal and harmful behaviours are constantly evolving. Perpetrators are seeking means to adapt to measures taken by service providers and authorities, are active across a series of services, or are migrating from larger to smaller platforms. In the current system, primarily based on sector-specific interventions and voluntary measures taken by service providers, interventions can hardly keep up with the agile ways in which services are abused. They will never cover all those services which can make a real difference, nor will they cover all categories of illegal content, goods or services.

122Users will also continue to have virtually no redress when faced with removals or when their notice is left without action. In a context where ever larger volumes of content are processed by online platforms, in particular the very large ones, decisions to remove, delist or otherwise restrict content will become even more impactful for the rights of their users. In response to challenges with illegal content and societal harms, companies will continue to deploy industry-led initiatives, with limited safeguards and no public accountability or oversight 109 .

123Member States will continue to legislate unilaterally, and increasingly with extraterritorial provisions, to address the emerging challenges posed by online intermediaries. Legal uncertainty for service providers will increase, due to the growing fragmentation and the patchy interpretation of liability rules by national authorities. It is unlikely that the cooperation mechanisms currently in place will support the necessary coherence. The economic impacts on digital services, their business users and all citizens will be amplified.

124. A core issue in the online environment is the information asymmetry between service providers, on the one hand, and authorities and the public at large, on the other, with regard to the manipulation and abuse of services by their users. Absent further intervention to rebalance this, the gap can only widen, weakening the capacity and capability of law enforcement and other authorities to intervene. This would lead to a dangerous system, threatening the rule of law and market balance.

125. Furthermore, with systems increasingly capable of amplifying information online, the complexity and the impacts of these mediated information flows can only grow stronger, with severe repercussions on individual rights – such as non-discrimination and gender equality, the right to freedom of expression and freedom to form opinions, privacy and data protection, as well as the right to a high level of consumer protection – and on more collective concerns, such as democratic participation and media pluralism.

2.5.Problem tree

Figure 4 Problem tree

3.Why should the EU act?

3.1.Legal basis

126. Insofar as the EU intervention is likely to take the form of a legislative proposal, the legal basis depends on the primary objective and scope of the proposal. Legal intervention in the area of information society services with the primary goal of ensuring an internal market for these services could be based on Articles 53(1) and 62 TFEU (freedom of establishment and freedom to provide services), on Article 114 TFEU (approximation of laws for the improvement of the internal market), or on a combination of these Articles. Articles 53(1) and 62 TFEU provide for the adoption of measures to coordinate the provisions laid down by law, regulation or administrative action in Member States on establishing and providing services; these Articles allow only the adoption of Directives. Article 114 TFEU allows for the adoption of measures considered necessary for the approximation of the provisions laid down by law, regulation or administrative action in Member States which have as their object the establishment and functioning of the internal market; these can take the form of a Regulation or a Directive.

127. The E-Commerce Directive has a combined legal basis of Articles 53(1), 62 and 114 TFEU (Articles 47(2), 55 and 95 of the then Treaty establishing the European Community).

128. The primary objective of this intervention is to ensure the proper functioning of the single market, in particular in relation to the provision of cross-border online intermediary services. In line with this objective, the intervention aims to ensure the best conditions for innovative cross-border digital services to develop in the European Union, while maintaining a safe online environment with responsible and accountable behaviour by online intermediaries. To effectively protect users online, and to avoid placing EU-based service providers at a competitive disadvantage, it is necessary to extend the scope of the regulatory intervention to service providers established outside the EU whose activities affect the single market. At the same time, the intervention provides for the appropriate supervision of services and cooperation between authorities at EU level, thereby supporting trust, innovation and growth in the Digital Single Market.

129. The new legal instrument would build on the E-Commerce Directive with regard to the freedom of establishment and the freedom to provide digital services in the single market, and would further approximate the rules applicable to intermediary services. Therefore, for any of the policy options considered, the intervention can be based solely on Article 114 TFEU.

3.2.Subsidiarity: Necessity of EU action

130. According to the subsidiarity principle laid down in Article 5(3) TEU, action at EU level should be taken only when the aims envisaged cannot be sufficiently achieved by Member States alone and can therefore, by reason of the scale or effects of the proposed action, be better achieved by the EU.

131. Several Member States have legislated on the removal of illegal content online in relation to aspects such as notice and action and/or transparency. This hampers the provision of services across the EU and is ineffective in ensuring the safety and protection of all EU citizens. The Internet is cross-border by nature: content hosted in one Member State can normally be accessed from any other Member State. A patchy framework of national rules jeopardises the effective exercise of the freedom of establishment and the freedom to provide services in the EU. Intervention at national level cannot solve this problem and has amplified the issues. Ensuring the best conditions for innovative cross-border digital services to develop in the EU across national territories, while at the same time maintaining a safe online environment for all EU citizens, are goals which can only be served at European level.

3.3.Subsidiarity: Added value of EU action

132. The different and diverging legal regimes applicable to online intermediaries increase compliance costs, create legal uncertainty as to the applicable obligations across the EU, and lead to unequal protection of EU citizens. In addition, the effects of any action taken under national law would be limited to a single Member State.

133. EU action that reduces compliance costs, makes them predictable and enhances legal certainty, while also ensuring equal protection of all EU citizens, would allow information society service providers’ actions against illegal content online to be streamlined and scaled up, thereby increasing their effectiveness. It would oblige all companies equally to take action and, as a result, strengthen the integrity of the single market. A well-coordinated supervisory system, reinforced at EU level, also ensures a coherent approach applicable to digital service providers operating in all Member States.

134. Action at EU level is only partially effective if limited to providers established in the EU. This would create a competitive disadvantage vis-à-vis companies established in third countries, which would not be subject to any compliance costs in this regard. Furthermore, the effect on the availability of illegal content online would be limited.

135. Moreover, given the interest of companies outside the EU in continuing to provide their services within the Digital Single Market, the EU can act as a standard-setter for measures to combat illegal content online globally.

4.Objectives: What is to be achieved?

4.1.General objectives

136. The general objective of the intervention is to ensure the proper functioning of the single market, in particular in relation to the provision of cross-border digital services.

4.2.Specific objectives

Ensure the best conditions for innovative cross-border digital services to develop

137. The first specific objective is to establish the best conditions for the emergence and scaling-up of intermediaries in Europe, by providing a predictable legal environment across the entire single market that effectively addresses the current fragmentation, in which the cross-border provision of services is as frictionless as possible and the duplication of costs is limited. The aim is to ensure legal clarity and the proportionality of obligations, accounting for the differences in capability and resources, as well as in the impacts and risks raised, between small, emerging services and very large, established ones.

Maintain a safe online environment, with responsible and accountable behaviour from digital services, and online intermediaries in particular

138. This objective is specifically linked to the first set of problems identified: the aim is to provide a framework of incentives and obligations that would facilitate a safe online environment for all citizens, for legitimate expression and for businesses to develop, in observance of the rights and values of a democratic society. It aims to provide the legal clarity for online intermediaries, and in particular online platforms, to play their role in ensuring that their services are not misused for illegal activities and that the design of their systems does not lead to societal harms.

Empower users and protect fundamental rights, and freedom of expression in particular

139. Closely linked to the second specific objective, a modern system of online governance needs to place citizens at its centre and ensure that their fundamental rights and consumer rights are promoted. The aim of this objective is to ensure clear and proportionate responsibilities for authorities as well as private companies, and to safeguard freedom of expression online by establishing rules that do not inadvertently lead to the removal of information protected by the right to freedom of expression and that do not stifle or dissuade speech online. In particular, this objective seeks to enhance user agency in forming opinions and understanding their informational environment, and to enhance the protection of other fundamental rights such as the right to an effective remedy and to a fair trial, non-discrimination, the protection of personal data and privacy online, and the rights of the child.

Establish the appropriate supervision of online intermediaries and cooperation between authorities

140. None of the other specific objectives can be achieved without appropriate supervision and accountability of services, to ensure trust in the digital environment and to guarantee online safety and the protection of rights. This necessarily requires some level of transparency of digital services, as well as appropriate capabilities and competences for the supervising authorities. It also requires the best possible cooperation and trust amongst authorities in all EU Member States, both ensuring effective supervision and creating the best conditions for innovative services to emerge, as per the first specific objective.

4.3.Intervention logic

Figure 5 Intervention logic

5.What are the available policy options?

5.1.What is the baseline from which options are assessed?

141. In the baseline scenario, the Commission would not propose any changes to the current legal framework and would continue to enforce the E-Commerce Directive. The Commission would monitor the take-up of the Commission’s Recommendation on measures to effectively tackle illegal content online, and the transposition of sector-specific interventions such as the Directive on Copyright in the Digital Single Market, the recently amended Audiovisual Media Services Directive and, once adopted, the Terrorist Content Regulation.

142. The Commission would also continue to facilitate the coordination of self-regulatory measures targeting certain types of illegal activities, such as the dissemination of illegal hate speech, terrorist content, or dangerous or counterfeit products. Further action could focus in particular on additional self-regulatory initiatives, which are naturally limited to those services participating on a voluntary basis and offer only limited means of enforcing or monitoring results. Courts would continue to interpret the obligations of new digital services against the framework of existing EU law as regards the concepts of ‘information society services’ or ‘intermediary services’ in the E-Commerce Directive.

143. In the absence of further EU legislation, and subject to enforcement of the current legal framework, legal fragmentation in areas not yet subject to sector-specific legislation is likely to increase. Already today, a number of Member States, such as Germany, Austria, Denmark and France, have adopted or are in the process of adopting new laws to regulate digital services. A patchwork of national measures would not effectively protect citizens, given the cross-border and international dimension of the issues.

144. The proliferation of illegal goods sold online and the dissemination of illegal content would likely continue. At the same time, no harmonised safeguards would be established for protecting users’ fundamental rights or guarding against the over-removal of legal content. Tools for understanding and mitigating cross-sectoral societal concerns and the economic impact of information ‘acceleration’ online would remain limited to incidental and incomplete experiments by researchers and civil society.

145. A notable inherent risk of the baseline scenario is the ongoing rapid evolution of the digital environment itself. Companies are setting and enforcing the rules themselves, driven by their commercial interests and without consistently addressing the societal concerns inherent to the digital transformation they are enabling. The ever-growing information asymmetry between online services and their users or the authorities is already making it very difficult to enforce rules online and to supervise the evolving challenges and risks.

146. In the baseline scenario, there is no palpable indication that the trend of increased availability of illegal content, goods or services offered online could be curbed, in particular where sector-specific legislation is absent. While some platforms will continue to deploy measures according to their own policies, others, in particular smaller players, will continue to be dissuaded by the lack of legal certainty. Furthermore, without harmonised standards on the responsibilities and actions expected of service providers, their approaches will consistently fail to offer a reliable due process standard for users’ rights. This will remain a particularly acute issue for very large platforms, where the information asymmetries and negotiation disparities with users are greatest, and where erroneous decisions are likely to be the most impactful.

147. Barriers for promising European Union companies to scale up in the single market would increase, entrenching the position of large online platforms and reducing the competitiveness of the internal market.

5.2.Description of the policy options

148. A wider set of options was considered at the scoping phase of the impact assessment, in particular in relation to: the obligations placed on online intermediary services, and in particular on online platforms; the liability regime of online intermediaries; their supervision across Member States; and a longer list of issues flagged in the European Parliament’s own-initiative reports on the Digital Services Act. Discarded options are presented in more detail in section 5.3 below.

149. In addition to the baseline, three packages of options are retained for assessment. Each includes a different package of harmonising measures for the due diligence obligations on service providers and a regulatory supervision system appropriate to enforce those measures, as well as updates to the liability regime for online intermediaries. Each option is constructed to complement, but not amend, sector-specific legislation, and assumes the continuation and reinforcement of self-regulatory and voluntary efforts compared to the baseline. All options preserve and follow the core principles and provisions of the E-Commerce Directive, including the internal market principle for the supervision of digital services, the approach to the liability exemption for online intermediaries, and the prohibition of general monitoring obligations or general obligations on online intermediaries to seek facts or circumstances indicating illegal activities. Options 2 and 3 make some amendments to the application of the liability regime.

150. The three retained options are the following:

1.Limited measures against illegal activities, laying down procedural obligations for online intermediaries, and in particular online platforms, to tackle illegal activities, protect users’ fundamental rights and ensure transparency. This option would also enhance the cooperation mechanisms for authorities to resolve cross-border issues related to the supervision of the rules.

2.Fully harmonised measures to incentivise actions by service providers, to enhance transparency and to address a wider set of emerging risks by empowering users. The enforcement and cooperation mechanism would be enhanced with the appointment of a central coordinator in each Member State.

3.Asymmetric measures with stronger obligations for very large online platforms, further clarifications of the liability regime for online intermediaries and an adapted EU governance system to supervise the new obligations on very large online platforms.

Table 2 Summary of options considered in addition to the baseline

Obligations on online intermediaries, in particular online platforms

-Option 1 (Limited measures against illegal activities): Due diligence obligations for a fit and proper operation, including notice and action, know your business customer, transparency of content moderation, cooperation with authorities, and clear terms and conditions including respect for fundamental rights. Sector-specific interventions through self-regulatory measures.

-Option 2 (Full harmonisation): Due diligence obligations, including notice and action, know your business customer, transparency of content moderation, cooperation with authorities, and clear terms and conditions including respect for fundamental rights, as well as transparency towards users on advertising. Sector-specific interventions through self-regulatory measures.

-Option 3 (Asymmetric measures and EU governance): The same due diligence obligations as in Option 2, plus enhanced responsibilities for very large online platforms to mitigate systemic risks: e.g. reporting and data access for researchers and regulators, independent systems audits, appointment of a compliance officer, accountability of executive boards, and participation in co-regulatory efforts to mitigate emerging risks and report on outcomes.

Liability of intermediaries and injunctions

-Option 1: Baseline (rely on case law).

-Option 2: Remove disincentives for services to take action; harmonise conditions for court and administrative orders for the removal of illegal content.

-Option 3: Clarifications for new types of services in the Internet stack not clearly fitting in the categories of the E-Commerce Directive; clarification of where a service cannot benefit from the liability exemption; remove disincentives for platforms to take action; harmonise conditions for court and administrative orders for the removal of illegal content and for data requests.

Supervision

-Option 1: Enhanced administrative cooperation (digital clearing house).

-Option 2: Central 'coordinator' in each Member State; digital clearing house.

-Option 3: Sub-option 3.A: EU Board as an advisory committee formed of representatives of the Digital Services Coordinators from Member States, with Commission powers to apply sanctions. Sub-option 3.B: EU Board as a decentralised agency with investigatory and sanctioning powers. Digital clearing house.

Option 1 – Limited measures against illegal activities

151. The first policy option establishes a set of due diligence obligations for tackling illegal activities online, essentially building upon the Recommendation of 2018 and the E-Commerce Directive. The measures apply to any type of illegal activity, as defined in EU and national law. The core elements of the due diligence obligations include the following:

-Notice and Action – Obligation to establish and maintain an easy-to-use mechanism for notifying any illegal content, goods or services offered through online platforms, as well as other hosting services, in accordance with harmonised standards. This is coupled with an obligation to inform users when their content is removed, including when the removal follows an assessment against the company’s terms of service, as well as specific actions regarding repeat offenders. The information obligations are accompanied by an obligation to put in place an accessible and effective complaint and redress mechanism supported by the platform, and the availability of an external out-of-court dispute mechanism.

-Know Your Business Customer (‘KYBC’) – Online platforms that facilitate transactions between traders and consumers have an obligation to collect identification information from traders to dissuade rogue traders from reaching consumers.

-Transparency obligations – Regular transparency reporting on the measures taken against illegal activities and their outcomes, including removal rates, complaints and reinstatements of content, as well as transparency on the use and functioning of automated tools for content moderation, where applicable.

-Cooperation obligations – Obligations to cooperate with organisations designated as trusted flaggers by applying fast-track procedures for notices.

-Fundamental rights standards in terms of service – This includes the obligation to state clearly in their terms of service any restrictions they may apply to the use of their service, and to enforce these restrictions with due regard to fundamental rights.

152. Self-regulatory measures through codes of conduct would continue to be encouraged and supported by the Commission, and further measures could be launched as necessary.

153. Concerning the supervision of digital services, this option would build on the cooperation mechanisms established in the E-Commerce Directive and further develop a ‘Digital Clearing House’ to facilitate the exchange of information among Member States and channel requests regarding failures of a given service provider to comply with the applicable requirements. This would cover both information from the country of establishment on sanctions imposed, and requests from authorities in other Member States. Member States can designate one or several authorities competent for supervising the new obligations.

154. The scope of the due diligence obligations would be extended to all services targeting the European Union, regardless of their place of establishment. For supervising and enforcing the due diligence obligations, a requirement for a legal representative in the Union would be imposed on services with a significant number of users in one or several Member States.

Stakeholders’ views

There is a strong call for action across all categories of stakeholder groups and a consensus that certain responsibilities (i.e. legal obligations) should be imposed on online platforms. For instance, a large majority of stakeholders who responded to the public consultation want all platforms to be transparent about their content policies, support notice and action mechanisms for reporting illegal activities, and want professional users to be required to identify themselves clearly (90%, 85% and 86% respectively).

As regards the nuances between the different stakeholder groups on due diligence obligations, the general public, online intermediaries and civil society organisations especially advocated a harmonisation of notice and action procedures across the EU, while businesses called for the establishment of minimum information requirements for a notice to be actionable. Civil society organisations and the general public stressed the importance of human moderators. Furthermore, most contributions from media and audiovisual associations argued for the need for clear policies against ‘repeat infringers’, and for regulating the notion of ‘trusted flaggers’. The majority of retail associations highlighted the need for platforms to inform consumers who have previously bought an illegal or dangerous product that they have been exposed to illegal goods. Concerning online marketplaces, many stakeholder groups flagged the need to verify sellers in order to provide transparency to consumers and to increase the efficiency of enforcement. Rights holders and brands in particular stated that they incur considerable costs in identifying fake listings and in reporting the sale of counterfeit goods or other illicit products to platforms.

Regarding supervision, 85% of respondents who replied to the relevant question in the OPC on the DSA package (2020) think that online platforms cannot be trusted to sufficiently guarantee democratic integrity, pluralism, non-discrimination, tolerance, justice, solidarity and gender equality. A large majority of stakeholder groups called for improved cooperation between authorities in different Member States and highlighted the importance of data sharing with law enforcement authorities, in particular as regards rogue traders. Member States flagged the challenges faced by consumer protection authorities in effectively enforcing rules against illegal or shady business practices, and pointed to a low level of awareness among enforcement bodies and a lack of harmonisation of EU law. Some Member States further called for the current cooperation system to be revised and strengthened in order to avoid a fragmentation of the European digital market. They stated that the emerging patchwork of EU and national legislation makes it even more challenging for enforcers to oversee the European market. Online intermediaries emphasised the importance of coordination between national authorities and between all actors involved in the ecosystem.

Option 2 – Full harmonisation

155. This option would include the same due diligence obligations as those foreseen in option 1.

156. In addition, this option would impose on online platforms further transparency obligations towards their users, specifically regarding advertising systems: modernised transparency obligations covering all types of advertising placed on online platforms, not just commercial communications but also, for example, issue-based or political advertising. Such measures would include enhanced information to users distinguishing the ad from ‘organic’ content, information about who has placed the ad, and information on why they are seeing the ad (depending on the type of advertising – e.g. targeted or contextual – and, if applicable, the targeting information).

157. This option harmonises certain conditions for cross-border court or administrative orders to impose measures for the removal of illegal content, goods or services by intermediaries.

158. In this option as well, self-regulatory measures through codes of conduct would continue to be encouraged and supported by the Commission, and further measures could be launched as necessary.

159. Concerning the liability of intermediaries, option 2 would adapt the existing legal framework to remove disincentives for services, in particular online platforms, to take voluntary measures to address illegal activities: the intervention would clarify that such measures do not, in themselves, remove intermediaries from the scope of the liability exemptions.

160. Regarding the supervision of digital services, this option would complement the first option’s Digital Clearing House by requiring Member States to designate a supervisory authority as a central Digital Coordinator. The Digital Coordinator would be tasked with ensuring the coherence of supervision and enforcement across the different authorities in its Member State, not least as regards the capabilities for supervising the additional obligations related to algorithmic recommender and advertising systems, and with providing the cooperation interface for smoother cross-border assistance through the Digital Clearing House.

Stakeholders’ views

In addition to the broad convergence around the core due diligence obligations also presented under option 1, a variety of stakeholder groups voiced concerns about online advertising, more specifically the lack of user empowerment, especially as regards deceptive advertisements, and the lack of meaningful oversight and enforcement. Users demanded that reporting deceptive advertisements be made easier, both when the advertisement is encountered online and after the fraud is discovered by the user. The measures most frequently identified as necessary relate to greater transparency regarding the identity of the advertiser, how advertisements are personalised and targeted, and the actions taken by ad intermediaries to minimise the diffusion of illegal ads and activities. Implementing features that explain why certain advertisements are shown to users was considered a good practice to build upon and a means to empower users.

Some intermediaries, academic institutions and civil society organisations stated that the current liability regime creates disincentives to act, and called for clarification to stimulate voluntary preventive measures to detect illegal content. The absence of incentives in particular is seen as counter-productive in the fight against illegal activities online. Start-ups strongly supported the removal of disincentives for voluntary measures and stressed that this would be a very important safeguard for smaller online platforms and would incentivise businesses to take voluntary action. Start-ups converged on the view that illegal content should be tackled by all online platforms regardless of their capacity, whereas harmful content should not fall under this regime. They favour requiring all platforms to introduce clear terms and conditions and to develop best practices. Consumer organisations strongly called for a special liability regime for online marketplaces that would make them directly or jointly liable where they exercise a predominant influence over third parties, or where the platform fails to properly inform consumers or fails to remove illegal goods or misleading information (assessed in Annex 9).

Online intermediaries generally considered that any new measure should avoid being overly prescriptive as regards the use of specific tools or technologies in the context of content moderation. In this context, stakeholders, especially civil society and digital rights associations, warned against monitoring requirements and the use of automated tools for tackling illegal or harmful content, goods and services, owing to significant risks to citizens’ fundamental rights, the right to privacy and freedom of expression. 82% of the stakeholders who answered the relevant question in the public consultation support high accuracy and diligent control mechanisms, including human oversight, when automated tools are deployed for detecting content or accounts. Start-ups, scale-ups and smaller platforms pointed out that automated tools are also very costly to develop and maintain, and see this as a significant barrier to market entry.

Member States stated that while the cooperation between authorities and larger service providers has produced some good results, a more formal regulatory framework and an update to the current legislation are desired. Many Member States pointed to the risks stemming from the inability to provide effective surveillance and enforcement over the global digital services environment. Member States also warned that the digital single market should not be overregulated. Medium-sized and smaller companies, as well as business associations, flagged the fragmented state of the digital single market as a burden in providing digital services, especially when expanding to one or more other Member States. They mentioned in particular the requirement to have a legal representative or establishment in more than one Member State, and the differing procedures and points of contact for obligations to cooperate with authorities. When asked what governance arrangements would lead to an effective system for supervising and enforcing rules on online platforms in the EU, in particular as regards the intermediation of third-party goods, services and content, 81% of stakeholders called for a cooperation mechanism within Member States across the different competent authorities responsible for the systematic supervision of online platforms and for sectorial issues. 80% of respondents stated that a cooperation mechanism would need to provide swift procedures and assistance across national competent authorities in different Member States. National authorities are also in favour of reinforced cooperation mechanisms, though some call for assessing the effectiveness of a European agency. Some civil society organisations emphasised that robust and effective enforcement mechanisms for regulatory oversight are absolutely necessary, in particular to foster coordination between national authorities and to address the lack of transparency and the inconsistencies within procedures.

3.Asymmetric measures and EU governance

161The third option retains all the components of Option 2, but includes an asymmetric regime, targeting those very large platforms where the biggest audiences are reached – and, potentially, the most severe harms are caused.

162Very large platforms represent the highest societal and economic risks because of their significant reach among citizens and traders in the EU. Therefore, the definition of very large platforms in this option is based on the number of users, as a direct and objective proxy for their reach and potential impact. The threshold is set at 45 million monthly users from the EU, the equivalent of 10% of the EU population. Available data shows that the largest online platforms captured by this threshold correspond to the services considered by stakeholders and academics to represent the highest societal and economic risks, and typically have a pan-European presence. Also, the providers of these online platforms generally have a high turnover and/or market capitalisation value. The platforms’ reporting obligations and the Digital Services Coordinators’ powers of enquiry will ensure that the data on the number of users is available to enforce the enhanced obligations. As explained in more detail in Annex 4, other alternative and cumulative criteria have also been assessed and discarded for the purposes of the definition of very large online platforms.
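The threshold arithmetic can be checked with a one-line calculation. Note that the EU-27 population figure of roughly 447 million (2020) used below is our assumption for illustration, not a number taken from this document.

```python
# Illustrative check: 10% of the EU-27 population (about 447 million in
# 2020; this population figure is an assumption, not from the document)
# rounds to the 45 million monthly user threshold used in Option 3.
EU27_POPULATION = 447_000_000

threshold = round(EU27_POPULATION * 0.10, -6)  # round to the nearest million
print(int(threshold))  # 45000000
```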

163The additional set of enhanced obligations on very large online platforms reaching a significant number of Europeans is designed in proportion to the systemic impacts and risks these platforms represent for society and the business environment, as well as to their capacities. Their due diligence obligations are obligations of means, not no-fault obligations of result. These enhanced obligations are necessary to secure compliance with the rules ensuring the safety of citizens and the prevention of deceptive and fraudulent commercial practices online. This includes:

-obligations to maintain a risk management system, including annual risk assessments determining how the design of the service, including its algorithmic processes, as well as the use (and misuse) of the service, contributes to or amplifies the most prominent societal risks posed by online platforms. This is followed by an obligation to take proportionate and reasonable measures to mitigate the detected risks, and the risk management system is subject to regular independent audits;

-enhanced transparency and reporting obligations with regard to content moderation, content amplification and online advertising activities at the request of competent supervisory authorities;

-user-facing transparency of content recommender systems, enabling users to understand why, and influence how, information is presented to them;

-obligations to ensure access to data for researchers for investigations into the evolution of risks;

-maintenance of, and broad access to, ad archives;

-a renewed co-regulatory framework, with participation in adaptive and responsive codes of conduct to mitigate emerging risks, coupled with an obligation to report on outcomes, and participation in crisis management protocols responding to extraordinary situations where risks manifest online.

164Option 3 includes, apart from the removal of disincentives to voluntary actions, as per Option 2, further clarifications of the liability regime for online intermediaries to ensure legal certainty and innovation. It addresses the fragmentation stemming from the different national approaches to the liability exemptions and preserves the principle of conditional liability exemption. To afford more legal clarity in grey areas of interpretation concerning online platforms, it also specifies the conditions under which such services are truly intermediaries.

165Similar to option 2, option 3 harmonises certain conditions for cross-border court or administrative orders to impose measures for the removal of illegal content, goods or services by intermediaries. Additionally, it includes obligations to notify suspicions of criminal offences, where an intermediary service becomes aware of any information giving rise to such suspicions 110 . Furthermore, it also includes conditions for cross-border orders for competent authorities to access data necessary for supervising the underlying services intermediated by online platforms.

166For the supervision of the obligations, next to the Digital Clearing House and the national Digital Services Coordinators, an EU Board, including the participation of the national Digital Services Coordinators, enhances the governance system, which is particularly necessary for ensuring the supervised risk management approach for very large platforms. This system ensures in particular that systemic problems posed by those platforms, with their EU-wide impact, are appropriately addressed through EU supervision, with sufficient expertise and appropriate competencies, based on the clarified powers of the host Member State and rules for cooperation with the other Member States. It ensures appropriate assistance from other Member States and the Commission to the Member States in charge of supervising very large platforms. Building on the enhanced data access obligations for very large platforms, this includes in particular technical assistance for complex investigations related to algorithmic systems or language-specific issues. This system also provides for fast information channels for all Member States where the effects of platforms’ content moderation decisions are felt. Further, it provides for an escalation system under which the Commission can supervise and take direct enforcement measures against very large platforms. Sanctions applied would be proportionate to the severity of the systemic non-compliance.

167This option considers two approaches, distinguished by the legal form of the EU Board:

-Sub-option 3.A: the EU Board is established as an ad hoc independent advisory group, advising Digital Services Coordinators and the Commission on supervisory and enforcement issues, including those related to very large platforms (inherently present across Member States).

-Sub-option 3.B: the EU Board is established as an EU body with legal personality, supported by a secretariat and, in addition to the powers under sub-option 3.A, it can also adopt binding decisions. 

Stakeholders’ views

Whilst there is a general call, especially among citizens, for establishing as much transparency as possible, most stakeholder groups, especially business organisations and start-ups, stated that not all types of legal obligations should be placed on all types of platforms. Press publishers, for example, state that due diligence obligations should only concern very large online platforms, and should not cover hosting services such as comment sections on newspapers’ websites. A majority of stakeholder groups, including business associations, academic institutions and the general public, recognised that not all platforms should be required by law to cooperate with national authorities, but that platforms at particular risk of exposure to illegal activities by their users should maintain a system for assessing the risk of exposure to illegal content or goods, be required to respond systematically to requests from law enforcement in accordance with clear procedures, and employ appropriately trained and resourced content moderation teams. 72% of respondents to the relevant question in the public consultation consider both independent system audits and risk assessments essential, especially when it comes to countering the spread of disinformation, as well as reporting and data access for researchers and regulators. How algorithmic systems shape online content is an area of concern for a wide range of stakeholders. Several stakeholders, among them citizens, civil rights organisations, academic institutions as well as media companies and telecommunication operators, pointed out the need for algorithmic accountability and transparency audits on very large platforms, especially with regard to how content is prioritised and targeted. Users should receive more information and have more control over the content they interact with, and digital rights associations think they should be able to opt out of micro-targeting and algorithmically curated content.

Academic institutions pointed to persistent difficulties in conducting research, and explained the difficulty of observing emerging issues and phenomena online, citing inconsistent access to relevant data. Some pointed to the need for a publicly disclosed ad archive, as well as independent auditing of ad systems. According to start-ups and SMEs, limiting some obligations to large players would ensure that the legal obligations are targeted where problems actually occur. Start-ups especially stressed that a ‘one-size-fits-all’ approach would be most beneficial for very large platforms, but could have detrimental effects on the medium-sized or smaller platforms and businesses at the core of the European digital ecosystem. They stress that their growth and evolution should not be hindered by disproportionate rules that impede the successful development of competing alternative services and business models. Online intermediaries acknowledged the possibility of more transparency, but warned against the possible implications of far-reaching measures in terms of compromising commercially sensitive information, violating privacy or data disclosure laws, and abuse by actors that could game their systems. Some online intermediaries considered that a transparency obligation could best be achieved through a requirement for regular reporting.

Start-ups, telecommunication operators and several other stakeholders, notably new types of services in the internet stack, such as cloud services, CDN and DNS services, as well as other technical infrastructure providers, called for clarifications in the liability regime of intermediaries, without challenging its basic principles. Small companies in particular deplored the lack of legal predictability with regard to voluntary measures they might take. They also called for due diligence obligations on hosting service providers, and proportionate and targeted measures.

Effective EU oversight is considered essential for compliance with the due diligence obligations by most stakeholder groups, especially telecommunications operators. Many stakeholder groups, but especially business associations and companies, considered that the degree of oversight should vary depending on the services’ obligations and related risks. The majority of stakeholder groups favoured a unified oversight entity to enforce rules on digital service providers (66% of the respondents to the relevant question in the public consultation). Authorities called for coordination and technical assistance at EU level for supervising and enforcing rules on online platforms as regards the intermediation of third-party goods, services and content.

Especially in the context of addressing the spread of disinformation online, regulatory oversight and auditing competence over platforms’ actions and risk assessments were considered crucial (76% of all stakeholders responding to the relevant question in the public consultation). Academic institutions as well as civil society organisations expressed concerns about the lack of adequate financial and human resources in the competent authorities tasked with the supervision of digital services. Many groups of stakeholders, especially civil society organisations defending digital rights, identified the need for interdisciplinary skills in a new oversight entity, particularly in-depth technical skills, including data processing and auditing capacities, which would allow for a reliable and thorough assessment of algorithmic abuses. The majority of the academics and civil society organisations defending fundamental rights that were consulted emphasised the need for strong, proportionate and foreseeable enforcement to hold platforms to their promises, and are in favour of a supervisory regulator or authority to reconcile opposing needs and potentially sanction repeated failures.

5.3.Options discarded at an early stage

168The options selection followed a funnelling methodology, exploring the widest spectrum of approaches. Several types of options were discarded earlier in this process, as explained below. In several cases, the assessment of the impacts on fundamental rights led to the discarding of these options because they did not ensure a fair balance in mitigating the risks.

169Continuing to regulate only through sector-specific approaches, as done e.g. for content infringing copyright, terrorist content, explosive precursors or audiovisual content: such approaches are important in addressing targeted issues in specific sectors or with regard to specific content. They are, however, limited in their ability to address the systemic, horizontal problems identified in the single market for digital services and would not comprehensively address the risks and due process challenges raised by today’s online governance. Ultimately, this option was discarded for four main reasons: (i) the E-Commerce Directive is horizontal in nature and its revision requires a horizontal approach; (ii) the identified risks and problems are systemic and lead to cross-sectoral societal concerns; (iii) sector-specific legislation can lead to inconsistencies and uncertainties; and (iv) only horizontal rules ensure that all types of services and all categories of illegal content are covered.

170Fundamental changes to the approach on the liability regime for online intermediaries: Annex 9 presents a series of considerations on different theoretical models of liability for intermediaries. The liability exemption for online intermediaries is a cornerstone of the fair balance of rights in the online world 111 . Any other model placing more legal risk on intermediaries would potentially lead to severe repercussions for citizens’ freedom of expression online and for traders’ ability to conduct their businesses online and reach consumers. It would also be prohibitive for any new business, reinforcing the position of the very large players, which are able to sustain and, to a certain extent, externalise such costs. Conversely, options significantly lowering the standard for hosting services to qualify for the liability exemption would severely affect safety and trust in the online environment.

171Changes to the single market principle set in the E-Commerce Directive and the requirement for the country of establishment to supervise services would inherently undermine the development of digital services in Europe, allowing only the very large players to scale across the single market. The single market principle is also the optimum model for ensuring that rules can effectively be enforced against services. The evaluation of the E-Commerce Directive and all other available evidence shows that the single market principle has been instrumental for the development of digital services in Europe. This principle increased legal certainty and reduced compliance costs significantly, which is crucial for smaller services in particular.

172Change to the prohibition on general monitoring obligations: the provision is core to the balance of fundamental rights in the online world. It ensures that Member States do not impose general obligations which could disproportionately limit users’ freedom of expression and freedom to receive information, or could burden service providers excessively and thus unduly interfere with their freedom to conduct a business. It also limits online surveillance and has positive implications for the protection of personal data and privacy. Allowing such a disproportionate burden would likely lead to numerous erroneous removals and breaches of personal data, resulting in extensive litigation. Options for changes to general monitoring obligations were considered, and then discarded for non-compliance with the balance of rights described here.

173Laying down prescriptive rules on content which could potentially be harmful to certain audiences, but which is not, in itself, illegal. The Impact Assessment focuses on illegal information and activities and on processes, tools and behaviours which might create or increase harms (i.e. recommender systems and other design choices for accelerating and selecting information flows). It is understood that content which is not illegal cannot be subject to the same removal obligations as illegal content.

174Alternative options for the governance structure:

-An expert group including Digital Services Coordinators, managed by the Commission: this would at most ensure limited and less structured information sharing among national authorities.

-Assigning the competences to an existing body. Following an initial screening of the competences, capabilities and mission of the BEREC Office, ENISA, the Cybersecurity Competence Centre, EDPB, EDPS, Europol, no appropriate synergies were identified.

6.What are the impacts of the policy options?

175The policy options were evaluated against the following economic and societal impacts, with a particular focus on impacts on fundamental rights.

Table 3 Summary of impacts for each option considered (compared to the baseline)

Impacts assessed | Baseline | Option 1 | Option 2 | Option 3

Economic impacts
Functioning of the internal market and competition | ~ | + | ++ | +++
Costs and administrative burdens on digital services | ~ | > | >> | >> 112 / >>> 113
Competitiveness, innovation, and investment | ~ | + | ++ | +++
Costs for public authorities | ~ | > | >> | >>>
Trade, third countries and international relations | ~ | + | + | +

Social impacts
Online safety | ~ | + | ++ | +++
Enforcement and supervision by authorities | ~ | + | ++ | +++

Fundamental rights (as laid down in the EU Charter)
Freedom of expression (Art 11) | ~ | + | ++ | +++
Non-discrimination, equality, dignity (Art 21, 23, 1) | ~ | + | ++ | +++
Private life and privacy of communications (Art 7) | ~ | + | + | ++
Personal data protection (Art 8) | ~ | ~ | ~ | ~
Rights of the child (Art 24) | ~ | + | ++ | +++
Right to property (Art 17) | ~ | + | + | +
Freedom to conduct a business (Art 16) | ~ | + | + | +
User redress | ~ | + | ++ | ++

Overall | ~ | + | ++ | +++

6.1.Economic impacts

Functioning of the internal market and competition

176All options considered would have an overall positive effect on the functioning of the single market, but there are notable differences between options. They all follow and build on the cornerstone of the single market for all digital services set in the E-Commerce Directive, and reinforce to varying extents the cooperation of national authorities in supervising the new obligations considered in each option.

177The first option would in particular support the access to the single market for European Union platform service providers and their ability to scale-up by reducing costs related to the legal fragmentation rapidly escalating across Member States (as per section 6.1.2 below). It would also improve legal clarity and predictability by increasing transparency about content moderation measures and business users of online platforms through harmonized rules, as well as improve the cooperation between Member States in addressing cross-border issues.

178The second and the third option would in addition further establish trust across Member States through an agile cooperation mechanism for cross-border concerns. This would add legal predictability for intermediary services active in several Member States. Importantly, it would also facilitate the effective enforcement of rules throughout the single market: the Member State of establishment is generally best placed to take coercive measures against a service provider if need be, while the other Member States would equally have an effective channel for making sure that the particular challenges of their state are appropriately addressed.

179In addition, the third option would couple the cooperation mechanism with an EU-level body, allowing for fully coordinated actions and addressing issues common to several Member States in a consistent and more efficient way. Both sub-options address this and provide for different mechanisms for ensuring EU-level enforcement of rules where very large platforms are concerned, ensuring maximum consistency across the single market.

180In a model reflecting only the concerns related to legal fragmentation, addressed in all three options, the legal harmonisation of obligations across the single market should lead to an increase of cross-border digital trade of 1 to 1.8% 114 . This is estimated to be the equivalent of an increase in turnover generated cross-border of EUR 8.6 billion and up to EUR 15.5 billion 115 .
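As a rough consistency check of the figures above (our own arithmetic, not part of the model cited in footnotes 114 and 115), both ends of the quoted ranges imply a baseline cross-border turnover of roughly EUR 860 billion:

```python
# Implied baseline cross-border turnover from the figures in paragraph 180.
# The division below is a back-of-the-envelope illustration, not the
# methodology of the underlying study.
low_gain, high_gain = 8.6e9, 15.5e9   # EUR: quoted turnover increases
low_rate, high_rate = 0.010, 0.018    # 1% and 1.8% trade increases

baseline_low = low_gain / low_rate     # EUR 860 billion
baseline_high = high_gain / high_rate  # about EUR 861 billion
print(f"Implied baseline: EUR {baseline_low / 1e9:.0f}-{baseline_high / 1e9:.0f} billion")
```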

181With regard to effects on competition, all three options are proportionate and do not impose dissuasive requirements on service providers. By harmonising the legal requirements across Member States, they all establish a level playing field for emerging services across the single market. The third option would establish asymmetric obligations on very large online platforms with a systemic impact in Europe, making sure that smaller, emerging competitors are not affected by disproportionate obligations, while ensuring that certain systemic policy concerns are adequately addressed by very large online platforms. The asymmetric obligations would lead to higher costs for approximately 20 of the largest platforms in the EU and the world, in terms of both users reached and turnover. These enhanced obligations are necessary to secure online safety and to fight illegal activities efficiently. They would also lead to significant improvements in the service itself, not least in ensuring a safer environment for its users and respect for their fundamental rights; in turn, this will likely benefit the platform itself once compliant with the requirements. However, smaller companies could also take similar measures on a voluntary basis, and would be invited to be part of the co-regulatory framework (e.g. on content moderation and crisis management, and on advertising transparency).

Competitiveness, innovation and investment

182With the additional legal certainty, all three options are expected to have a positive impact on competitiveness, innovation and investment in digital services, in particular on European Union start-ups and scale-ups proposing platform business models, but also, to varying extents, on the sectors underpinned and amplified by digital commerce. The legal certainty provided by the intervention would likely encourage investment in European Union companies.

183The first option would primarily affect online intermediaries established in Europe by cutting the costs of the evolving legal fragmentation and allowing services to repurpose resources in growing their business and, potentially, investing in innovative solutions. It would in addition create a true regulatory level playing field between European Union-based companies and those targeting the single market without being established in the EU.

184The second and third options would bring stronger improvements to the cooperation mechanisms across Member States and harmonise a wider spectrum of provisions, including transparency requirements in online advertising. They would thus affect a wider spectrum of digital services and limit current and emerging costs of legal fragmentation, compared to the first option and the baseline scenario.

185Further, all three options would preserve the equilibrium set through the conditional liability exemption for online intermediaries, ensuring that online platforms are not disproportionately incentivised to adopt a risk-averse strategy and impose too restrictive measures against their business users (and citizens using their services). This is particularly sensitive in the recovery phase of the COVID-19 crisis, where sectors such as tourism, accommodation, food and transport require predictability and a reinforcement of their online presence.

186Overall, the three options will lead to better conditions for the underlying digital services, which will result in more choice for both businesses and consumers. This will cascade into an increase in e-commerce, in particular cross-border 116 , with positive impacts on the creative industries, manufacturing, information services and software, etc. Consequently, all three options will have a positive effect on the competitiveness of legitimate business users of online platforms, manufacturers or brand owners, by reducing the availability of illegal offerings such as illegal products or services (and, of course, reducing harms to consumers, as per 6.2.1 below). In addition, the legally guaranteed availability of internal and external complaint and redress mechanisms would afford better protection against erroneous removal and limit losses for legitimate businesses and entrepreneurs.

187The expected macroeconomic impact of Option 1, once fully implemented, amounts to an increase of 0.3% of GDP benchmarked against 2019 values – i.e. a total of EUR 38.6 billion. 117

188Options 2 and 3 would achieve better results by removing disincentives for platforms established in the Union to take appropriate voluntary measures, both in addressing illegal activities and information online at a greater scale and in safeguarding users’ rights. The macroeconomic impacts of Option 2 are estimated at a 0.4% increase of GDP (EUR 61.8 billion) 118 .

189Option 3 would produce better results than option 2, in applying asymmetric obligations to the largest online platforms where a large share of the economic loss occurs. A risk management approach would address in a targeted manner areas of abuse and systemic failures. It would in addition afford enhanced transparency on key processes related to the prioritisation of information which reaches consumers through online advertising and recommender systems. This would build further resilience into the system, giving more choice and agency to users and stimulating an innovative and competitive environment online. The macroeconomic impacts of Option 3 are estimated at a 0.6% increase of GDP (EUR 81.7 billion). An alternative model 119 following a different methodology estimates a EUR 76 billion increase in EU GDP over the 2020-2030 period for a package of measures broadly equivalent to Option 3.

Costs and administrative burdens on digital services

190All three options incur costs for online intermediaries. However, these costs represent a significant reduction compared to those incurred under the present fragmented and uncertain, and still evolving, corpus of rules.

191At company level, in a simple model quantifying only the harmonising rules common to all three options, the legal intervention would already close the single market gap, with a cost reduction of around EUR 400.000 per annum for a medium enterprise assumed to be present in three Member States. Compared to projected scenarios in which the legal fragmentation becomes more acute, the intervention would lead to savings of EUR 4 million for a company of the same scale present in 10 Member States, and EUR 11 million in an extreme scenario of fragmented rules in each of the 27 Member States of the Union. The cost savings are most impactful for micro and small enterprises, for which the current fragmentation is prohibitive for offering services in more than two Member States. 120  
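The three fragmentation scenarios quoted in the paragraph above can be tabulated as follows; the per-Member-State breakdown is our own illustrative division, not a parameter of the Commission's cost model.

```python
# Savings quoted in paragraph 191 for a medium enterprise, by scenario.
# Dividing by the number of extra Member States (beyond the first) is an
# illustrative assumption, not the source's methodology.
scenarios = {
    # Member States covered: annual savings in EUR
    3: 400_000,       # current fragmentation, presence in 3 Member States
    10: 4_000_000,    # more acute fragmentation, presence in 10
    27: 11_000_000,   # extreme scenario: distinct rules in all 27
}

for n_ms, savings in scenarios.items():
    per_extra_ms = savings / (n_ms - 1)
    print(f"{n_ms:>2} Member States: EUR {savings:>10,} "
          f"(~EUR {per_extra_ms:,.0f} per additional Member State)")
```

The implied per-Member-State saving rises from about EUR 200.000 in the three-Member-State scenario to over EUR 400.000 in the more fragmented scenarios, consistent with fragmentation costs escalating as rules diverge.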

192Direct costs for the main due diligence obligations are common to all three options and depend to a large extent on the number of notices and counter-notices received by a platform and the cases escalated to an out-of-court alternative dispute resolution system. Estimates are based on an initial scale of notices received, but can vary considerably.

193For out-of-court alternative dispute resolution systems, there is a level of uncertainty, as no reliable data or precedent allows an estimate of the volume of complaints that would be escalated. The existence of alternative dispute resolution mechanisms in all Member States would, however, facilitate access to such mechanisms and likely add negligible costs compared to the current system.

194In addition to these costs, for the second option, services would also incur some technical design, maintenance and reporting costs, for the additional information and transparency obligations presented in paragraph 156. However, these are expected to be marginal and absorbed into the general operations and design costs of online platforms and ad intermediaries, respectively. As these measures are intimately related to the type of service offered and design choices of the service itself (e.g. development and use of recommender systems, ad intermediation, marketplaces intermediating services and sale of goods), micro-enterprises would not be exempted from scope.

195In option 2, costs related to information requirements would equally be reduced rather than increased, compared to the baseline, by streamlining and harmonising the requirements, thereby preventing further legal fragmentation and possible compliance requirements with very divergent national systems.

196In addition, the third option includes a series of potentially significant costs which are limited to very large online platforms. First, the enhanced transparency and reporting obligations for content moderation, recommender systems and online advertising would bring technical and maintenance costs which would be absorbed into the services’ operations. A fixed cost for organising risk assessments and annual audits would also be incurred by very large platforms. Risk mitigation costs will, however, vary to a large extent depending on the initial design of the systems and the severity of the risks. Overall, the additional costs for such service providers range from EUR 300.000 to EUR 3.500.000 per annum for the additional obligations, excluding potential variable costs for risk mitigation measures.

197The table below presents an overview of the cost estimates at company level for each option.

Table 4 Estimates of costs at company level 121

Type of obligation

Option 1

Option 2

Option 3

1.Notice and action obligations and information to users, absorbing complaints and redress costs

For all hosting service providers:

Highly dependent on the volume of notices received, where personnel costs are the most notable expenditures. Estimates range from a one-off maximum cost of EUR 15.000 for establishing a notice-and-action technical system and light maintenance, to EUR 1 million for a volume of 200 notices received per day, and EUR million for 3000 notices received per day. While there are some economies of scale with the increase of the number of notices, these are limited. These are indicative costs and, for most companies, they do not represent an additional cost compared to current operations, but require a process adaptation in the receipt and processing of notices and streamline costs stemming from fragmented obligations currently applicable.

2.Legal representative

For some online intermediary services not established in the EU and with a significant user base in the EU:

Estimated between EUR 50.000 to EUR 550.000 per annum, depending on the FTE necessary to complete the tasks. These costs can be partially or fully absorbed, for most companies, in existing requirements for legal representatives.

3.Transparency reporting

For all intermediary services (exempting small and micro-enterprises):

0.1 and up to 2 FTEs and one-off development data collection, absorbed in the development of systems

4.User-facing transparency of advertising and recommender systems

\

Online platforms (exempting small and micro enterprises)

Costs absorbed in the routine development of systems

Data collection and availability as regards information on the functioning and targeting criteria, when applicable, by and large absorbed into GDPR compliance, with minor additional costs for up-front information publication

5.Risk management obligations

\

\

Very large platforms:

Risk assessments: estimated between EUR 40.000 and EUR 86.000 per annum

Audits: between EUR 55.000 and EUR 545.000 per annum

Risk mitigation measures are variable costs and can range from virtually no cost to significant amounts, in particular when the platforms’ systems are themselves causing or exacerbating severe negative impacts. The duration and level of expenditure for such measures will also vary over time. Similarly, participation in codes of conduct and crisis protocols requires attendance at regular meetings as a direct cost, but the cost of the resulting targeted measures can vary.

6.Ad archives

\

\

Very large platforms (that run advertisements on their platforms):

Up to EUR 220.000 for building APIs to give access to data, and for quality controls for data completeness, accuracy and integrity, and for system security and availability.

7.Compliance officer

\

\

Very large platforms

Estimated between 1 and 5 FTEs

SME test

198For a micro-enterprise, the costs of legal fragmentation appear prohibitive today: the modelled costs of providing services cross-border are higher than the maximum annual turnover of a micro-enterprise offering services in several Member States. The harmonised rules under all options would cut duplication costs for SMEs, as well as costs from legal risks, with regard to the harmonised provisions of each option.

199Since SMEs offering platform services do not reach a user base on the scale of very large online platforms, illegal activities conducted by their users would not have a comparable impact. There are exceptions to this. First, the user base of successful online platforms typically scales very fast; second, even smaller services can be instrumental in the spread of certain crimes online. On some occasions 122 , micro-enterprises can be aggressively targeted by perpetrators, not only causing societal harm, but also corrupting the legitimate value proposition of the digital service. Consequently, SMEs cannot be fully exempted from the minimum requirements for establishing and maintaining a notice and action mechanism under each of the options.

200Costs of the notice and action system are proportionate to the risks posed by each service: an average micro-enterprise receiving a volume of 50 notices per annum, out of which 5% would be the subject of a counter-notice procedure, would sustain a cost of approximately EUR 15.000 per annum. The introduction of standard, minimum requirements for notices, procedures and conditions, as well as reporting templates, should further decrease the expected costs for small companies, supporting them in tackling illegal content and increasing legal certainty in turn.

201Should the volume of notices increase exponentially, this would likely correspond to a generalised exploitation of the service for illegal activities. The costs for processing the notices could become prohibitive, but, conversely, a non-responsive service would likely bear legal consequences even under the baseline scenario, and would lose its legitimate users. A notice-and-action system can be a powerful support for legitimate businesses who intend to address illegal activities carried out by their users.

202Under all options, the additional transparency obligations are expected to be proportionate to the risks and capacity of each service provider and should be absorbed in the operations and design of the systems. However, such costs could in themselves be disproportionate for a small or micro-enterprise, and the risks such companies pose and the impacts they may have do not justify imposing such burdens on them.

203Option 3 specifically includes targeted obligations for very large platforms. These are not expected to be SMEs under any circumstance, as both the number of employees and the global turnover of such platforms are significantly higher than those of a medium-sized enterprise. However, thresholds for ‘very large platforms’ would be set proportionately to their reach in terms of the number of users in the Union, but would not exempt SMEs, by virtue of the risks and societal harms such services could cause.

Costs for public authorities

204The supervision and enforcement of the rules would be key in ensuring the success of the intervention. An appropriate level of technical capability within public authorities, robustly built over time, will ensure a correction of information asymmetries between authorities and digital services and the relevance of the public intervention in a sustainable model of online governance. From this perspective, any additional measures to mutualise resources and expertise, and establish sound IT infrastructures for cooperation can have a net positive effect in assisting all Member States in the medium- to long-term.

205Compared to the baseline, each of the three options should significantly cut the costs brought by the inefficiencies and duplication in the existing set-up for the cooperation of authorities (see driver 2.3.6). With regard to law enforcement, a smoother, more reliable cooperation with digital services, not least in processing requests, would improve the effectiveness and efficiency of their actions. Net cost reductions, however, may not be expected, since the volume of illegal activities online is far larger than the capacity of law enforcement authorities to investigate these offences.

206With the first option, national authorities would follow a clear, streamlined process for cross-border issues, with clear resolution and response. Member States where a large number of services are established are likely to need some reinforcement of capabilities. These costs will be attenuated, however, by the creation and use of a clearing house system for cooperation across authorities, which entails technical costs for development and maintenance (borne by the Commission), as well as running costs for the Member States’ appointed authorities to engage in the cooperation, either issuing or responding to requests. Information flows and data collected through the clearing house should significantly improve the ability of Member States to supervise the systemic compliance of services with the requirements.

207For the second option, a digital coordinator would need to be appointed in each Member State, interfacing with the other EU authorities and assuming a coordination role among the competent authorities in their country. While the coordinator would entail some costs, the efficiency gains are expected to outweigh them in every Member State: efficiency gains for the individual authorities through mutualisation of resources, better information flows, and straightforward processes for interacting with their counterparts across the single market, as well as with service providers.

208For the third option, an additional cost would be borne at EU level, creating further efficiency gains in the cooperation across Member States and mutualising some resources for technical assistance at EU level, for inspecting and auditing content moderation systems, recommender systems and online advertising on very large online platforms.

Table 5 Summary of costs for authorities for each option considered

Type of activity

Option 1

Option 2

Option 3

1.Supervising systemic compliance with due diligence obligations for all services (country of establishment)

Cost efficiencies: streamlined evidence and information for supervising platforms through the clearing house system.

Direct costs: varying from 0.5 FTEs up to 25 FTEs, depending on scale of services hosted 123

2.Supervision of enhanced obligations for online platforms – expenditures at MS level

\

\

Significant cost efficiencies through enhanced transparency obligations on platforms

Costs expected to fluctuate depending on inspections launched. For one inspection/audit, estimates between EUR 50.000 and EUR 300.000. 124  

Codes of conduct and co-regulatory framework: investment at EU level of 0.5-2 FTEs per initiative – absorbed in costs in section 3 below

3.Supervision and governance at EU level

\

\

Sub-option 3.A:

-European Commission: 50 FTEs + EUR 25 mil operational budget

-Member States: 0.5-1 FTE for participation in the Board

Sub-option 3.B:

-EU Board as a decentralised agency: 55 FTEs (operations and admin) + EUR 20 mil operational budget

-European Commission: 10 FTEs + EUR 10 mil. operational budget

-Member States: 0.5-1 FTE for participation in the Board

4. EU-level: for clearing house and coordination

Significant cost efficiencies expected from smoother, clearer cooperation processes

One-off: EUR 2 mil per annum over the first two years for technical development. Maintenance and additional development over the following 3 years of approx. EUR 500.000 in total

5.Law enforcement actions & public authorities requests (re. supervision of illegal activities online)

Cost efficiencies: streamlined cooperation processes for cross-border assistance; clear process for information requests to digital services and information obligations

Direct costs: no direct costs entailed by the measures, but no net reduction of costs expected, as volumes of illegal activities are consistently higher than law enforcement capacities

Trade, third countries and international relations

209All three options are expected to diminish illegal trade into the Union, both in relation to direct sellers and to sellers intermediated by online platforms.

210All options would require a legal representative in the Union and extend the scope of the due diligence obligations to service providers established outside the EU, thereby ensuring EU users’ rights are protected in the global online space. This is not expected to have a significant effect on legitimate platforms from third countries targeting the single market, with further proportionality incorporated by excluding very small, incidental providers. For most platforms, it is likely that legal representatives are already established as part of other legal requirements under EU legislation (e.g. the General Data Protection Regulation 125 (‘GDPR’)), which would absorb this cost to a large extent. In addition, compliance with EU rules could be a commercially beneficial trust signal for such providers.

211The intervention would inherently set a European Union standard in the governance of issues emerging on online platforms, both in relation to measures to mitigate risks and ensure online safety, and to the protection of fundamental rights in the evolving online space. Most international fora, including the G7 and the G20, as well as international organisations such as the UN, the OECD and the Council of Europe, have flagged such concerns, and other jurisdictions have taken, or are currently discussing, measures – including, among others, the US, Australia, Canada, India and New Zealand. Action in this field by the European Union will lead to enhanced cooperation and engagement with third countries in this context.

212The first option, more limited in the scope of measures, would still set a regulatory standard in particular on the due process and information requirements from platforms, encouraging a fundamental rights-centric approach. The second option would in addition firmly clarify the balance of rights set through the liability regime for online intermediaries, a controversial and politicised topic in some other jurisdictions, and would set a higher standard of transparency and accountability for online platforms. The third option would place the European Union in a leadership role, not least through establishing an EU-level body supporting the oversight of the largest, most impactful platforms, and establishing an important capability for auditing and investigating such platforms in a flexible manner, in anticipation of emerging risks.

213From an international trade perspective, the provisions are in line with the non-discrimination provisions in the GATS, as they follow objective and non-discriminatory criteria, regardless of the location of the headquarters or the country where the company historically originated. This is also the case in establishing whether, in option 3, service providers fall into the category of ‘very large online platforms’ as the scope is defined exclusively by the objective metric of number of users.

6.2.Social impacts

Online safety

214As a primary objective for the intervention, all three options contribute to an appropriate governance for ensuring online safety and the protection of consumers from illegal offerings.

215All options would significantly improve the baseline, making sure that all types of illegal content, goods and services can be flagged in a harmonised manner across the Union. This would ensure a coherent horizontal framework instead of the currently inconsistent approaches relying on the private policies set by online platforms or the regulatory and self-regulatory efforts in Member States or at EU level. It would also ensure that cooperation with law enforcement, national authorities and other trusted flaggers is appropriately accelerated, improving the ability of authorities to tackle cybercrimes and other online crimes. In certain cases, this would lead to positive effects on their ability to protect the right to life and security of individuals.

216The second and the third option would also stimulate online platforms to take additional measures, proportionate to their capability, adapted to the issues and illegal content they most likely host, and in full respect of fundamental rights. 12% 126 of the service providers responding to the open public consultation reported using automated systems for detecting illegal content they host, and the same percentage had policies against repeat offenders on their platform. Voluntary measures for tackling illegal content have proven effective at scale. At the same time, such measures remain prone to errors, both in under- and over-identifying content. The two options not only provide legal clarity for service providers to enforce their measures, but also establish the missing due process around such measures, setting stronger safeguards through transparency and accountability when private companies take such detection measures. The third option would, in addition, ensure a higher level of supervision of the effectiveness, as well as the pitfalls and errors, of the content moderation put in place by platforms, with a particular focus on very large platforms.

217The second and the third options would also tackle systemic risks posed by online platforms through the way they prioritise and accelerate the distribution of content and information. They would both correct information asymmetries and empower citizens, businesses and other organisations to have more agency in the way they interact with the environment and information intermediated by platforms. This would also put consumers in a better-informed position for making choices, be it in buying goods and contracting services, or simply in consuming information online.

218The third option, however, would include a much stronger accountability mechanism, taking into account the disproportionate influence of very large platforms specifically, ensuring that researchers and appropriately resourced competent authorities have access to relevant information allowing them to assess the measures platforms take in co-regulatory processes to address the risks.

219As the COVID-19 crisis and incidents of viral spread of illegal content have shown, crisis situations can manifest online, presenting systemic risks on platforms which reach millions of users and requiring coordinated interventions. The third option would also include a framework for establishing such cooperation in crisis situations through setting up crisis protocols, together with the appropriate checks and balances for both platforms and authorities.

Enforcement and supervision by authorities

220A first notable impact, already explained in section 6.2.1, is the improved ability of law enforcement and authorities to supervise and tackle online crimes.

221In addition, all three options entail an important, capital improvement as compared to the baseline, in establishing the competence for authorities to supervise not only the incidence of illegal activities online and systematic failures to respond to notices, but also the performance of the notice and action and broader moderation systems in protecting users and avoiding over-removal of legal content. They would all allow designated authorities to request appropriate interim measures where failures are observed, and eventually to apply proportionate and dissuasive sanctions for systematic non-compliance with the due diligence obligations. Ultimately, where all other means fail and where the systematic non-compliance has severe consequences threatening the life and security of persons, and following the decision of a court, blocking measures can be applied. The broad spectrum of measures will allow authorities to effectively supervise and enforce the rules, and would remain proportionate by applying gradually and allowing the service provider to take corrective measures to cease the infringement and, in any event, to make use of established appeal mechanisms. The reinforced coordination through the national Digital Coordinators in option 2, and the EU competence in option 3, would each significantly increase the coherence and capacity of authorities to supervise and calibrate the imposed measures.

222Importantly, the second and the third option each harmonise conditions for court and administrative orders requesting removal of content. This should facilitate the actions of the authorities and lead to better enforcement overall. In addition, option 3 further facilitates the ability of national authorities to supervise services (such as accommodation or transport services) offered through the intermediation of online platforms.

223The second option gives more agency to users through more robust transparency tools, and the third option sets the highest standard of supervision for all content moderation mechanisms, as well as online advertising and recommender systems.

6.3.Impacts on fundamental rights

224The protection of fundamental rights is one of the main concerns in the online environment, marked by the complexity of interests at stake and the need to maintain a fair balance in mitigating risks. This assessment played a core part in the consideration of the wider range of options and determined the discarding of several options. 127

225All three of the retained options are generally well balanced and are not expected to have a negative impact on fundamental rights. The main differences between the options are rather linked to the extent of their effectiveness in safeguarding fundamental rights and their ability to continue to offer a ‘future proof’ due process faced with the evolving risks emerging in a highly dynamic digital environment.

226All three options would also include a requirement for companies to adopt a fundamental rights standard when implementing the due diligence obligations set by the intervention. This would require services to assess and manage risks in a proportionate and appropriate manner.

227Where option 3 requires a regular risk assessment from very large online platforms, this equally includes an assessment of the way the platforms’ systems, or their use, affect the protection of fundamental rights such as freedom of expression, the right to private life, non-discrimination or the rights of the child. Consequently, platforms have to adapt the design of their systems and take appropriate measures to address significant risks, without prejudice to their business freedoms. Further, in the design of codes of conduct and crisis protocols under this option, such requirements will continue to apply, and appropriate checks and balances are to be set up, notably through reporting and transparency commitments from all participants, including the authorities involved, participation and scrutiny from civil society and academia, and, finally, supervision by the EU board and national authorities.

228The fundamental rights most clearly touched upon by the intervention are the following:

Freedom of expression (Art 11 EU Charter of Fundamental Rights)

229Content moderation decisions by private companies, be it in assessing legality or compliance with their own terms of service, can impede freedom of expression, in terms of the freedom to share information and to hold opinions, but also in terms of the freedom for citizens to receive information. While the sale of goods might be seen as less related to freedom of expression, speech can also be reflected in goods, such as books, clothing items or symbols, and restrictive measures on the sale of such artefacts can affect freedom of expression. In this context, it is important to underline that all three options will only require removal of content that is illegal. Nevertheless, the options also address the need to provide safeguards in the form of complaint systems and transparency requirements that will mitigate the negative consequences of services’ removal of content based on their own terms of service.

230None of the options include prior authorisation schemes, and they all prohibit Member States from establishing such requirements for digital services. Such measures can amount to a severe limitation of freedom of expression.

a.Mitigating risks of erroneously blocking speech

231All three options would add substantial improvements to the baseline situation, by imposing mandatory safeguards when users’ content is removed, including information to the user, a complaint mechanism supported by the platform, and an external dispute resolution mechanism. Coupled with transparency reporting and oversight of systematic compliance by authorities, these are key elements for providing the safeguards missing in the baseline and ensuring that users’ rights are respected and that users are empowered to defend themselves against erroneous sanctions and removals of their content.

232The Court of Justice has repeatedly confirmed that requirements for platforms to deploy automated content moderation ‘could potentially undermine freedom of information, since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communication’ 128 . At the same time, service providers use such tools, not least for enforcing their terms of service, with improving levels of accuracy, but also with significant challenges, including but not limited to the inability of such tools to accurately distinguish context-dependent content. 129  

233None of the options would require the deployment of such tools. Instead, all three options would preserve the prohibition of general monitoring obligations and would, in addition to the baseline, reinforce safeguards for users to seek redress following removals. Importantly, the second and the third option would extend these obligations to services established outside of the Union but targeting the single market.

234The second and the third option would also remove disincentives for European platforms to take measures for tackling illegal content, goods or services shared by their users by clarifying that this does not, in itself, place them outside of the liability exemption for intermediaries; such measures could include the use of automated tools.

235The third option would include an additional important safeguard where very large online platforms are concerned (and where the impacts of removals on users’ rights are most severe): it would impose enhanced transparency and reporting obligations on the process and outcomes of content moderation, including automated tools, and afford competent authorities inspection and auditing powers, opening systems to scrutiny also by researchers and experts. It would also explicitly include users’ freedom of expression among the considerations of the mandatory risk mitigation obligations, including as regards the way the very large platforms design and maintain their systems. This includes, for example, the design of their recommender systems, but also of their content moderation systems and tools. This would set the highest standard of protection and accountability, and maintain a flexible and vigilant capability to detect and mitigate risks as they emerge.

b.Addressing other chilling effects on speech

236Some evidence shows that highly violent online environments can have a chilling effect on speech, for instance where there is a proliferation of illegal hate speech or other forms of illegal content. Such a chilling effect has been reported, for example, to risk influencing individuals’ rights to political participation 130 . All options would empower users to report illegal content and support a safer online environment (see section 6.2.1 above).

c.Stimulating freedom to receive information and hold opinions

237All three options would affect the freedom to receive information by ensuring that legal content, of general interest to users, is not inadvertently removed, as explained in paragraphs 231 to 233 above. In fostering exchanges of information, this can also have spill-overs on users’ freedom of assembly and association.

238In addition, the second option would further empower users to better understand and control their online environment through transparency measures concerning recommender systems and online advertising. This is particularly important in allowing citizens to participate in democratic processes or empowering consumers to make informed choices. The third option would establish a higher standard of accountability for those platforms fostering a ‘public space’ for speech and commercial exchanges, by imposing asymmetric obligations for transparency and oversight of such systems, and providing for a more impactful and targeted effect.

User redress

239The three options would have a fundamental impact on users’ possibilities to challenge decisions by platforms, for citizens and businesses alike. This is one of the biggest impacts of the intervention compared to the baseline, ensuring fair governance and empowering users to exercise their rights.

240All three options include a complaint and redress mechanism, staged in two steps: an obligation on the service provider to offer and process such complaints, and the availability of out-of-court dispute settlement mechanisms, expected to absorb escalated issues and to resolve them in a faster and less resource-intensive manner than court proceedings. Users would always be able to appeal to the court system, in accordance with the applicable rules of national law.

241The enhanced transparency provisions, making users aware of the policy applied to hosted content, goods or services, as well as the specific information given to the user once corrective action is taken against them, are sine qua non conditions for an effective remedy. All three options would ensure such a standard and would also sanction systematic failure by service providers to provide redress mechanisms. In addition, where the third option affords enhanced supervisory powers to authorities regarding very large online platforms, where users’ rights can be most severely affected, this additionally supports users’ right to a remedy.

242Finally, as regards restrictions potentially imposed by authorities, the established judicial remedies would always be available to service providers, as well as to platforms’ users whose content, goods or services are subject to such requests. The enhanced cooperation mechanism across authorities, set up in option 2 and, to a larger extent, in option 3, would further strengthen the checks and balances and the availability of redress in this regard.

Non-discrimination (Art 21 of the Charter), equality between women and men (Art 23) and the right to human dignity (Art 1)

243All three options would have a positive impact in mitigating risks for persons in vulnerable situations and vulnerable groups to be exposed to discriminatory behaviours and would protect the right to human dignity of all users of online services. This concerns first a disproportionately unsafe online environment. In this regard, each option would have different strengths of impact, as assessed in section 6.2.1 above.

244Second, such groups or individuals could be disproportionately affected by restrictions and removal measures resulting from biases potentially embedded in the notifications submitted by users and third parties, as well as replicated in automated content moderation tools used by platforms. In addition, to the extent that the second and the third option also include a clarification of the liability exemptions with regard to voluntary measures taken by service providers to tackle illegal activities, it is possible that more service providers would voluntarily engage in content moderation. Currently, in particular for large online platforms, such voluntary measures also include the use of content detection technologies, algorithms predicting abusive user accounts and other filtering technologies. Each of these technologies, and the way they are designed, trained, deployed and supervised in specific cases, presents different risks for non-discrimination and gender equality, but also for the protection of personal data and the privacy of communications.

245The three options would address the risks by affording safeguards aimed at improving the possibility for contesting such restrictions (as per 6.3.2). In addition, the third option offers enhanced inspection powers to national authorities for the content moderation processes of very large platforms, where the impact of discriminatory practices can be most acute.

246The second and the third option would cater for broader discrimination concerns emerging in the way platforms amplify information, and access to goods and services: they would include transparency provisions for recommender systems and placement of online ads, empowering users to understand and have agency over how they are affected by these systems.

247The enhanced transparency and oversight measures included in the third option for content moderation, recommender systems and online advertising on very large online platforms would be particularly impactful in offering the means for detecting discriminatory practices and allowing these issues to surface on the policy and public agenda.

Private life and privacy of communications (Art 7 of the Charter) and personal data protection (Article 8 of the Charter)

248Nothing in the intervention should prejudice the high standard of personal data protection and protection of privacy of communications and private life set in EU legislation. All measures following from either one of the three options should be fully compliant and aligned.

249Furthermore, all measures aim to enhance users’ online safety and can be expected to contribute to a better response to illegal content and activities, including content consisting of the non-consensual sharing of users’ private data, including images.

250For all three options, obligations to set up a ‘Know Your Business Customer’ policy and collect identification information from traders, as well as obligations for the identification of advertisers, would likely imply the processing and disclosure of personal data. However, these measures are limited to traders and do not concern other users of online platforms. With regard to the data requested from traders under the ‘know your business customer’ obligations, the requirements are limited to the minimum necessary, as established in other similar regulatory initiatives 131 and best practices in industry 132 .

251Similarly, where option 3 requires further reporting to national authorities, this can entail disclosure of personal data of users of platforms (e.g. accommodation service providers, sellers of goods). This is a necessary measure for protecting the public interest and the protection of consumers online, and remains proportionate by limiting the requirement to data already collected by the platform. It does not in any way cover requirements for citizens using online services to identify themselves. If personal data is part of the request, the requirement would offer a legal basis for data processing by the service provider in line with Article 6(1)(c) of the GDPR and would require Member States to specify the conditions for data processing by the requesting authorities in the national laws laying down the competence for such authorities to issue requests.

252Where option 3 requires large online platforms to facilitate data access for audits and investigations by researchers, such measures should be designed based on an appropriate assessment of risks, in line with GDPR requirements, potentially with the involvement of Data Protection Authorities, and should be organised with the least invasive approach and proportionate costs, exploring options for secure or protected access.

253Where option 3 requires service providers to notify to authorities suspicions of serious criminal offences, this is proportionate and justified by the seriousness of the offence and the public interest entailed. At the same time, the provision does not, in itself, provide a legal basis to process personal data of users of the platform with a view to the possible identification of criminal offences.

254Transparency and disclosure requirements included in option 2, as well as requirements regarding the maintenance of ad archives in option 3, are not intended to lead to any additional processing or disclosure of personal data of the users who have seen the ads; they might include personal data of the advertiser, acting as a trader, and personal data already publicly disclosed in the content of the ad.

255As regards risks posed by automated content moderation and other technologies voluntarily used by platforms for tackling illegal behaviours of their users – see paragraph 244 above – a case-by-case assessment is necessary and, when service providers develop and deploy such tools, they must do so in observance of the rights and obligations established in the GDPR and the ePrivacy Directive 133 . None of the three options would affect this requirement in any way, and none mandates the use of any automated tool for the detection of content. Option 3 would instead potentially create additional opportunities for inspecting compliance in this regard. It also includes considerations for the right to private life in the risk assessment framework very large platforms are subject to.

256All options include obligations for redress and complaint mechanisms; they imply that the content removed should be preserved by the service provider for the reasonable duration of such potential proceedings, allowing them, where necessary, to reinstate the content. Such measures have the sole purpose of enabling a ‘due process’ approach following a removal decision and are proportionate to the rights and interests of the content provider, data subjects whose personal data might be retained, and the service provider, which incurs very limited costs for the storage of data for a limited period of time.

Rights of the child (Art 24 of the Charter)

257All three options would have a positive influence in protecting the safety of children online. Consistent with the analysis in section 6.2.1, the positive impact is strengthened with each option. Option 3 explicitly includes rights of the child as a primary consideration when very large platforms assess the systemic risks posed by the design of their service and take appropriate measures to uphold the best interest of children.

Right to property (Art 17 of the Charter) and freedom to conduct a business (Art 16)

258All three options would have a similarly positive impact on the right to property by complementing existing rules addressing the violation of intellectual property.

259None of the measures in any of the options should jeopardise the protection of trade secrets or proprietary products of online platforms. Where, in option 3, further requests for disclosure could be made by authorities to very large online platforms, these would entail a secrecy obligation on the public authority with regard to trade secrets.

260All three options will imply compliance costs and adjustments of business processes to regulatory standards for the platforms. This limitation of the freedom to conduct a business is proportionate and will be mitigated, and most likely fully compensated, by the fact that the measures will lead to significant cost savings compared to the baseline, in particular in light of the evolving legal fragmentation. Costs are also tailored to be proportionate to the capacity of the given service provider.

6.4.Environmental impacts

261Environmental impacts are expected to be relatively marginal for all options compared to the baseline. This is not to say that the environmental impact of digital services will not be important to monitor. Substantial factors will depend on technological evolution, business choices in manufacturing and development chains, consumer behaviour and nudging, etc.

262Digital services are not only energy consumers and generators of digital waste themselves, but also underpin services and the distribution of goods which have an important environmental footprint of their own – including transport, travel and accommodation. However, the three options are primarily expected to shift the focus towards responsible digital services, with a marginal impact on the overall demand for digital services. This makes it difficult to estimate with sufficient confidence a causal link between the adoption of any of the three policy options and the environmental impacts of digital services.

263In addition, many illegal activities are also associated with intensive pollution – see, in particular, the case of counterfeit products 134 , the manufacturing of dangerous products, or the sale of products that do not comply with EU environmental or energy-saving rules (e.g. eco-design, energy labelling, etc.). A reduction in the ability to place such products on the European market might also reduce their production. The due diligence obligations would equally concern non-compliance with the extended responsibility requirements in online sales 135 .

7.How do the options compare?

7.1.Criteria for comparison

264The following criteria are used in assessing how the three options would potentially perform, compared to the baseline:

-Effectiveness in achieving the specific objectives:

I.Ensure the best conditions for innovative cross-border digital services to develop

II.Maintain a safe online environment, with responsible and accountable behaviour from online intermediaries

III.Empower users and protect fundamental rights online, and freedom of expression in particular

IV.Establish the appropriate supervision of digital services and cooperation between authorities

-Efficiency: cost-benefit ratio of each policy option in achieving the specific objectives

-Coherence with other policy objectives and initiatives:

a.Within the Digital Services Act Package, coherence with the second initiative

b.Other, sector-specific instruments, such as the AVMSD, the DSM Copyright Directive, the proposed Regulation on terrorist content

c.Coherence with Internet principles and the technical infrastructure of the internet 136

-Proportionality: whether the options go beyond what is necessary for an intervention at EU level to achieve the objectives

7.2.Summary of the comparison

265A summary of the comparison of options against the four criteria is included below. The table visualising the comparison should only be read vertically, with ‘+’ indicating a better performance than the baseline and ‘++++’ the best performance among the options; the ‘>’ symbol indicates higher costs than the baseline, and ‘>>>>’ the highest costs among the options.

Table 6 Comparison of options

                          Effectiveness      Efficiency            Coherence
                                             Costs     Benefits    a      b      c

Baseline                  ~                  ~         ~           ~      ~      ~
Option 1                  +                  >         +           +      +      +
Option 2                  ++                 >>        ++          ++     +      +
Option 3: Sub-option 3.A  +++                >>>       +++         +++    +      +
Option 3: Sub-option 3.B  +++                >>>>      ++++        +++    +      +

266Scores on effectiveness build on the extent to which the impacts screened in section 6 contribute to the achievement of the specific objectives. Scores on costs cumulate costs for both service providers and public authorities.

Impacts assessed                                          Baseline  Option 1  Option 2  Option 3

Economic impacts
  Functioning of the Internal Market and competition      ~         +         ++        +++
  Costs and administrative burdens on digital services    ~         >         >>        >> 137 / >>> 138
  Competitiveness, innovation, and investment             ~         +         ++        +++
  Costs for public authorities                            ~         >         >>        >>>
  Trade, third countries and international relations      ~         +         +         +

Social impacts
  Online safety                                           ~         +         ++        +++
  Enforcement and supervision by authorities              ~         +         ++        +++

Fundamental rights (as laid down in the EU Charter)
  Freedom of expression (Art 11)                          ~         +         ++        +++
  Non-discrimination, equality, dignity (Art 21, 23(1))   ~         +         ++        +++
  Private life and privacy of communications (Art 7)      ~         +         +         ++
  Personal data protection (Article 8)                    ~         ~         ~         ~
  Rights of the child (Art 24)                            ~         +         ++        +++
  Right to property (Art 17)                              ~         +         +         +
  Freedom to conduct a business (Art 16)                  ~         +         +         +
  User redress                                            ~         +         ++        ++

Overall                                                   ~         +         ++        +++

Effectiveness

7.2.1.1.First specific objective: ensure the best conditions for innovative cross-border digital services to develop

267The comparison of options against the first specific objective rests primarily on the economic impacts of the options on service providers.

268The first option improves the conditions for innovative online platforms to emerge in the Union by harmonising, across the single market, the due diligence obligations imposed on platform services for tackling illegal activities of their users. It also has positive impacts on hosting service providers and online intermediaries. The first option entails costs for service providers, in particular online platforms, but these remain proportionate to the capacities of the companies. The most significant costs are variable costs from running the notice and action system and, consequently, they are proportionate to the risks service providers bring.

269It answers the most acute and current concerns Member States are raising at this point in time and improves innovation opportunities in the short term. It would also establish a level playing field between European companies and services offered from outside the Union, which are otherwise not subject to the same rules and costs when targeting European consumers. The positive impacts of the option with regard to addressing legal fragmentation in the single market might not endure over the medium to long term, since it harmonises a core yet limited set of measures and relies on case law and self-regulatory measures for addressing emerging concerns.

270The second option would significantly improve the effectiveness of the intervention by providing more legal certainty to all online intermediaries and removing disincentives for service providers to protect their services from illegal activities. This can bring relief in particular to innovative start-ups and small service providers. Compared to the first option, the second option would further improve the mechanics of cooperation and trust between Member States’ authorities through a reinforced and more agile cooperation mechanism.

271The third option would similarly significantly improve the conditions for the provision of services in the single market. It would establish a European governance system for the supervision and enforcement of rules fit for solving emerging issues and, importantly, able to appropriately detect and anticipate them. This should maintain a long-lasting trust and cooperation environment between Member States and offer technical assistance to ensure the best supervision of services across the Union. It would also calibrate these efforts and target them towards those services producing the biggest impacts. Overall costs for the majority of companies would remain comparable to those in option 2. This would also ensure proportionality of measures, to create the necessary space for start-ups and innovative companies to develop.

7.2.1.2.Second specific objective: maintain a safe online environment, with responsible and accountable behaviour from online intermediaries

272The first option would bring a significant improvement to the baseline by establishing the core measures for tackling illegal activities online, ensuring a consistent level of protection across all services and covering all types of illegal behaviours.

273The second option would be expected to produce strong effects in this regard by stimulating targeted and appropriate measures from service providers. Importantly, it would offer an even stronger and more responsive cooperation across Member States, supporting the protection of all European citizens whether online intermediaries or other digital services are concerned. It would also extend the scope of concerns tackled by empowering users to better interact with the platforms’ environment, e.g. with regard to the ads they see on online platforms.

274The third option, in addition to the features of option 2, would include stronger obligations and significantly more robust oversight of very large online platforms. This targets a stronger intervention towards the service providers where the highest societal risks emerge, while ensuring that smaller online platforms can effectively address illegal content emerging on their services and can also take part, on a voluntary basis, in codes of conduct. The flexible co-regulatory environment, able to address emerging issues in an adapted and speedy manner, would ensure that urgent, palpable results can be achieved, including in crisis situations. This would also be coupled with an effective and well-calibrated European governance for enforcement and supervision. The overall effectiveness of sub-options 3.A and 3.B is comparable in this regard; it is expected that, in the longer term, option 3.B could deliver a more robust framework for intervention, whereas option 3.A provides for an immediately functional and effective enforcement structure.

7.2.1.3.Third specific objective: empower users and protect fundamental rights online, and freedom of expression in particular

275The first option would significantly improve the current situation by affording users the necessary due process rights and provisions for defending their rights and interests online.

276The second option would in addition give users more agency and information online (e.g. with regard to recommended content or ads online) and an overall better environment for seeking information, making choices, holding opinions and participating in democratic processes.

277The third option would importantly create a risk management framework that includes considerations for fundamental rights, including freedom of expression, where very large platforms are concerned. This ensures that, in particular in those ‘public spaces’ for exchanges, more robust safeguards are in place. This approach is also accompanied by adapted supervisory powers and capabilities for authorities in the context of a solid European governance. This would ensure both that issues are detected and that a co-regulatory framework exists for solving them as they emerge.

7.2.1.4.Fourth specific objective: establish the appropriate supervision of digital services and cooperation between authorities

278The first option would enhance the baseline by establishing a common regulatory benchmark against which Member States can supervise online platforms and further streamlining the cooperation process for supervising the due diligence obligations on online intermediaries.

279The second option would further enhance the supervision of all digital services and would offer a robust platform for cooperation across Member States as well as within each Member State.

280The third option would offer an effective mechanism for supervision and cooperation, fit to anticipate future problems and address them effectively. This would rest on a European governance, ensuring that information and capability asymmetries between authorities and platforms do not impede effective supervision. It would afford the appropriate oversight powers to authorities and facilitate access to information for researchers, ensuring that issues can be detected as they emerge. Under sub-option 3.A, an agile supervisory system would be set up immediately within the Commission, coupled with a tight structure for exchanges between Member States’ new digital services coordinators. Under sub-option 3.B, the supervisory structure with an EU body would give statutory powers to the EU Board.

Efficiency

281The costs for each of the three options are proportionate to their effectiveness in achieving the four specific objectives.

282The first option comes with lower costs for service providers and an expectation of higher costs for authorities to ensure better supervision than in the current situation, while creating significant efficiency gains in cross-border cooperation.

283The second option entails similar costs for service providers and is expected to lead to comparable costs for authorities, including efficiency gains through the cooperation system. At the same time, the option is overall more effective than the first option at comparable costs.

284The third option is similarly costly for all digital services, but requires higher compliance costs from a relatively small number of very large platforms. As regards authorities, it includes significant efficiency gains thanks to the cooperation mechanism, as well as higher costs for the effective supervision of services, including at EU level. In sub-option 3.A, a series of costs are streamlined by absorbing most of the investigative, advisory and enforcement EU-level powers into the Commission’s structure. Under sub-option 3.B, the overall costs are higher, since the new agency would need to ensure its own administrative operations and would not directly benefit from the wider pool of expertise of the Commission.

Coherence

7.2.1.5.With the Digital Markets Act

285This initiative is coupled with an intervention to ensure a competitive digital economy and in particular fair and contestable markets. The Digital Markets Act intervention focuses on large online platforms, which have become gatekeepers and whose unfair conduct in the market may undermine the competitive environment and the contestability of the markets, especially for innovative start-ups and scale-ups.

286Both initiatives contribute to shared objectives of reinforcing the single market for digital services, improving the innovation opportunities and empowering users, and improving the supervision over digital services. They complement each other in covering issues which are different in nature. The two initiatives should also reinforce each other in what concerns those very large online platforms falling in scope of both sets of measures, in particular in what concerns empowering users, but also in correcting business incentives for acting responsibly in the single market.

287The definition of the very large platforms falling in scope of the asymmetric obligations in option 3 is different in nature and scope from that of the ‘gatekeeper’ platforms considered for the Digital Markets Act. For the latter, the criteria will relate to the platforms’ economic power in the marketplace, while in the case of option 3 analysed here, large platforms are understood as those which serve as de facto public spaces in terms of numbers of users. Consequently, not all very large platforms are expected to also be gatekeeper platforms, but many will likely also fall in that category under the Digital Markets Act.

288All three options are fully coherent with the second initiative. The second and, to a larger extent, the third option are further complementary with the second intervention, in particular by enhancing transparency and user agency with regard to core features of online platforms such as recommender systems and online ads.

7.2.1.6.Other, sector-specific instruments

289The objectives of the instrument are fully aligned with the sector-specific interventions adopted and/or proposed by the Commission, such as the AVMSD, the Copyright Directive, and the proposed Regulation on terrorist content. Each of the three options would complement these initiatives, but would not seek to modify them.

290For example, measures in all the proposed options would complement the obligation of a notification system set for video-sharing platforms in the AVMSD with more detailed requirements with regard to transparency obligations and user complaints, and would extend their application horizontally to all types of online platforms and all types of illegal content.

291The Copyright Directive would remain a lex specialis with regard to the liability exemptions for certain types of platforms. At the same time, certain new obligations in the options, such as a harmonised notice and action procedure as well as various transparency obligations, will further enhance enforcement of the copyright acquis and help the fight against online piracy.

292The three options are also fully compatible and coherent with the Platform to Business Regulation. In particular, the redress and complaint mechanisms for business users restricted by a platform are aligned with the provisions in the three options, and the Regulation allows for exceptions from the conditions it sets for restrictions on the business user of an online intermediation service in connection with illegal activities (see recital 23).

293All the options would also provide for an effective cooperation and supervision system, with different degrees of impact (see 6.1.1 and 6.2.2), which could further support sector-specific cooperation.

7.2.1.7.Coherence with Internet principles and the technical infrastructure of the internet

294All three options are fully aligned with and reinforce the principles of the open internet and the technical infrastructure of the network. This supports not only the competitiveness of these sectors but also, importantly, their resilience and their role in maintaining an open internet and protecting their users’ rights.

Proportionality

295The three options follow the same principle of proportionality and necessity of an intervention at EU level: a fragmented approach across Member States is unable to ensure an appropriate level of protection to citizens across the Union, and the supervision of services would remain inconsistent. However, the effectiveness and proportionality of the third option in reaching the objectives is superior, not least in light of a future-proof intervention, allowing the supervisory system to respond to emerging challenges linked to the supervision of digital services and preventing future re-fragmentation of rules. Where the third option imposes sanctions, these are proportionate to the harms posed by the very large platforms concerned.

8.Preferred option

296Against this assessment, the preferred option recommended for political endorsement is the third option. This option would best meet the objectives of the intervention and would establish the proportionate framework fit for adapting to emerging challenges in the dynamic digital world. It would set an ambitious governance for digital services in Europe and would reinforce the single market, fostering new opportunities for innovative services.

297It would also appropriately manage the systemic risks which emerge on very large platforms, while establishing a level playing field for smaller players – both by setting a core set of obligations ensuring that online safety and fundamental rights are consistently protected online, and by making sure that all services targeting the European single market comply with the same standards of protection and empowerment of citizens.

298The preferred option, while preserving the geographical scope of the E-Commerce Directive for its core provisions, would in addition set a gradual and proportionate set of due diligence obligations for different digital services, also applicable to services established outside the Union but offering services in the single market, as follows:

The obligations apply cumulatively across the four categories of services (intermediaries, hosting services, online platforms, very large platforms):

ALL INTERMEDIARIES:

-Transparency reporting

-Requirements on terms of service and due account of fundamental rights

-Cooperation with national authorities following orders

-Points of contact and, where necessary, legal representative

HOSTING SERVICES (in addition):

-Notice and action and information obligations

ONLINE PLATFORMS (in addition):

-Complaint and redress mechanism and out of court dispute settlement

-Trusted flaggers

-Measures against abusive notices and counter-notices

-Vetting credentials of third party suppliers (“KYBC”)

-User-facing transparency of online advertising

VERY LARGE PLATFORMS (in addition):

-Risk management obligations

-External risk auditing and public accountability

-Transparency of recommender systems and user choice for access to information

-Data sharing with authorities and researchers

-Codes of conduct

-Crisis response cooperation

299Sub-option 3.A is recommended as the preferred option for the EU-level governance by virtue of its speedy feasibility and urgent application, with effectiveness comparable to that of 3.B in the short to medium term.

Figure 6 Intervention logic for the preferred option

9.REFIT (simplification and improved efficiency)

Table 7 REFIT cost savings for the preferred option

REFIT Cost Savings – Preferred Option(s)

Description: Coordination and cross-border cooperation costs for national authorities will be significantly streamlined through the Clearinghouse system and the EU body

Amount: Quantitative estimates cannot be clearly established, as current costs vary from one MS to another, and gains and expenditure under the preferred option will depend on the MS’ supervisory role for digital services and the volume of requests to be processed

Comments: Concerns mostly national authorities in Member States

Description: Core elements of the harmonising measures: due diligence obligations for online intermediaries

Amount: Between EUR 400 000 and EUR 15 million per year for a medium-sized company

Comments: Concerns hosting service providers, in particular online platform companies established in the Union

10.How will actual impacts be monitored and evaluated?

300The establishment of a robust system for data collection and monitoring is in itself one of the core impacts pursued by the preferred option. This includes both the enhanced ability to monitor and account for the functioning of the cooperation across Member States’ authorities, and the supervision of digital services.

301Several monitoring actions should be carried out by the Commission, in evaluating continuously the effectiveness and efficiency of the measures.

Table 8 Summary of monitoring actions and indicators

Specific objective 1: Best conditions for innovative, cross-border digital services to develop

Operational objectives:

-Harmonised application of due diligence obligations for online platforms

-Legal certainty and consistency in enforcement with regard to the due diligence obligations, and legal clarity in the liability regime for online intermediaries

-Mitigate and prevent further burdensome legal fragmentation for digital services

Key performance indicators:

-Numbers and diversity of infringements and services concerned

-Number of derogation requests from MS (target: none – to be monitored)

-Number of laws adopted derogating (target: none)

-EU start-ups and SMEs emerging in the single market

-Economic indicators for cross-border trade (measured against a projected increase of 1% to 1.8%)

Monitoring and indicators:

-Monitored through the reported data from the Clearinghouse system, with qualitative indications based on requests for assistance from Member States, response rates and resolutions

-Reports from Member States through the cooperation under the EU Board

-Monitoring of the evolution of CJEU case law, national case law and complaints resolved in the out of court dispute resolution mandated by the act

-Monitoring of co-regulatory frameworks launched under the act, their reported outcomes and the extent to which they address the underlying concerns and cover all relevant digital services and social partners

Specific objective 2: Safe online environment, with responsible and accountable behaviour from digital services

Operational objectives:

-Effective application of the due diligence obligations by service providers

-Effective actions by law enforcement

Key performance indicators:

-No unattainable ‘zero tolerance’ target, in light of strong stakeholder views, in particular from civil society, that stringent content removal KPIs incentivise over-removal of content

-Specific KPIs set for each co-regulatory framework

-Number of negative audits

Monitoring and indicators:

-Data reported by Member States supervising the systemic compliance of service providers, as collected through the Clearinghouse

-Number, complexity and effectiveness of cases pursued at EU level

Specific objective 3: Empower users and protect fundamental rights online

Operational objectives:

-Compliance from service providers with due diligence and transparency obligations

-Investigations, audits and data requests from authorities, researchers and independent auditors

Key performance indicators:

-Number of complaints for content removal escalated to out of court disputes and authorities and leading to reinstatements

-Number of negative audits

Monitoring and indicators:

-Data reported by Member States through the Clearinghouse

-Monitoring of transparency reports, ad archives and compliance with specific requests from authorities and independent audits of service providers

Specific objective 4: Appropriate supervision of digital services and cooperation between authorities

Operational objectives:

-Effective supervision and enforcement by the Member State of establishment

-Responsive and effective cross-border cooperation

Key performance indicators:

-Response time from Digital Services Coordinators to requests from other Member States (target: no more than 10% over the 1-month deadline)

Monitoring and indicators:

-Monitored through the reported data from the Clearinghouse system, with qualitative indications based on requests for assistance from Member States, response rates and resolutions

-Reports from Member States through the cooperation of the EU Board

302The legal act would set the overall legal framework for digital services. It should be designed to remain valid in the longer term, allowing for sufficient flexibility to address emerging issues. Consequently, it does not necessitate a short-term review clause in itself.

303Instead, the effectiveness of the instrument is likely to depend strictly on the forcefulness of its enforcement. For digital services to behave responsibly and for the framework of the single market to be a nourishing environment for innovative services, establishing and maintaining a high level of trust is paramount. This concerns the Member State level supervision of digital services as much as the cross-border cooperation between authorities and, where necessary, infringement procedures launched by the Commission. Yearly activity reports of the EU Board should also be compiled and made publicly available, with sufficient information on its operation and on the cooperation and outcome indicators presented in the table above. An evaluation of the instrument should be conducted within five years of its entry into force.

(1)

  https://ec.europa.eu/commission/sites/beta-political/files/political-guidelines-next-commission_en.pdf  

(2)

  https://ec.europa.eu/info/sites/info/files/communication-shaping-europes-digital-future-feb2020_en_4.pdf  

(3)

  https://ec.europa.eu/info/sites/info/files/communication-shaping-europes-digital-future-feb2020_en_4.pdf  

(4)

Annex 13 presents a brief summary of the reports and maps how the impact assessment addresses the points raised in them.

(5)

Council Conclusions on Shaping Europe’s Digital Future, 8711/20 of 9 June 2020, https://www.consilium.europa.eu/media/44389/st08711-en20.pdf

(6)

  https://www.consilium.europa.eu/media/45910/021020-euco-final-conclusions.pdf  

(7)

Legislation addressing specific types of illegal goods and illegal content includes: the market surveillance regulation, the proposed regulation on preventing the dissemination of terrorist content online, the directive on combatting the sexual abuse and sexual exploitation of children and child pornography, the regulation on the marketing and use of explosives precursors, etc. The Directive on better enforcement and modernisation of EU consumer protection rules added transparency requirements for online marketplaces vis-à-vis consumers, which become applicable in May 2022.

(8)

e.g. the EU Internet Forum against terrorist propaganda online, the Code of Conduct on countering illegal hate speech online, the Alliance to better protect minors online under the European Strategy for a better internet for children and the WePROTECT global alliance to end child sexual exploitation online, the Joint Action of the consumer protection cooperation network authorities, the Memorandum of Understanding against counterfeit goods, the Online Advertising and IPR Memorandum of Understanding, the Safety Pledge to improve the safety of products sold online, etc. A package of measures was also adopted to secure free and fair elections - https://ec.europa.eu/commission/presscorner/detail/en/IP_18_5681

(9)

In the framework of the Consumer Protection Cooperation Regulation (CPC), the consumer protection authorities have also taken several coordinated actions to ensure that various platforms (e.g. travel booking operators, social media, online gaming platforms, and webshops) conform with consumer protection law in the EU https://ec.europa.eu/info/live-work-travel-eu/consumers/enforcement-consumer-protection/coordinated-actions_en.

(10)

See Annex 5 for details about the evaluation of the E-Commerce Directive.

(11)

  Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market

(12)

The term “Digital Service” as used in this document is synonymous with the term ‘information society services’, as defined in the E-Commerce Directive and the Transparency Directive 2015/1535.

(13)

From optimisations through network technologies to the development of artificial intelligence applications, blockchain technology and distributed data processing.

(14)

40% in 2019 in the EU27, according to ESTAT: https://appsso.eurostat.ec.europa.eu/nui/submitViewTableAction.do . See also (Eurobarometer - TNS, 2016) for more granular data based on a 2016 survey.

(15)

(Duch-Brown & Martens, 2015)

(16)

(Iacob & Simonelli, 2020)

(17)

(Eurobarometer - TNS, 2018)

(18)

In the first half of 2019, online advertising spending in Europe amounted to 28.9 billion euros, with a growth rate of around 12.3% over the same period ( https://www.statista.com/topics/3983/digital-advertising-in-europe/ ).

(19)

Dealroom database, see infra, p 24 

(20)

  https://www.oecd.org/coronavirus/policy-responses/connecting-businesses-and-consumers-during-covid-19-trade-in-parcels-d18de131/#figure-d1e204  

(21)

Europol, Pandemic profiteering: how criminals exploit the COVID-19 crisis, March 2020, see: https://www.europol.europa.eu/publications-documents/pandemic-profiteering-how-criminals-exploit-covid-19-crisis .

(22)

European Commission, Illegal and Harmful Content Communication, COM(96) 487, pp. 12–13.

(23)

For reasons of proportionality, these aspects are addressed only succinctly in the impact assessment report, which focuses instead on the most pressing issues related to the systemic concerns around digital services.

(24)

(OECD/EUIPO, 2019)

(25)

(European Commission, 2019) apud (European Commission, 2020)

(26)

https://ec.europa.eu/eurostat/statistics-explained/index.php/E-commerce_statistics_for_individuals#General_overview  

(27)

https://ec.europa.eu/consumers/consumers_safety/safety_products/rapex/alerts/repository/content/pages/rapex/index_en.htm  

(28)

https://www.beuc.eu/publications/two-thirds-250-products-bought-online-marketplaces-fail-safety-tests-consumer-groups/html .

(29)

https://www.oecd.org/coronavirus/policy-responses/protecting-online-consumers-during-the-covid-19-crisis-2ce7353c/#section-d1e96

(30)

https://ec.europa.eu/info/live-work-travel-eu/consumers/enforcement-consumer-protection/scams-related-covid-19_en  

(31)

https://web.archive.org/web/20190928174029/https://storage.googleapis.com/pub-tools-public-publication-data/pdf/b6555a1018a750f39028005bfdb9f35eaee4b947.pdf   https://www.missingkids.org/content/dam/missingkids/gethelp/2019-reports-by-esp.pdf  

(32)

https://www.inhope.org/media/pages/the-facts/download-our-whitepapers/803148eb1e-1600720887/2020.09.18_ih_annualreport_digital.pdf  

(33)

Illegal hate speech, as defined by the Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law and national laws transposing it, means all conduct publicly inciting to violence or hatred directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin.

(34)

  https://transparency.facebook.com/community-standards-enforcement#hate-speech  

(35)

By contrast with the EU definition, Facebook defines hate speech as ‘violent or dehumanizing speech, statements of inferiority, calls for exclusion or segregation based on protected characteristics, or slurs. These characteristics include race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disability or disease.’ https://transparency.facebook.com/community-standards-enforcement#hate-speech  

(36)

Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Network Enforcement Act)

(37)

Wagner, Ben, Krisztina Rozgonyi, Marie-Therese Sekwenz, Jatinder Singh, and Jennifer Cobbe. 2020. “Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act.”

(38)

(Eurobarometer - TNS, 2018)

(39)

Report from the IMI system, See also Annex 8

(40)

  https://rusi.org/sites/default/files/20190628_grntt_paper_2_0.pdf  

(41)

See in particular recital 58 of the E-Commerce Directive.

(42)

  https://www.inhope.org/EN  

(43)

  https://ec.europa.eu/commission/sites/beta-political/files/soteu2018-preventing-terrorist-content-online-swd-408_en.pdf (Concerning Proposal COM/2018/640 final)

(44)

A synthesis of relevant behavioural economy and psychology literature presented in (Lewandowsky & Smillie, 2020 (forthcoming))

(45)

See, for example, (Alastair Reed, 2019) for a study on the recommender systems of three online platforms, pointing to some evidence that their systems could prioritise right-wing extremism, including content that could qualify as illegal terrorist content.

(46)

  https://ctc.usma.edu/christchurch-attacks-livestream-terror-viral-video-age/  

(47)

E.g. https://www.vice.com/en_us/article/d3w9ja/how-youtubes-algorithm-prioritizes-conspiracy-theories  

(48)

See, for example, the potential trade-offs and welfare losses of using recommender systems and targeted advertising as alternative marketing strategies in (Iusi Li, 2016) or https://webtransparency.cs.princeton.edu/dark-patterns/  

(49)

Synthesis of the state of the art research in (Lewandowsky & Smillie, 2020 (forthcoming))

(50)

For example (Jausch, 2020) or (Fundacja Panoptykon, 2020)

(51)

See, for instance, (Coppock, 2020) on the limited effects of political advertising on voting behaviour, and (Jausch, 2020).

(52)

See, for example, (Ali M., 2019) (Datta A., 2018)

(53)

The latest assessment of the Code of Practice on Disinformation details the more complex issues and the voluntary actions envisaged: https://ec.europa.eu/digital-single-market/en/news/assessment-code-practice-disinformation-achievements-and-areas-further-improvement . This impact assessment does not specifically or exhaustively address the issue of disinformation, but analyses a series of structural characteristics of online platforms which fuel such risks, along with other societal harms.

(54)

See, for example, (Leerssen, 2020), or (Cobbe & Singh, 2019)

(55)

https://ec.europa.eu/info/policies/consumers/consumer-protection/evidence-based-consumer-policy/market-monitoring_en  

(56)

E.g. voluntary partnerships with academics such as https://socialscience.one/ or reporting under the Code https://ec.europa.eu/digital-single-market/en/news/annual-self-assessment-reports-signatories-code-practice-disinformation-2019 . Other cases concern platforms’ own initiatives to crowdsource the optimisation of their recommender systems, e.g. https://netflixtechblog.com/netflix-recommendations-beyond-the-5-stars-part-2-d9b96aa399f5  

(57)

A short selection of examples includes: https://algotransparency.org and https://foundation.mozilla.org/en/campaigns/youtube-regrets/ for YouTube recommender systems, http://insideairbnb.com/about.html for data on AirBnB listings

(58)

On the need for continuous, structural monitoring, see e.g. (LNE, forthcoming)

(59)

(Urban, 2017)

(60)

https://www.theguardian.com/technology/2019/jun/06/youtube-blocks-history-teachers-uploading-archive-videos-of-hitler  

(61)

  https://www.wired.co.uk/article/chemical-weapons-in-syria-youtube-algorithm-delete-video

(62)

  Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services 

(63)

  Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services

(64)

Some Member States have put in place voluntary mechanisms in partnership with individual platforms (e.g. Polish memorandum with Facebook) whereas others have included complaint mechanisms in their ‘notice and action’ national laws (see Annex 6)

(65)

(Eurobarometer - TNS, 2018)

(66)

(Penney, 2019)

(67)

(Matias, 2020)

(68)

See Recital 22 of the E-Commerce Directive and further explanations in (Crabit, 2000)

(69)

Simulation of costs based on expenditure data from publicly available reports from companies complying with the requirements of the NetzDG. See Annex 4.

(70)

Conservative estimates based on data available in the Dealroom database for ‘hosting services’ having received some venture funding or other external investment (September 2020)

(71)

See annex 4 for an explanation of the model

(72)

(Dealroom, 2020)

(73)

(Urban, 2017)

(74)

(Madiega, 2020)

(75)

(Eurobarometer - TNS, 2018)

(76)

(European Commission, 2020)

(77)

  Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online

(78)

Most prominently, https://www.oversightboard.com/  

(79)

  https://www.ohchr.org/Documents/Issues/Opinion/Legislation/OL_OTH_01_05_19.pdf  

(80)

https://ec.europa.eu/eurostat/web/products-eurostat-news/-/EDN-20190629-1

(81)

https://www.statista.com/statistics/1090015/use-of-social-networks-among-companies-europe/

(82)

Amazon, for example, is reported to host over 1.1 million business users in Europe cf. https://ecommercenews.eu/amazon-has-1-1-million-active-sellers-in-europe/  

(83)

See Annex 4

(84)

Statista, based on IHS and IAB Europe data

(85)

https://theconversation.com/facebook-algorithm-changes-suppressed-journalism-and-meddled-with-democracy-119446  

(86)

Avaaz (2019) ‘Why is YouTube Broadcasting Climate Misinformation to Millions?’ Available at: https://secure.avaaz.org/campaign/en/youtube_climate_misinformation/ ; Avaaz (2020) ‘How Facebook Can Flatten the Curve of the Coronavirus Infodemic’. Available at: https://avaazimages.avaaz.org/facebook_coronavirus_misinformation.pdf

(87)

A practice of manipulating messages or online campaigns, making it appear like they are stemming from ‘grassroots’ initiatives and supported by genuine participants, whereas they are sponsored and promoted centrally by organisations hiding their affiliation and financial link with the initiatives

(88)

  https://www.ivir.nl/publicaties/download/Report_Disinformation_Dec2019-1.pdf  

(89)

  LOI n° 2020-766 du 24 juin 2020 visant à lutter contre les contenus haineux sur internet

(90)

  Entwurf eines Bundesgesetzes über Maßnahmen zum Schutz der Nutzer auf Kommunikationsplattformen (Kommunikationsplattformen-Gesetz - KoPl-G)

(91)

  Entwurf eines Zweiten Gesetzes zur Änderung des Jugendschutzgesetzes ; unofficial consolidated text: https://gameslaw.org/wp-content/uploads/Youth-Protection-Act-Draft-10.-Feb-2020.pdf  

(92)

  DECRETO-LEGGE 24 aprile 2017, n. 50

(93)

  Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC

(94)

  Regulation (EU) No 98/2013 of the European Parliament and of the Council of 15 January 2013 on the marketing and use of explosives precursors

(95)

  Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011

(96)

See annex 12 for a more detailed description of EU law framing online advertising

(97)

https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en  

(98)

  https://ec.europa.eu/info/business-economy-euro/product-safety-and-requirements/product-safety/product-safety-rules_en  

(99)

https://ec.europa.eu/growth/industry/policy/intellectual-property/enforcement/memorandum-understanding-sale-counterfeit-goods-internet  

(100)

https://ec.europa.eu/growth/industry/policy/intellectual-property/enforcement/memorandum-of-understanding-online-advertising-ipr  

(101)

  Directive 2006/123/EC of the European Parliament and of the Council of 12 December 2006 on services in the internal market

(102)

For instance, UberPop was considered not to be an information society service (C-434/15), but Airbnb is (C-390/18).

(103)

As pointed out by AG Jääskinen in his opinion in the eBay case (p. 139 ff): “As I have explained, ‘neutrality’ does not appear to be quite the right test under the directive for this question. Indeed, I would find it surreal that if eBay intervenes and guides the contents of listings in its system with various technical means, it would by that fact be deprived of the protection of Article 14 regarding storage of information uploaded by the users”

(104)

In case C-682/18 YouTube.

(105)

The relevant provisions have recently been interpreted by the Court, which has confirmed that a Member State’s failure to fulfil its obligation to give notification of a measure restricting the freedom to provide an information society service provided by an operator established on the territory of another Member State, laid down in the second indent of Article 3(4)(b) of Directive 2000/31, renders the measure unenforceable against individuals, in the same way as a Member State’s failure to notify technical rules in accordance with Article 5(1) of Directive 2015/1535 (judgment of 19 December 2019, Airbnb Ireland, C‑390/18, EU:C:2019:1112, paragraph 96). This judgment clarifies the legal effect of the prior notification obligation: it constitutes not a simple requirement to provide information but an essential procedural requirement, which justifies the unenforceability of non-notified measures. The fact that a non-notified measure restricting the freedom to provide information society services is unenforceable may also be relied on in a dispute between individuals.

(106)

While Member States are broadly satisfied with the IMI tool, there is persistent confusion as to which cooperation mechanism should be used for which purpose (see Annex 8).

(107)

2019 Survey, 2020 targeted questionnaire, discussion in the e-Commerce Expert Group in October 2019.

(108)

See Annex 13

(109)

For example, the Content Incident Protocol developed by the companies involved in the Global Internet Forum to Counter Terrorism, https://www.gifct.org/joint-tech-innovation/  

(110)

This does not imply an obligation to actively seek facts or circumstances indicating illegal activity.

(111)

The ECHR has indicated that it is “in line with the standards on international law” that ISSPs should not be held responsible for content emanating from third parties unless they failed to act expeditiously in removing or disabling access to it once they became aware of its illegality (see Tamiz v. the United Kingdom (dec.), no. 3877/14, 84, and Magyar Jeti, 67)

(112)

For all intermediaries, costs are equivalent to those in Option 2, apart from very large online platforms.

(113)

Option 3 imposes further obligations, triggering higher costs than Option 2, for a narrow population of very large online platforms; these are proportionate to the financial capacity of the very large companies generally captured by the scope of the definition.

(114)

See Annex 4

(115)

Using as benchmark private sector estimates of online cross-border trade https://www.cbcommerce.eu/press-releases/press-release-cross-border-commerce-europe-publishes-the-second-edition-of-the-top-500-cross-border-retail-europe-an-annual-ranking-of-the-best-500-european-cross-border-online-shops/  

(116)

 Supra, §180

(117)

See Annex 4

(118)

 Ibidem

(119)

Niombo Lomba, Tatjana Evas, Digital Services Act. European added value assessment, European Parliamentary Research Service, October 2020

(120)

See Annex 4

(121)

Cost models and benchmarks presented in Annex 4

(122)

Such as requests for help from microenterprises in the context of the EU Internet Forum, where their services were targeted by terrorist organisations to pivot the dissemination of content by sharing hyperlinks on their services.

(123)

Benchmarked against resources currently reported by DPAs, and estimating 0.5 FTE for investigators per 15 million users reached by a digital service hosted in the Member State, with efficiencies of scale accounted for

(124)

(LNE, forthcoming)

(125)

  Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC

(126)

Out of 362 service providers

(127)

See Annex 9 for options on the liability of online intermediaries, and Annex 11 for the use of proactive detection measures supported by technical tools and automated decision systems.

(128)

Cases C-70/10 (SABAM v Scarlet) and C-360/10 (SABAM v Netlog NV)

(129)

See Annex 11

(130)

United Nations Special Rapporteur on violence against women, thematic report on violence against women in politics: http://www.un.org/en/ga/search/view_doc.asp?symbol=A/73/301 .

(131)

Directive (EU) 2015/849 of the European Parliament and of the Council of 20 May 2015 on the prevention of the use of the financial system for the purposes of money laundering or terrorist financing

(132)

(European Commission, 2020)

(133)

Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector

(134)

(European Commission, 2020)

(135)

Waste Framework Directive 2008/98/EC

(136)

Screening against Tool 27 in the Better Regulation toolbox, and the Commission’s policy on Internet Governance ( COM (2014) 072 final )

(137)

For all digital services, costs are equivalent to those in Option 2.

(138)

While Option 3 imposes further obligations triggering higher costs than Option 2, these are circumscribed to a narrow population of very large platforms; they are proportionate to the financial capacity of the very large companies generally captured by the scope of the definition.


Brussels, 15.12.2020

SWD(2020) 348 final

COMMISSION STAFF WORKING DOCUMENT

IMPACT ASSESSMENT REPORT

ANNEXES

Accompanying the document

PROPOSAL FOR A REGULATION OF THE EUROPEAN PARLIAMENT AND THE COUNCIL

on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC

{COM(2020) 825 final} - {SEC(2020) 432 final} - {SWD(2020) 349 final}


Annexes

   


Annex 1: Procedural information    

Annex 2: Stakeholder consultation    

Annex 3: Who is affected and how?    

Annex 4: Analytical methods    

Annex 5: Evaluation report for the E-Commerce Directive    

Annex 6: Supporting analysis for legal basis and drivers – legal fragmentation    

Annex 7: Regulatory coherence    

Annex 8: Cross-border cooperation    

Annex 9: Liability regime for online intermediaries    

Annex 10: Overview of voluntary measures and cooperation    

Annex 11: Content moderation tools    

Annex 12: Online advertising    

Annex 13: Overview of the European Parliament’s own initiative reports on the Digital Services Act    

Annex 1: Procedural information

1.Lead DG, Decide Planning/CWP references

This Staff Working Paper was prepared by the Directorate-General for Communications Networks, Content and Technology.

The Decide reference of this initiative is PLAN/2020/7444.

This includes the Impact Assessment report as well as, annexed to the report, the evaluation report for the E-Commerce Directive.

Organisation and timing

The Impact Assessment was prepared by DG CONNECT as the lead Directorate-General.

The Inter-Service Steering Group established for the work streams on online platforms was associated and consulted in the process, under the coordination of the Secretariat-General, including the following services: DG AGRI (DG for Agriculture and Rural Development), DG COMP (DG Competition), DG ECFIN (DG Economic and Financial Affairs), DG EMPL (DG Employment, Social Affairs and Inclusion), DG ENV (DG Environment), DG FISMA (DG for Financial Stability, Financial Services and Capital Markets Union), DG GROW (DG Internal Market, Industry, Entrepreneurship and SMEs), DG HOME (DG Migration and Home Affairs), DG JUST (DG Justice and Consumers), JRC (Joint Research Centre), DG MOVE (DG Mobility and Transport), DG RTD (DG Research and Innovation), DG REGIO (DG Regional and Urban Policy), SJ (Legal Service), DG SANTE (DG for Health and Food Safety), DG TRADE, and the EEAS (European External Action Service).

The last meeting of the ISSG, chaired by the Secretariat-General of the European Commission was held on 6 October 2020.

Consultation of the RSB

The Regulatory Scrutiny Board gave a positive opinion with reservation on the draft impact assessment report submitted on 8 October 2020 and discussed in the hearing that took place on 4 November 2020. To address the feedback given by the Regulatory Scrutiny Board, the following changes were made in the Impact Assessment report and its annexes:

Findings of the Board

Main modifications made in the report to address them

1.The report does not sufficiently explain the coherence between the Digital Services Act and the broader regulatory framework, in particular the relation to sectoral legislation and the role of self-regulation.

The report was amended to explain in more detail the coherence considerations, both in the problem statement section and in the coherence analysis for the options.

2.The policy options are not complete and not sufficiently developed. They lack detail and their content is not well explained.

The policy options were revised to give further details on each of them and their components. For option 3, governance sub-options were further explained. Further information was added on the threshold for the very large platforms, both in the main report and in Annex 4.

3.The report does not clearly present the evidence that leads to the choice of the preferred policy option. The assessment of compliance costs is insufficient.

Building on the additional specifications of the options, the analysis of impacts was further refined, including more granular presentation of costs. The presentation and analysis of the comparison of options was updated accordingly.

Stakeholder views

The main report and the annex present stakeholder views with more granularity

Evidence, sources and quality

Studies commissioned or supported by the European Commission

Dealroom. (2020). Global platforms and marketplaces. Report for the European Commission.

Eurobarometer - TNS. (2018, July). Flash Eurobarometer 469: Illegal content online. doi:10.2759/780040

Eurobarometer - TNS. (2016). Flash Eurobarometer 439: The Use of Online Marketplaces and Search Engines by SMEs. Retrieved from https://ec.europa.eu/information_society/newsroom/image/document/2016-24/fl_439_en_16137.pdf

ICF, Grimaldi, The Liability Regime and Notice-and-Action Procedures, SMART 2016/0039

LNE. (forthcoming). SMART 2018/37 Exploratory study on the governance and accountability of algorithmic systems

Optimity Advisors, SMART 2017/ 0055 Algorithmic Awareness building – State of the art report

Schwemer, S., Mahler, T. & Styri, H. (2020). Legal analysis of the intermediary service providers of non-hosting nature. Final report prepared for the European Commission

Van Hoboken J. et al., Hosting Intermediary Services and Illegal Content Online

Selective list of relevant case law

C‑18/18, Glawischnig. ECLI:EU:C:2019:821.

C-390/18 Airbnb Ireland

C-484/14, McFadden, ECLI:EU:C:2016:689.

C-149/15, Sabrina Wathelet v Garage Bietheres & Fils SPRL

C-434/15 Asociación Profesional Elite Taxi v Uber Systems Spain SL

C-314/12, UPC Telekabel Wien, EU:C:2014:192.

C-360/10, SABAM, ECLI:EU:C:2012:85;

C-70/10 (SABAM v Scarlet)

C 360/10 (SABAM v Netlog NV)

C-324/09, L’Oreal v eBay, ECLI:EU:C:2011:474.

C-236/08 to C-238/08, Google France and Google v. Vuitton, ECLI:EU:C:2010:159.

C-380/03 Germany v European Parliament and Council, judgment of 12 December 2006.

Joined Cases C-465/00, C-138/01 and C-139/01 Österreichischer Rundfunk and Others [2003] ECR I-4989,

C-101/01 Lindqvist [2003] ECR I-12971

ECHR, Application no. 24683/14 ROJ TV A/S against Denmark

French Supreme Court, 12 July 2012, no. 11-13.666, 11-15.165/11-15.188, 11-13.669

Hizb ut‑Tahrir and Others v. Germany, No. 31098/08

Kasymakhunov and Saybatalov v Russia, No. 26261/05 and 26377/06

ECHR, Application no. 56867/15 Buturugă against Romania, judgment of 11 February 2020

Gerechtshof Amsterdam, 24 June 2004, 1689/03 KG, Lycos gegen Pessers.

Zeran v AOL, 129 F.3d 327 (4th Cir. 1997).

Antwerp Civil Court, 3 December 2009, A&M, 2010, n.2010/5-6

President of the Brussels Court (NL), n 2011/6845/A, 2 April 2015

OLG Karlsruhe Urt. v. 14.12.2016 – 6 U 2/15

GRURRS 2016, 115437

Milan Court of Appeal, R.T.I. v. Yahoo! Italia, n. 29/2015;

Rome Court of Appeal, RTI v TMFT Enterprises LLC, judgment 8437/2016 of 27 April 2016

Turin Court of First instance, judgment 7 April 2017 No 1928, RG 38113/2013, Delta TV v Google and YouTube

Supreme Court of Hungary Pfv.20248/2015/9.

Supreme Court, OGH 6 Ob 178/04a.

Judgement of Appellate Court in Wroclaw of 15 January 2010, I Aca 1202/09.

Judgement of 15 April 2014, ECLI:NL:HR:2014:908 (interpretation Art. 54a Sr).

LG Leipzig, judgement of 19 May 2017 (05 O 661/15).

Selective bibliography

Alastair Reed, J. W. (2019). Radical Filter Bubbles: Social Media Personalisation Algorithms and Extremist Content. Global Research Network on Terrorism and Technology (Paper No. 8). Retrieved from https://www.rusi.org/sites/default/files/20190726_grntt_paper_08.pdf

Alrhmoun, A., Maher, S., & Winter, C. (2020). Decoding Hate: Using Experimental Text Analysis to Classify Terrorist Content. Global Network on Extremism and Technology. https://gnet-research.org/wp-content/uploads/2020/09/GNET-Report-Decoding-Hate-Using-Experimental-Text-Analysis-to-Classify-Terrorist-Content.pdf

Ali M., e. a. (2019). Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes. Proceedings of the ACM on Human-Computer Interaction. Retrieved from https://arxiv.org/pdf/1904.02095.pdf

Angelopoulos, C., Smet S. (2016) ‘Notice-and-Fair-Balance: How to Reach a Compromise between Fundamental Rights in European Intermediary Liability’ (October 21, 2016). In Journal of Media Law, Taylor & Francis. Retrieved from SSRN: https://ssrn.com/abstract=2944917

Artificial intelligence and the future of online content moderation. (2018, March 21). Freedom to Tinker — Research and expert commentary on digital technologies in public life. https://freedom-to-tinker.com/2018/03/21/artificial-intelligence-and-the-future-of-online-content-moderation/

Bridy, A. (2019). The Price of Closing the 'Value Gap': How the Music Industry Hacked EU Copyright Reform . Forthcoming in Vanderbilt Journal of Entertainment & Technology Law, p. 115. http://dx.doi.org/10.2139/ssrn.3412249  

Bridy, A. (2017). Notice and Takedown in the Domain Name System: ICANN’s Ambivalent Drift into Online Content Regulation. Washington and Lee Law Review, 74(3), 1345–1388. https://scholarlycommons.law.wlu.edu/wlulr/vol74/iss3/3/

Cobbe J., Singh J. (2019) ‘Regulating Recommending: Motivations, Considerations, and Principles’, European Journal of Law and Technology, 10 (3)

Coppock, A. (2020, September 2). The small effects of political advertising are small regardless of context, message, sender, or receiver: Evidence from 59 real-time randomized. Science Advances, 6(36). doi:https://advances.sciencemag.org/content/6/36/eabc4046

Crabit, E. (2000). La directive sur le commerce électronique: le projet ‘Méditerranée’. Revue du Droit de l’Union Européenne(4), 749-833.

Datta A., D. A. (2018). Discrimination in Online Advertising. A Multidisciplinary Inquiry. Proceedings of Machine Learning Research . 81, pp. 1-15. Conference on Fairness, Accountability, and Transparency. Retrieved from http://proceedings.mlr.press/v81/datta18a/datta18a.pdf  

De Streel A., Husovec M. (2020) The e-commerce Directive as the cornerstone of the Internal Market - Assessment and options for reform, Study for the IMCO committee PE 648.797, retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/648797/IPOL_STU(2020)648797_EN.pdf  

Duch-Brown N., Martens B. (2015). The European Digital Single Market. Its Role in Economic Activity in the EU. (I. f. Studies, Ed.) Digital Economy Working Paper(17). Retrieved from https://ec.europa.eu/jrc/sites/jrcsh/files/JRC98723.pdf

European Commission. (2018, September 12). SWD(2018) 408 final, Impact Assessment accompanying the document Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online. Retrieved from https://ec.europa.eu/commission/sites/beta-political/files/soteu2018-preventing-terrorist-content-online-swd-408_en.pdf  

European Commission. (2019). Report on the EU customs enforcement of intellectual property rights: results at the EU border in 2018. Luxembourg: Publications Office of the European Union. Retrieved from https://ec.europa.eu/taxation_customs/sites/taxation/files/2019-ipr-report.pdf

European Commission. (2020). SWD(2020) 116 final/2 Commission Staff Working Document: Report on the functioning of the Memorandum of Understanding on the sale of counterfeit goods on the internet. Retrieved from https://ec.europa.eu/docsroom/documents/42701  

Feci, N. (2018). Gamers watching gamers: the AVMSD soon the one calling the shots?.

Floridi, L., & Taddeo, M. (2017). The Responsibilities of Online Service Providers. Springer

Gorwa, R., Binns, R., & Katzenbach, C. (2020). Algorithmic content moderation: Technical and political challenges in the automation of platform governance. Big Data & Society, 7(1), 2053951719897945.

Grygiel, J. (2019, July 24). Facebook algorithm changes suppressed journalism and meddled with democracy. The Conversation. HYPERLINK https://theconversation.com/facebook-algorithm-changes-suppressed-journalism-and-meddled-with-democracy-119446  

Hawkinson, J., & Bates, T. (1996). Guidelines for creation, selection, and registration of an Autonomous System (AS).

Hoeren, T., & Völkel, J. (2018). Information Retrieval About Domain Owners According to the GDPR . Datenschutz und Datensicherheit.  https://doi.org/10.2139/ssrn.3135280

How Facebook can flatten the curve of the coronavirus Infodemic. (2020). Avaaz. https://secure.avaaz.org/campaign/en/facebook_coronavirus_misinformation  

INHOPE. (2019). Annual Report 2018. Amsterdam: INHOPE Association. Retrieved from https://www.inhope.org/media/pages/the-facts/download-our-whitepapers/3976156299-1591885517/2019.12.13_ih_annual_report_digital.pdf

Iusi Li, J. C. (2016). Advertising Role of Recommender Systems in Electronic Marketplaces: A Boon or a Bane for Competing Sellers? Retrieved from https://ssrn.com/abstract=2835349 or http://dx.doi.org/

Keller, D. The Right Tools: Europe's Intermediary Liability Laws and the 2016 General Data Protection Regulation, 2017 http://cyberlaw.stanford.edu/blog/2017/04/%E2%80%9Cright-be-forgotten%E2%80%9D-and-national-laws-under-gdpr  

Kojo, M., Griner, J., & Shelby, Z. (2001). Performance enhancing proxies intended to mitigate link-related degradations.

Kuerbis, B., Mehta, I., & Mueller, M. (2017). In Search of Amoral Registrars: Content Regulation and Domain Name Policy. Internet Governance Project, Georgia Institute of Technology. https://www.internetgovernance.org/wp-content/uploads/AmoralReg-PAPER-final.pdf  

Landmark data sharing agreement to help safeguard victims of sexual abuse imagery. (2019, December "https://www.iwf.org.uk/news/landmark-data-sharing-agreement-to-help-safeguard-victims-of-sexual-abuse-imagery"

Law, Borders and Speech Conference: Proceedings and Materials, https://cyberlaw.stanford.edu/page/law-borders-and-speech  

Madiega, T. (2020). Reform of the EU liability regime for online intermediaries. Background on the forthcoming Digital Services Act. European Parliamentary Research Service.

Mandola Project, Best practice Guide for responding to Online Hate Speech for internet industry, http://mandola-project.eu/m/filer_public/29/10/29107377-7a03-432e-ae77-e6cbfa9b6835/mandola-d42_bpg_online_hate_speech_final_v1.pdf

Matias, M. P. (2020). Do Automated Legal Threats Reduce Freedom of Expression Online? Preliminary Results from a Natural Experiment. Retrieved from https://osf.io/nc7e2/

Moura, G. C., Wabeke, T., Hesselman, C., Groeneweg, M., & van Spaandonk, C. (2020). Coronavirus and DNS: view from the. nl ccTLD.

Iacob N. et al.(2020). How to Fully Reap the Benefits of the Internal Market for E-Commerce?, Study for the committee on the Internal Market and Consumer Protection, Policy Department for Economic, Scientific and Quality of Life Policies. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/648801/IPOL_STU(2020)648801_EN.pdf

National Research Council. (2005). Signposts in cyberspace: the Domain Name System and internet navigation. National Academies Press

Niombo L., Evas T. Digital services act - European added value assessment, Study for the European Parliamentary Research Service PE 654.180. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/654180/EPRS_STU(2020)654180_EN.pdf.

Nordemann, J. B. (2018). Liability of Online Service Providers for Copyrighted Content–Regulatory Action Needed. Depth Analysis for the IMCO Committee.

Nordemann J. B. (2020), The functioning of the Internal Market for Digital Services: responsibilities and duties of care of providers of Digital Services, Study for the IMCO committee, PE 648.802. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/648802/IPOL_STU(2020)648802_EN.pdf.

OECD/EUIPO. (2019). Trends in Trade in Counterfeit and Pirated Goods, Illicit Trade. Paris: OECD Publishing/ European Union Intellectual Property Office. Retrieved from https://euipo.europa.eu/tunnel-web/secure/webdav/guest/document_library/observatory/documents/reports/trends_in_trade_in_counterfeit_and_pirated_goods/trends_in_trade_in_counterfeit_and_pirated_goods_en.pdf

OECD. (2020). Connecting Businesses and Consumers During COVID-19 Through Cross-Border Trade In Parcels. https://read.oecd-ilibrary.org/view/?ref=135_135520-5u04ajecfy&title=Connecting-Businesses-and-Consumers-During-COVID-19-Trade-in-Parcels

Penney, J. (2019, September 1). Privacy and Legal Automation: the DMCA as a Case Study. Stanford Technology Law Review, 22(1), 412-486. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3504247

Piper, D. L. A. (2009). EU study on the legal analysis of a single market for the information society–new rules for a new age?. DLA Piper, November.

Reale, M., ‘Digital Markets, Bloggers, and Trendsetters: The New World of Advertising Law’ in MDPI, 3 September 2019, P. 9.

Rosenzweig, P. (2020). The Law and Policy of Client-Side Scanning (Originally published by Lawfare).

Schulte-Nölke H. et al, (2020), The legal framework for e-commerce in the Internal Market - State of play, remaining obstacles to the free movement of digital services and ways to improve the current situation, Study for IMCO committee PE 652.707. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/652707/IPOL_STU(2020)652707_EN.pdf.

Schwemer, S.F. (2018). On domain registries and unlawful website content. Computer Law & Security Review

Schwemer, S.F. (2020). Report on the workshop on the liability of DNS service providers under the ECD, Prepared for Directorate-General for Communications Networks, Content and Technology (Unit Next-Generation Internet, E.3Schwemer, S., Mahler, T. & Styri, H. (2020). Legal analysis of the intermediary service providers of non-hosting nature. Final report prepared for the European Commission

Sluijs, J. et al. (2012). Cloud Computing in the EU Policy Sphere, JIPITEC, 12, N 80.Smith M. (2020), Enforcement and cooperation between Member States - E-Commerce and the future Digital Services Act, Study fot IMCO committee, PE 648.780. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/648780/IPOL_STU(2020)648780_EN.pdf?utm_source=EURACTIV&utm_campaign=247d4049f5-digital_brief_COPY_01&utm_medium=email&utm_term=0_c59e2fd7a9-247d4049f5-116254339  

Sohnemann N., Uffrecht L.M, Constanzehartkopf M, Kruse J. P., De Noellen L. M., New Developments in Digital Services Short-(2021), medium-(2025) and long-term (2030) perspectives and the implications for the Digital Services Act, Study for IMCO committee, PE 648.784. Retrieved from https://www.europarl.europa.eu/RegData/etudes/STUD/2020/648784/IPOL_STU(2020)648784_EN.pdf

Stalla-Bourdillon, S. (2017). Internet Intermediaries as Responsible Actors? Why It Is Time to Rethink the E-Commerce Directive as Well. In The Responsibilities of Online Service Providers (pp. 275-293). Springer, Cham.

Stephan Lewandowsky, L. S. (2020 (forthcoming)). Technology and democracy: Understanding the influence of online technologies on political behaviour and decision-making.

Tech Against Terrorism (2019), Case study: Using the GIFCT hash-sharing database on small tech platforms https://www.counterextremism.com/sites/default/files/TAT%20--%20JustPaste.it%20GIFCT%20hash-sharing%20Case%20study.pdf  

Truyens, M., & van Eecke, P. (2016). Liability of Domain Name Registries: Don’t Shoot the Messenger. Computer Law & Security Review, 32(2), 327–344.

Urban, J. e. (2017, March). Notice and takedown in everyday practice. UC Berkeley Public Law Research Paper No. 2755628. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2755628

Van Hoboken, J., Appelman, N., Ó Fathaigh, R., Leerssen, P., McGonagle, T., van Eijk, N., & Helberger, N. (2019). De verspreiding van desinformatie via internetdiensten en de regulering van politieke advertenties: Tussenrapportage (Oktober 2019).

Van Hoboken, J. and coll. (2018). Hosting intermediary services and illegal content online: An analysis of the scope of Article 14 ECD in light of developments in the online service landscape. Final report prepared for the European Commission.

Wagner B., Rozgonyi K. et al. (2020). Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act 

Weber, R.H. & Staiger, D.N. (2014). Cloud Computing: A cluster of complex liability issues, 20(1) Web JCL., http://webjcli.org/article/view/303/418  

Wilman, F., The responsibility of online intermediaries for illegal user content in the EU and in the US, 2020 (forthcoming)

Annex 2: Stakeholder consultation

1.THE STAKEHOLDER ENGAGEMENT STRATEGY

The Commission has consulted broadly on issues related to digital services and platforms in recent years. The consultation process builds on recent and past consultation steps, which have already narrowed down the spectrum of pertinent options and singled out specific issues.

A series of meetings and a consultation on the inception impact assessment, published on 2 June 2020, informed the problem definition and led to preliminary policy options. An open public consultation, which ran from June 2020 to September 2020, also contributed to the design and testing of policy options. To gather the views of the general public, the Commission also ran a Eurobarometer survey in 2018 with a representative sample of over 33,000 respondents from all EU Member States.

Targeted consultations have also been conducted over the past years, including a series of workshops, conferences, interviews with experts and judges, expert groups, as well as numerous bilateral meetings and the receipt of position papers and analytical papers from organisations, industry representatives, civil society and academia.

In developing the stakeholder engagement strategy, the stakeholder mapping included:

1.Private sector: capturing views of businesses of different sizes and reach within the European market. The private sector includes but is not limited to information society services. Businesses and associations representing their interests primarily pertain to the following categories:

a)Online intermediaries including, but not limited to, internet service providers, caching services, storage and distribution services (e.g. web hosting, online media sharing platforms, file storage and sharing, IaaS/PaaS), networking, collaborative production and matchmaking services (e.g. social networking and discussion forums, collaborative production, online marketplaces, collaborative economy, online games), and selection, search and referencing services (e.g. search tools, ratings and reviews services).

b)Other digital services which are not online intermediaries: e.g. website owners, private bloggers, private e-tailers, etc.

c)Third parties involved in the ecosystem around digital services including, but not limited to, advertising providers, providers of content moderation tools, providers of payment services, data brokers, other services built as ancillary to online platforms, or primarily based on data accessed from the online platforms, other interested parties such as content creators, rights holders, etc.

d)Offline and online services that provide their services through online intermediaries, such as retailers on marketplaces, app developers, publishers, hotel owners, etc.

e)Innovative start-ups and associations representing start-ups pertaining to the categories above.

f)Trade and business associations representing the different interests of the businesses in the above categories.

2.Users of digital services, as well as civil society organisations representing their interests in terms of e.g. digital rights, interests of vulnerable groups and victims of online crimes.

3.National authorities including law enforcement, data protection and consumer protection authorities, and other relevant regulatory bodies and government departments in Member States and, to the extent possible, in regions and municipalities.

4.Academia from the technical, legal and social science communities.

5.Technical community such as the Internet Architecture Board, ICANN, Internet Engineering Task Force, etc.

6.International organisations dealing with the issues at stake at different governance levels e.g. the UN, the Council of Europe, OSCE.

7.General public, in particular through a dedicated section in the open public consultation. Representative statistics on certain aspects have been computed on the basis of the Eurobarometer survey of 2018.

The different consultation tools as well as brief summaries of their results are described below.

2.OPEN PUBLIC CONSULTATIONS

The Commission has conducted several open public consultations on the related issues: (i) in 2010, in the context of the evaluation of the e-Commerce Directive (ECD) 1 ; (ii) in 2012, with a focus on notice-and-action procedures for all types of illegal content 2 ; (iii) in 2016, as part of the broader open public consultation on online platforms 3 ; (iv) in 2018, on measures to further improve the effectiveness of the fight against illegal content online 4 ; and finally (v) in 2020, in the context of the Digital Services Act package.



2.1.Open Public Consultation on the Digital Services Act (2 June 2020 – 8 September 2020) 5

In total, 2,863 responses were submitted by a diverse group of stakeholders. Most feedback came from citizens (66% from EU citizens, 8% from non-EU citizens), companies/business organisations (7.4%), business associations (6%), and NGOs (5.6%). These were followed by public authorities (2.2%), others (1.9%), academic/research institutions (1.2%), trade unions (0.9%), consumer and environmental organisations (0.4%), and several international organisations. In addition, around 300 position papers were received in the context of the open public consultation.

The organisation SumOfUs ran a campaign with a parallel, more general questionnaire on citizens’ concerns related to online platforms, gathering around 738 replies, mostly from the UK (56%), France (10%) and Germany (8%). Most contributions centred on the growing problems of fake news and hate speech online. Respondents jointly called for action, but also voiced concerns regarding free speech.

Figure 1: Type of Respondent

In terms of geographical distribution, most respondents are located in the EU, with the majority of contributions coming from Germany (27.8%), France (14.3%) and Belgium (9.3%). Outside the EU, the highest shares of respondents were from the UK (20.6%) and the US (2.8%) 6 .

Figure 2: Country of Origin of Respondents

Zoom-in: the three largest respondent groups

Companies/Businesses organizations and business associations

Of the 211 participating companies/business organisations, 80.1% specified that they were established in the EU and 11.4% indicated that they were established outside of the EU.

26.5% described themselves as a conglomerate offering a wide range of services online, 21.3% identified as a scale-up and 6.6% as a start-up. In terms of annual turnover, more than half of the participating companies/business organisations indicated a turnover of over EUR 50 million per year; 13.3% reported an annual turnover of up to EUR 2 million, 3.8% of up to EUR 10 million, and 6.2% of up to EUR 50 million. 28.4% of the responding companies/business organisations were online intermediaries and 24.6% were other types of digital services, while 12.3% indicated that they were an association representing the interests of the types of businesses listed above. Of the 180 participating business associations, 15% indicated that they represented online intermediaries, 19.4% that they worked on behalf of digital service providers other than online intermediaries, and 40% that they represented the interests of other businesses.



Figure 3: Type of Services provided by responding companies/business organisations

Figure 4: Intermediated Services by responding platforms

NGOs

Of the 159 participating NGOs, almost half (49.7%) stated that they represented fundamental rights in the digital environment. 22.6% dealt with flagging illegal activities or information to online intermediaries for removal, and 22% represented consumer rights in the digital environment. Furthermore, 18.9% specified that they carried out fact-checking and/or cooperated with online platforms in tackling harmful (but not illegal) behaviours, and 13.2% represented the rights of victims of illegal activities online. 10.7% represented the interests of providers of services intermediated by online platforms, including trade unions, and 10.7% gave no answer. 30.8% of the responding NGOs indicated “other”.

Public authorities

59 public authorities participated in the open public consultation: 43 represented authorities at national level (72.9%), 8 at regional level (13.6%), 6 at international level (10.2%), and 2 at local level (3.4%). Among EU Member States, authorities replied from Austria, Belgium, the Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Ireland, Italy, Latvia, Luxembourg, and Poland. About half of the responding public authorities were governments, administrative or other public authorities other than law enforcement in an EU Member State (49.2%). 15.3% indicated that they were a law enforcement authority in an EU Member State and 15.3% that they were another independent authority in an EU Member State. These replies are complemented by a targeted consultation run by the Commission with Member States.

Figure 6: Type of responding public authority

Summary of Results

Exposure to illegal content, goods or services online and related issues

A majority of respondents, across all categories, indicated that they had encountered both harmful and illegal content, goods or services online, and specifically noted a spike during the COVID-19 pandemic. More specifically, 47% of respondents who replied to the relevant question indicated that they had come across illegal goods on online platforms at least once, as shown in Figure 1 7 . 67% stated that they had encountered illegal content online. The main issues reported by respondents in relation to goods are deceptive advertising, especially in relation to food, food supplements, drugs and COVID-19; advertising of pet and wildlife trafficking; and counterfeit, defective or stolen goods, electronics and clothes. Regarding services, the main issues raised are fake event tickets or cases in which platforms illegally re-sell tickets and inflate their prices, cryptocurrencies and online trading, as well as general cases of phishing. Finally, in relation to content, respondents report significant issues related to hate speech (racism, anti-Semitism, white supremacy, calls for violence against migrants and refugees, extremism, far-right propaganda, homophobia, sexism, defamation); general incitement to violence; unwanted pornography and prostitution ads; child sexual abuse material; IP infringement for movies and copyrighted content; and political disinformation and fake news.

A large share of respondents who said they had notified illegal content, goods or services to platforms expressed dissatisfaction with the platforms’ response and with the ineffectiveness of reporting mechanisms after the exposure took place. Furthermore, the majority of users who replied to the relevant questions were not satisfied with the actions that platforms take to minimise risks, e.g. when consumers are exposed to scams and other unfair practices online. They mostly consider that platforms are not doing enough to prevent these issues from happening. In general, these users perceive a gap between platforms’ official positions on what they do and what they actually do.

In addition, several concerns arose in relation to the reporting of illegal goods/content/services. For the majority of users, reporting is not simple, both in terms of how easy it is to find the reporting procedure and how easy it is to use. Moreover, 54% of respondents are not satisfied with the procedure following the reporting, are not aware of any action taken by the platform as a follow-up to their report, and consider that there is a lack of transparency following a notification. 8 In addition, users point out that notice and action procedures differ greatly from one platform to another, making the reporting of illegal content/goods/services even more difficult and uncertain. In this regard, consumer protection authorities have highlighted their struggle to enforce the rules effectively when sellers are not established in the EU.

Respondents from all categories consider that during the COVID-19 crisis they witnessed the dissemination of misleading information on the causes of the outbreak, treatments and vaccines. They also point to a general increase in hate speech, price gouging, fake news and political misinformation, in addition to a significant number of illegal goods available online and scams connected to the emergency, including phishing, business email compromise, malware distribution and many other types of attacks, ranging from fake web shops, credit card skimming and illicit pharmacies to ransomware. The general public particularly praised the fact that the WHO and the scientific community more generally had partnered with tech companies to promote accurate information about COVID-19.

Respondents stated that they use an array of different systems 9 for detecting and removing illegal content, which include, in addition to notice-and-action systems (18% of respondents, i.e. 65 out of 362), automated systems (12% of respondents, i.e. 45 out of 362), systems for penalising repeat offenders (12% of respondents, i.e. 45 out of 362), and collaborations with authorities and trusted organisations (11% of respondents, i.e. 40 out of 362). Only 9 of the 362 respondents (i.e. 2.5%) have no system in place for addressing illegal activities conducted by the users of their service, such as the sale of illegal goods (e.g. a counterfeit product, an unsafe product, prohibited and restricted goods, wildlife and pet trafficking), the dissemination of illegal content or the illegal provision of services.

Exposure to harmful behaviours and related issues

Respondents from all categories have pointed out several issues in relation to harmful behaviours online, a term they consider opaque and a source of legal uncertainty. With regard to exposure to harmful content, the general public mentioned the issue of deceptive and misleading ads, including in relation to minors and political advertising.

Publishers, companies that sell products or services online, the general public, as well as digital users’ and consumers’ associations expressed concerns about the lack of transparency and accountability regarding how targeted advertising and algorithmic systems shape online content. Furthermore, the limited disclosure of ad content and the lack of enforcement of ad targeting policies were flagged.

Political disinformation is seen as a widespread issue across all categories of stakeholders. Among the measures proposed to tackle it, respondents mention more transparency with regard to political advertising, the flagging of disinformation, and enhanced data sharing with researchers and digital rights associations. While respondents are worried about the negative impact of political disinformation and fake news, several digital users’ associations as well as news publishers take the view that restrictions on free speech must be strictly limited to what is necessary and proportionate.

Among respondents, there is a general consensus that children are not adequately protected online. Respondents refer to online grooming and bullying, disinformation, possible manipulation through deceptive ads targeting minors, violent content, and deceptive paid add-ons in video games, among other issues.

With regard to measures against activities that might be harmful but are not in themselves illegal, the highest share of respondents (17%, i.e. 53 of 314) replied that their terms and conditions and/or terms of service ban harmful activities, 16% (i.e. 51 of 314) ban hatred, violence and insults other than illegal hate speech, and 14% (i.e. 43 of 314) ban content harmful to children, etc.

Opportunities and risks of automated tools for tackling illegal or harmful content/goods/services

The use of automated tools to detect illegal content, goods and services is considered very controversial among respondents. On the one hand, content creators and brand owners generally support the use of automated tools, but they also state that hosting services should be subject to transparency requirements and that automated detection should largely be backed by manual/human review. Several respondents pointed to the usefulness of such tools for addressing illegal content at scale, but there is also a strong call for caution in their use, given the risk of over-removal of legal content.

Research institutes, associations representing companies selling through platforms and digital rights associations discourage the use of automated detection and restrictions, pointing to the risk of taking down legal content and to the fact that the technology remains imperfect. In particular, digital rights associations consider that the risks of infringing fundamental rights, such as freedom of expression, privacy and the freedom to conduct a business, and of producing discriminatory outcomes still outweigh the possible advantages for countering illegal content or activity online.

Overall, online intermediaries consider that they should not generally be asked to police and remove content unless a specific report about an individual piece of content is received. Otherwise, online intermediaries will, where available, need to rely on automated tools and technologies that may not be fit for purpose or fully developed, resulting in a vast number of false positives and over-blocking. Smaller platforms also point out that automated tools are very costly to develop and maintain. They state that, while automated tools hold promise for content moderation at scale in the future, developing, implementing and iterating effective tools requires significant resources and machine-learning capabilities, which may be out of reach for start-ups and scale-ups with ambitions to compete with larger players on the market.

Content amplification and information flows on online platforms

Respondents, in particular academics and civil society, nonetheless point to the particular role of algorithmic systems on online platforms in shaping access to online content and in enabling content that might incite violence, hate speech or disinformation to reach very wide audiences. Several stakeholders, among them citizens, civil rights associations, NGOs, academic institutions as well as media companies and telecommunications operators, pointed out the need for algorithmic accountability and transparency audits, especially with regard to how content is prioritised and targeted. In addition, especially in the context of addressing the spread of disinformation online, regulatory oversight and auditing competence over platforms’ actions and risk assessments was considered crucial (76% of all stakeholders responding to the relevant question).

Risks for freedom of expression

Only 3% of the 1,208 respondents to the relevant question stated that they were informed by the platform before their content/goods/services were removed or access to them disabled. Most were not able to follow up on the information.

In addition, the vast majority of users were not informed after they notified a digital service asking for the removal or disabling of access to content/goods/services (only 9% were informed, 23% were informed on some occasions and 37% were not informed at all).

Digital users perceive a lack of transparency with regard to what violates a platform’s rules, and in particular its “Community Guidelines”, with potential risks for freedom of expression.

In order to protect the freedom of expression of their users, several measures that service providers should take were rated as essential by the majority of respondents: high standards of transparency on their terms of service and removal decisions (84%, i.e. 1,700 of 2,035 replies); maintaining an effective complaint and redress mechanism (83%, i.e. 1,681 of 2,030 replies); diligence in assessing the content notified to them for removal or blocking (82%, i.e. 1,653 of 2,012 replies); high accuracy and diligent control mechanisms, including human oversight, when automated tools are deployed for detecting, removing or demoting content or suspending users’ accounts (82%, i.e. 1,649 of 2,011 replies); diligence in informing users whose content/goods/services have been removed or blocked or whose accounts are threatened with suspension (76%, i.e. 1,518 of 2,007 replies); and enabling third-party insight, e.g. by academics, into main content moderation systems (56%, i.e. 1,121 of 1,993 replies).

Differentiating across categories of respondents, the only stakeholders who do not appear concerned about possible over-blocking of content are the creative industries and brand owners, which consider that the percentage of false positives (that is, content wrongly identified as illegal) is very low.

Public service media refer to the need to establish safeguards preventing platforms from applying additional or secondary control over content published by these independent providers. Similarly, news publishers consider that they should not be subject to any editorial moderation by platforms, so as to preserve media pluralism and the freedom of the press. This argument is also supported by trade associations, which argue that content published under editorial responsibility should not be removed without a court order.

Respondents from several stakeholder categories point out the need for platforms to have a clear and transparent redress mechanism. Digital users’ associations point out that users have no way to appeal to anyone independent or neutral, and that “allowing the platforms to police their own decisions does not seem to work in these situations as there is overwhelming evidence of their bias.”

The E-Commerce Directive

There is very broad convergence on the continued relevance of the E-Commerce Directive. Respondents point to several issues that could be included in a possible revision of the e-Commerce Directive, with platforms and trade associations emphasising the need to focus regulatory attention on specific actions and perceived market failures. The three main general issues raised by stakeholders relate to the harmonisation of the notice and action procedures, the clarification of several terms in the Directive, and the clarification of the scope of the liability safe harbour:

1.A stronger harmonisation at EU level of the notice and action process and timeframes would contribute to a more rapid response to illegal content online and enhance legal certainty for all stakeholder categories. Such a procedure should be easy to access and use, and it should be defined in EU law in order to overcome the existing divergence among Member States, which makes it difficult, especially for small and medium-sized companies, to offer their services in the whole single market.

2.Terms considered to require clarification are: taking action ‘expeditiously’, ‘actual knowledge’ and ‘harmful’ content. In particular, platforms stress the need to clarify and harmonise the definitions of illegal and harmful content among different Member States.

3.Revised definition of passive and active hosts: it is argued that there is a lack of clarity as to which types of providers qualify for the liability safe harbour and what degree of ‘active’ involvement with third-party content causes the safe harbour to no longer be available.

Regarding the territorial scope, respondents from all stakeholder categories generally agree that the obligations should be extended to any good/content/service offered to EU consumers, regardless of the place of establishment of the seller or the platform. Yet, some national authorities clarify that this should be done in accordance with international law, and especially the commitments undertaken at WTO level. Several stakeholders among telecom operators, digital rights’ and consumers’ associations, as well as representatives of the creative industry and national authorities, refer to a potential requirement for undertakings in third countries to have a legal representative in the EU.

The country of origin principle is often cited by platforms and trade associations as an important principle to ensure legal certainty and provide clarity on the rules that govern the companies. They also consider that full harmonization of rules among Member States would be the best scenario. At the same time, some national authorities have pointed out that the effectiveness of this principle has been limited in practice and it is complex for the authority of the country of destination to intervene if the authority of the country of establishment fails to comply with its obligations. This is considered an important issue to be addressed to avoid forum shopping.

Among the 988 respondents who replied to the relevant question, 83% consider that there is a need to ensure similar supervision of digital services established outside the EU when they provide services to EU users 10 .

There is general agreement among respondents that ‘harmful content’ should not be defined and addressed in the Digital Services Act, as this is a delicate area with severe implications for the protection of free speech. Although respondents are divided on how such content should be addressed, most of them consider that platforms cannot be trusted to strike, through their internal practices alone, the balance between democratic integrity, pluralism, non-discrimination, freedom of speech and the other relevant interests at stake in relation to potentially harmful content.

New obligations for online platforms

There is broad consensus among respondents on the need to harmonise at EU level obligations for online platforms to address illegal content they host. Among the most widely supported measures, including by online platforms, are simple, standardised, and transparent notice and action obligations harmonised across the single market. A large majority of stakeholders want all platforms to be transparent about their content policies, support notice and action mechanisms for reporting illegal activities, and request professional users to identify themselves clearly (90%, 85% and 86% respectively). There is a general call, especially among citizens, for establishing more transparency in the content moderation processes and outcomes.

Stakeholders disagree on the information that should be required for reporting illegal activities and information. On the one hand, the creative industry considers that it would be extremely time-consuming, for example, to provide all URLs, while large platforms consider that they should be provided with precise information enabling them to retrieve the content concerned, including URLs, an explanation of why the content is considered unlawful, the laws that the content violates, and other supporting evidence.

Regarding the removal of illegal content, some research institutions, representatives of the internet tech community, telecom operators and digital rights’ associations point out the need to remove allegedly illegal content as close to its source as possible, and to require the intervention of internet infrastructure services only as a last resort. Some digital rights’ associations consider that unduly short timeframes for removing content and compliance targets pose threats to freedom of expression.

Representatives of the creative industry have expressed the need to introduce ‘stay down’ obligations. In addition, they consider it necessary to introduce ‘repeat infringer’ policies and clear, fast-track procedures for ‘trusted flaggers’.

Yet, such an obligation creates significant concerns among digital rights’ associations and platforms, both in terms of legal uncertainty and of risks to privacy and freedom of expression. Such monitoring is also considered a significant barrier for small platforms, which would not be able to develop the necessary capabilities. National authorities are generally against a monitoring requirement, but several of them see a need to reassess the balance between rights, obligations and responsibilities of online intermediaries, given the exceptional influence of certain platforms in today’s economy. Most digital rights’ associations are strongly opposed to such measures. Some respondents consider that a ‘stay down’ obligation should be an exceptional measure that can only be granted by a court in an injunction that is specific in terms of the content and duration of the measure, that abides by the principle of proportionality and that applies only to identical content. Platforms, research institutions and digital rights’ associations consider that the issue of reappearing illegal content is delicate and can pose serious threats to freedom of expression as, for example, an ‘identical’ post may be unlawful if repeated in the same context and lawful in a different context.

Some large platforms also suggest that independent third-party experts can play a key role in addressing the challenge of tackling illegal online content and suggest that, when it comes to the concept of ‘trusted flaggers’, it is essential to have a clear definition of their roles, obligations and responsibilities. Telecom operators and national authorities also consider the need to strengthen and harmonise the use of reliable notifiers, such as trusted flaggers. Several digital rights’ associations appear highly concerned with delegating to platforms the development of human rights and due diligence safeguards, and some of them have put forward detailed governance proposals to tackle this issue.

In response to specific questions concerning online marketplaces, some respondents, including consumers’ associations, trade associations, national authorities and online sellers, flagged the need for further accountability, irrespective of the size of the marketplace. Among the specific requirements suggested are the need to verify sellers (‘Know-Your-Business-Customer’); to inform consumers who purchased a fake product; to enforce efficient measures to tackle repeat infringers (blacklisting sellers); to implement a proactive risk management system; to offer a clear set of transparency and reporting obligations; and to consult information on recalled and dangerous products on RAPEX. Some also pointed to the implementation of proactive measures to prevent illegal products from reaching the platform’s website.

Transparency obligations have been widely suggested by all categories of stakeholders. It is argued that platforms should be clearer about how they address illegal content/goods/services. In particular, it is argued that more transparency is needed regarding violations of platforms’ terms of service, given that the vast majority of content (96% according to one association) is deleted as a result of violations of these terms. More procedural accountability would enable interventions on content to be more precise and effective. Transparency reports are suggested by all stakeholder groups as a means to respond to the perceived lack of transparency in content moderation and to create more accountability for platforms.

More transparency is also considered necessary with regard to how content is prioritised and targeted, and several digital rights’ associations, research institutions, national authorities, representatives of the creative industry and other companies have pointed out the need for algorithmic accountability and transparency audits. According to several digital rights’ associations, these audits would also require the sharing of data for public-interest research by civil society and academia. Several digital rights’ associations also argued that users should have more control over the content they interact with and over platforms’ use of their personal data, and should be able to decide not to receive any algorithmically curated content at all. One digital rights’ association suggested that personalised content recommendation systems should work on an ‘opt-in’ basis rather than the current default ‘opt-out’. Digital rights’ associations also consider that extremely detailed profiling leads to strong personalisation of content, which affects users’ right to freedom of expression and information as well as media pluralism.

On the other hand, while platforms acknowledge the possibility for more transparency, they also warn against possible implications in terms of compromising commercially-sensitive information (including their trade secrets), violations of privacy or data disclosure laws, and abuse from actors that could game their systems.

Regarding online advertising, more transparency is considered necessary on the identity of the advertiser, on how content is targeted and personalised, and on the actions taken to minimise the diffusion of illegal content/goods/services. Efforts to implement features that explain why certain ads are shown to users, and the creation of ad libraries, are considered good practices to build on. Political advertising and micro-targeting are considered to raise specific and urgent challenges, including in relation to individuals’ autonomy and deliberation. Some respondents flagged that wide transparency requirements are necessary for these evolving issues, as well as a capability for detecting risks and harms.

A large share of the general public responding to the consultation pointed to deceptive and misleading advertisements as being a major concern in their online experience. Users, academic institutions and civil society organisations are particularly concerned about targeted advertisements to minors and political advertising.

Academic institutions pointed to persistent difficulties when conducting research, and explained the difficulty of observing emerging issues and phenomena online, citing inconsistent access to relevant data. Several pointed to the need for a publicly disclosed ad archive, as well as independent auditing of ad systems.

Digital rights’ associations consider that users should have the right to opt out of micro-targeting and that advertisers could be prohibited from targeting users with content based on very sensitive personal data such as psychological profiles, political opinions, sexual orientation or health status.

Regarding minors, digital rights’ associations and international organisations suggest conducting child impact assessments, mitigating risks for minors ‘by design’, implementing age verification systems, and focusing on educational programmes.

Moreover, whilst there is a strong call for action, many categories of stakeholders, including citizens, online intermediaries, civil society organisations, academic institutions, NGOs and national authorities, emphasised that any new measure to tackle illegal content, goods or services online should not lead to unintentional, unjustified limitations on citizens’ freedom of expression or fundamental rights to the protection of personal data and privacy.

At the same time, most stakeholder groups acknowledged that not all types of legal obligations should be put on all types of platforms. According to various stakeholder groups, especially business organisations and start-ups, enhanced obligations are especially needed for larger platforms, but these obligations might be disproportionate for smaller ones. Start-ups especially stressed the point that a “one-size-fits-all” approach would be most beneficial for very large platforms, but could have detrimental effects on medium-sized or smaller platforms and businesses at the core of the European digital ecosystem. They stress that their growth and evolution should not be hindered by disproportionate rules.

Respondents also generally agree that the territorial scope for these obligations should include all players offering goods, content or services, regardless of their place of establishment.

Cooperation with trusted flaggers and authorities

Cooperation with civil society and other third parties such as trusted flaggers is considered an important means of improving the oversight of platforms. Some digital users’ associations caution that there should be clear roles and obligations, also to avoid shifting responsibility from platforms to third parties. Some research institutes also caution against voluntary agreements centred around trusted flaggers, as this concept is still not clear and these entities might lack high standards of due process.

Regarding national authorities, several stakeholders acknowledge the need to share data with these authorities for oversight. However, some digital rights’ associations, platforms and news publishers caution that law enforcement authorities should not send requests outside the appropriate legal framework involving judicial authorities. The general public is also concerned about mandated sharing of data with public authorities and asks for platforms to be mandated to share data only on the basis of specific law enforcement requests in accordance with national law. In general, it is argued that there is a need for transparency on the supervisory and enforcement activity of authorities.

There was also a broad convergence among all stakeholder categories around the need to preserve the prohibition of general monitoring obligations for online intermediaries in order to preserve a fair balance and protect fundamental rights, including the right to privacy and freedom of expression.

Proposed changes to the current liability regime 

On the topic of the liability of intermediaries, a large majority of stakeholder groups broadly considered the principle of the conditional exemption from liability a precondition for a fair balance between protecting fundamental rights online and preserving the ability of newcomers to innovate and scale. With regard to consumer protection, some organisations defending consumer rights supported changes to the liability regime in support of a faster resolution of damages for consumers.

Some intermediaries, national authorities, research institutes and civil society organisations consider that the current regime creates disincentives to act and call for the removal of disincentives for voluntary measures, in order to limit the risks of liability for intermediaries that voluntarily implement preventative measures to detect illegal content. Yet, some digital users’ associations, trade associations and representatives of the creative industry warn against such a clause that is expected to weaken the responsibilities of intermediaries without additional positive obligations.

In particular, representatives of smaller service providers, but also some civil society organisations, pointed to legal uncertainty and disincentives for service providers to act against illegal goods, services or content disseminated through their service. Start-ups strongly called for a legislative framework that reaffirms the principles of the e-Commerce Directive, while supporting the introduction of a clarification of the liability regime with regard to voluntary measures they might take.

The distinction between passive and active players is considered still relevant and valid by some respondents (mainly telecom operators), whereas other respondents from different stakeholder categories see a need to move towards other concepts, such as the degree of control over the content and a well-defined concept of actual knowledge. Telecom operators nevertheless also consider that the definition and responsibilities of active and passive hosts should be clarified, following the recent jurisprudence of the CJEU. In addition, they consider that the regulatory focus should be directed at those hosting services that play an ‘active’ role. Some large platforms argue that a strict interpretation of what ‘passive’ hosts are would discourage online intermediaries from exploring innovative and personalised user experiences, in addition to deterring them from taking voluntary proactive steps to identify and remove unlawful content.

Cloud services call for a new category of ‘cloud infrastructure services’ to be established, with proper safe harbour protections. Search engines argue that they clearly fall under the category of caching services. Some intermediaries and representatives of the creative industry also raise the possibility of creating a fourth category of ‘online platforms’, which would make it possible to distinguish between providers that have no editorial control over the content and those that use algorithms to display content to their users. Other information society services argue that DNS services should be explicitly covered as intermediaries.

Except for the creative industry, all categories of stakeholders consider it important to limit the responsibility of hosting providers, content distribution services, cloud infrastructure, DNS services and other intermediaries for preventing IP infringement, including piracy and counterfeiting.

Governance in the single market and supervision of digital services

There is a broad alignment from all categories of stakeholders that the internal market principle enshrined in the E-Commerce Directive is crucial for the development of innovative services in the Union and should be preserved.

With regard to the burdens for companies in the single market, business associations and medium-sized companies in particular pointed out that legal fragmentation around rules for tackling illegal content, goods and services is limiting most businesses, but especially SMEs and start-ups, from scaling up. More specifically, business associations pointed out that SMEs and start-ups face a competitive disadvantage, since they are affected in a disproportionate manner compared with larger companies. Start-ups and SMEs confirmed this observation, pointing to the business risks of having to adapt their services to potentially 27 different sets of rules, which inhibits their growth not just across the Union, but also globally.

At the same time, besides the need to address the fragmentation of rules, there is also a general understanding among stakeholders that cooperation between authorities should be improved in the cross-border supervision of digital services, and in particular online platforms. 66% of the respondents to the relevant question in the open public consultation noted that a unified oversight entity for EU oversight is very important. Many stakeholder groups, but especially business associations and companies, considered that the degree of oversight should vary depending on the services’ obligations and related risks.

Authorities and other respondents, in particular academic institutions as well as civil society organisations, point out that the supervision of such cross-border services comes with specific challenges in terms of accessing appropriate data, as well as of ensuring adequate financial and human resources in the competent authorities tasked with supervising online platforms. Many groups of stakeholders, especially digital rights’ associations, identified the need for interdisciplinary skills in the oversight entity, particularly in-depth technical skills, including data processing and auditing capacities, which would allow for the reliable and thorough assessment of algorithmic abuses.

While some authorities consider that the quality of cooperation is good and refer to the Consumer Protection Cooperation (CPC), the European Regulators Group for Audiovisual Media Services (ERGA) and the Body of European Regulators for Electronic Communications (BEREC) as good examples of well-functioning cooperation, other authorities consider that the quality of their cooperation could be significantly improved.

Content creators and right holders are concerned with the fact that, while copyright is largely harmonised across the EU, there is no system in place for national authorities to cooperate on the enforcement of those rights.

Regarding the future governance structure, it is generally argued that EU cooperation is crucial, and respondents have presented different suggestions for hybrid enforcement mechanisms combining elements of centralised and decentralised structures. The majority of respondents, however, appear to favour a unified oversight entity, which would collaborate with national authorities. Some respondents have also pointed out the need to increase cooperation with international entities.

Some representatives from digital rights’ and consumers’ associations, trade associations, platforms and the creative industry consider that oversight or direct enforcement mechanisms might be better left in the hands of authorities operating at national level, but overseen or coordinated by a central authority at EU level. Some national authorities in the media sector in particular consider that online content regulation could be placed under the umbrella of the media regulator, to ensure consistency in the application of regulatory principles and to increase efficiency. While the internal market principle is often mentioned by respondents as a crucial pillar of the liability regime, some national authorities consider that the country of destination should be given greater capacity to intervene to discipline platforms that do not comply with the regulations, especially when a platform is established in one country but directs its content exclusively to other countries. Yet, this issue is controversial among national authorities.

The respondents show clear concerns about the lack of adequate financial and human resources and often refer to the need to cooperate with civil society organisations and academics for specific inquiries and oversight. The stakeholders identify the need for interdisciplinary skills in the competent authority, including economics, law, sociology, media studies, computer science and data analysis. Particular importance is attached to technical skills that would make it possible to read and interpret algorithms’ source code and assess whether abuses occur, such as self-preferencing, divergent treatment of equivalent content, or intended or unintended failures of content recognition systems. Some stakeholders also mention the need for personnel in the competent authority to have some past experience in the private sector, ideally in digital platforms or at least in the digital ecosystem.



2.2.Open Public Consultation on measures to further improve the effectiveness of the fight against illegal content online 11 (30 April – 25 June 2018)

The Commission also consulted on some of these issues over the past few years through several other open public consultations. The most recent was launched on 30 April 2018 and ran for 8 weeks, receiving a total of 8,961 replies, of which 8,749 were submitted by individuals, 172 by organisations, 10 by public administrations, and 30 by other categories of respondents.

Overview of the replies:

Hosting services

Overall, hosting services did not consider that additional regulatory intervention would be conducive to better tackling illegal content online, but supported, to a certain degree, voluntary measures and cooperation.

Associations representing large numbers of HSPs considered that, if legal instruments were to be envisaged at EU level, they should in any case be problem-specific and targeted. They broadly supported further harmonisation of notification information, but expressed substantial concerns as to the feasibility of establishing strict time limits for the takedown of content (from upload), pointing to burdensome processes, especially for SMEs, and to general incentives for over-removal. They also pointed to the need for a cautious approach concerning proactive measures, highlighting the general cooperation and good results achieved through the sector-specific voluntary dialogues.

Contributions from different companies highlighted the differences in available capabilities across businesses, as well as the different incentives and technical possibilities depending on their value propositions. Companies were also generally open to cooperation including with government agencies or law enforcement when it comes to flagging illegal content.

While big companies reported using, besides notice and action systems, proactive measures, including content moderation by staff, automated filters and, in some cases, other automatic tools to flag potentially illegal content to human reviewers, responses also showed that smaller companies are more limited in terms of capability. Amongst the respondents, SMEs seemed to rely generally on notices for flagging all types of illegal content. One SME described difficulties in implementing semi-automated tools – without having access to the tools developed by the big industry players – and the trade-off experienced between increasing performance in removing illegal content and a higher incidence of erroneous removal of their users’ legitimate content.

Competent authorities, including law enforcement authorities, internet referral units, ministries or consumer protection authorities:

The main concerns expressed by those public authorities who responded were about illegal commercial practices (three respondents), child sexual abuse (two respondents) and copyright (two respondents).

Six respondents declared that they identify and refer illegal content to hosting service providers. Illegal content was mainly detected through trusted flaggers or by means of direct reporting by right holders. For instance, one public authority declared that, under national law, in cases of copyright infringement the only party entitled to report the violation is the right holder whose right has been infringed. Automated tools are generally not used by the public authorities responding to the consultation.

Public authorities outlined the increasing difficulty of judging which content is harmful or illegal and which is not. Other public authorities reported difficulty in identifying the sources of illegal content online and, therefore, a lack of evidence for any related judicial action. The turnaround time for removing illegal content is considered a critical point as well.

Some respondents called for clear and precise legislation that would take into account the different actors operating in the EU, whereas others emphasised the importance of strong and effective cross-border cooperation between the national regulatory authorities.

Trusted flaggers, notifiers with a privileged status and organisations representing victims

Amongst the respondents, 26 were mainly concerned with copyright infringements, five with child sexual abuse material and three with illegal commercial practices online.

Concerning the tools used to detect copyright infringements, 22 reported using content monitoring and reporting by their own staff, whereas 18 declared using automated tools. In this respect, more than half of the respondents reported that both public and private investment in research and development would be necessary for the uptake and deployment of automated tools for the removal of illegal content online.

Some respondents warned of the challenges in using automated tools and claimed that any technological system put in place to intercept content must be able to recognise new material quickly and accurately in order to account for changes. To ensure this, human assessment has to be included in the decision-making process.

Furthermore, the not-for-profit organisations considered standardised access and user-friendly interfaces for reporting illegal content to be very effective in enabling HSPs to make diligent decisions. Conversely, the explanation of the reasons and grounds of illegality, and anonymous notices, were reported as being ineffective for some types of illegal content, such as IPR infringements.

Among the respondents, 21 declared that setting time limits for processing referrals and notifications from trusted flaggers is important in supporting cooperation between HSPs and trusted flaggers.

Civil society organisations representing civil rights interests:

Despite having issues with the current framework, especially in terms of the transparency of the processes for removing illegal content or contesting a removal, civil society organisations representing civil rights interests expressed concerns about the impact proactive measures or new legislative measures may have on freedom of expression and information. In this context, they were concerned that non-transparent decisions by platforms about controversial content, taken under their terms of service, may affect the rule of law and ultimately endanger freedom of expression.

Respondents agreed with reinforcing public authorities’ capabilities in fighting illegal content online and were not particularly in favour of attaching privileges to trusted flaggers or imposing an obligation on platforms to report all alleged illegal content to law enforcement bodies. Like respondents in other groups, they were not keen on out-of-court dispute resolution procedures either.

Several civil society organisations (and some respondents from other groups as well) considered that the focus should be placed on finding and prosecuting providers of illegal content rather than on removing it, as removal might have a negative impact on users' rights, whilst also acknowledging that reaching and prosecuting the perpetrators is not always possible.

IP rights holders

Intellectual property rights owners and their associations were represented across different respondent groups in the public consultation. They include publishers, film and music labels, media and sports companies, as well as trademark owners.

In their view, the voluntary approach is rather ineffective and puts companies that do more than the law requires at a competitive disadvantage. Brand owners noted that counterfeiting damages not only industry rights but also consumer safety, as fake products are often produced without complying with safety standards. They criticized the enforcement of the transparency obligations in Directive 2000/31/EC and considered that the “follow the money” approach has been difficult to implement. They called for a system of shared, enhanced responsibilities for intermediaries, supported by a stronger legal framework. Establishing “stay-down” obligations also featured in individual submissions. Companies holding rights in sports events contended that platforms should enable them to take down content in real time.

Other industry associations

This group includes 76 replies from IT companies’ associations, other industry associations and other stakeholders such as the Council of Europe, one political party, civil rights advocates and Intellectual Property (IP) right holders.

Respondents reported low levels of feedback from platforms on notices to take down content. When content was removed, it was mainly done within days. One respondent noted that it is easier to report user generated content such as hate speech comments than false advertisements.

Although the majority of respondents saw a need for some level of EU action, many industry associations advised against complex regulation. In this regard, some of them highlighted that policies tailored to the capabilities of large corporations create barriers to market entry and innovation. Prominent IT companies' associations underlined that the variety of policies and voluntary schemes in place should be given time to prove their results and be properly assessed before legislating. In their view, self-regulation and public-private cooperation should in any event be stepping-stones towards ensuring illegal content online is kept at bay. One respondent, however, favoured legislation to tackle terrorist content.

With the caveat of costs for small businesses, they were supportive of proactive detection tools counterbalanced by safeguards such as transparency and “human-in-the-loop” principles. They also agreed on the need for arrangements to prevent illegal content from spreading, but preferred best practices and the voluntary sharing of databases or software tools as the means of deploying automated tools across HSPs. They were also in favour of standardising notice and action procedures, although one relevant industry association opposed this view.

Research or academic organisations:

Like other groups, respondents considered that different kinds of illegal content needed different frameworks. As regards the notice and action procedure, one respondent noted that outside Europe the take-down mechanism is unclear and sometimes non-existent.

They pointed to the lack of real incentives (despite sporadic media attention) for companies to deal with counter-notices, whereas the non-treatment of notices can more easily lead to legal consequences. They also underlined that existing counter-notice procedures are by and large underused: the few companies that do report on counter-notices list only single- or double-digit numbers per year. 12

They were particularly concerned about the use of automated tools to detect illegal content online, advised caution when incentivising hosting services to apply proactive measures, and underlined the need for human rights safeguards and transparency in the process of detecting and removing alleged illegal content online.

They sided with some other respondents in giving priority to public authorities' notices over those of trusted flaggers, in preferring public investment in research and development, and in favouring publicly supported databases for filtering content, training data or technical tools.

Individuals

Is the internet safe?

-Over 75% of individuals 13 responding considered that the Internet is safe for its users, and 70% reported never having been a victim of any illegal activity online. Where respondents had been victims, for nearly 12% this concerned some form of allegedly illegal commercial practice.

Measures to take down illegal content

-Regarding notice and action procedures: 33% of the individual respondents reported having seen allegedly illegal content and having reported it to the hosting service; over 83% of them found the procedure easy to follow.

-The overwhelming majority of respondents to the open public consultation said it was important to protect free speech online (90% strongly agreed), and nearly 18% thought it important to take further measures against the spread of illegal content online. 70% of the respondents were generally opposed to additional measures.

Over-removal

-30% of the respondents whose content was wrongfully removed (self-reported) had issued a counter-notice.

-64% of the respondents whose content was removed found both the content removal process and the process to dispute a removal lacking in transparency.

Transparency and information

-Nearly half of the individuals who had flagged a piece of content did not receive any information from the service regarding the notice, while one third reported to have been informed about the follow-up given to the notice. For one fifth, the content was taken down within hours.

-One fifth 14 of the respondents who had their content removed from hosting services reported not to have been informed about the grounds for removal at all.

Need for action & options

-30% of respondents considered that the current legal framework for tackling each of the different types of illegal content was effective. Nearly 40% found that actions currently taken by HSPs are effective.

-Nearly half of the respondents considered that hosting services should remove immediately content notified by law enforcement authorities, whereas 25% opposed such fast processing.

-Half of the respondents opposed fast removal for content flagged by organisations with expertise (trusted flaggers), other than law enforcement, but 25% agreed with such fast procedures.

2.3.Open Public Consultation on “Regulatory environment for platforms, online intermediaries, data and cloud computing and the collaborative economy” (September 2015-January 2016)

This consultation received 1,034 replies, although one of them (an advocacy association) included 10,599 individual contributions. 15 Its results, as far as the liability of intermediary service providers is concerned, can be summarized as follows:

·The majority of respondents think that the existing liability regime in the ECD is fit for purpose.

·The majority of respondents demanded either the clarification of existing safe harbours or the introduction of new ones. The most often discussed safe harbour was hosting (Article 14), in particular its concept of “passive hosting”. When asked specifically about this concept, many respondents complained rather about the national interpretations of the concept. Several respondents supported clarification by means of soft-law measures such as recommendations issued by the European Commission.

·71% of respondents consider that different categories of illegal content require different policy approaches as regards notice-and-action procedures, and in particular different requirements as regards the content of the notice.

·61% of online intermediaries state that they have put in place diverse voluntary or proactive measures to remove certain categories of illegal content from their system.

·61% of the respondents are against a stay down obligation.

·77% of the respondents are in favour of increasing transparency regarding the duties of care for online intermediaries, including their general content restriction policies and practices.

·82% of the respondents are in favour of a counter-notice procedure.

·Views were particularly divided over (i) the clarity of the concept of a 'mere technical, automatic and passive nature' of information transmission by information society service providers, (ii) the need to clarify the existing categories of intermediary services (namely mere conduit/caching/hosting) and/or the potential need to establish further categories of intermediary services, (iii) the need to impose a specific duty of care regime for certain categories of illegal content.

2.4.Open Public Consultation on notice-and-action procedures (2012)

The public consultation revealed broad support for EU action (among all categories of respondents). More specifically it revealed strong support for clarification on certain notions of the ECD, for rules to avoid unjustified actions against legal content (in particular consultation of the content-provider and counter-notification by the content provider), for requirements for notices and for feedback to notifiers.

However, respondents appeared to be divided on the final instrument of the initiative.

·48% considered that if an HSP takes proactive measures it should be protected against liability that could result ("Good Samaritan clause").

·53% affirmed that action against illegal content is often ineffective and lacks transparency.

·55% considered that concepts of "hosting", "actual knowledge" and "awareness" are unclear.

·64% considered that HSPs often take action against legal content.

·66% considered that a notice should be provided by electronic means.

·71% considered that HSPs have to consult the content providers first.

·For 72% of the respondents, different categories of illegal content require different policy approaches.

·77% considered that the sender of the notice should be identified.

·80% considered that there should be rules to avoid unjustified or abusive notices.

·83% considered that the notice should describe the alleged illegal nature of the content.

2.5.Open Public Consultation on the E-Commerce Directive 16 (2010)

Full report available at http://ec.europa.eu/internal_market/consultations/docs/2010/e-commerce/summary_report_en.pdf.

3.Other, targeted consultations and engagement activities

3.1.Bilateral meetings and contributions

In the course of preparing this Impact Assessment, the Commission held bilateral meetings with and/or received position papers from the following stakeholders:

1."Challenger" (Dropbox, Spotify, Snap, Cloudflare, Mozilla, Etsy, TransferWise and Stripe)

2.13 organisations including Amnesty, FIDH, EFJ

3.AAFA - TRACIT

4.ACCC Australian Competition and Consumer Commission

5.Access Now

6.ADIGITAL

7.Advisory Council for Consumer Affairs (SVRV - Sachverständigen Rat für Verbraucherfragen)

8.AER – Commercial Radios

9.AFEP - Association française des entreprises privées

10.Ah Top

11.AIG Advertising Information Group

12.AIM

13.AirBnB

14.AK Europa

15.Algorithm Watch

16.Alibaba

17.Allegro

18.Alliance for Safety Online Pharmacy

19.Allied for Startups

20.Amazon

21.AmCham

22.Amway

23.APC - Association for Progressive Communications

24.Apple

25.ARD - ZDF

26.Article 19

27.Article 29

28.Association of Charity Lotteries

29.Association of the internet industry (ECO)

30.Avaaz

31.BASCAP

32.BDKV Bundesverband der Konzert- und Veranstaltungswirtschaft e.V.

33.Beat

34.BEUC

35.Bitkom

36.Bolt

37.Business Europe

38.Cabify

39.Calibermedia

40.CCIA

41.CDiscount

42.Center for Democracy & Technology

44.CENTR

45.Chanel

46.CISPE/Europa Insights

47.Civil Liberties Union for Europe (Liberties)

48.Civil Rights Defenders

49.Clever

50.Cloudflare

51.Confederation of CZ industry

52.Considerati

53.Copyright Stakeholders

54.Counter Extremism Project

55.Cyfrowa Polska

56.Dangerousspeech.org

57.Danish Entrepreneurs

58.Danish Entrepreneurship Association (DINL)

59.DSA 4 Start-Ups

60.Dansk Erhverv

61.Deliveroo

62.Democracy Reporting International

63.Deutsche Startups

64.Developers Alliance

65.Digital Action

66.Digital Europe

67.Digital Rights Ireland

68.Digitale Gesellschaft

69.Direct Sellers Ass

70.DNS Belgium

71.Dropbox

72.EASA - European Advertising Standards Alliance

73.EBU

74.ECCIA - European Cultural and Creative Industries Alliance

75.Ecosia

76.Edima

77.eDreams ODIGEO

78.EDRi

79.EHHA - European Holiday Home Association

80.EGTA

81.Electronic Frontier Foundation (EFF)

82.eMag

83.EMMA-ENPA - European Magazine Media Association/European Newspaper Publishers' Association

84.Epicenter.works

85.ETNO-GSMA

86.Etsy

87.ETUC - European Trade Union Confederation

88.EU Travel Tech

89.GARM – Global Alliance for Responsible Media

90.EURACTIV

91.Eurocities

92.Eurocommerce

93.Eurogroup for animals

94.Federation of veterinarians of Europe

95.EuroISPA

96.European Partnership for Democracy (EPD)

97.European Public Health Alliance (EPHA)

98.European Publishers Council EPC

99.European Tech Alliance (EUTA)

100.Expedia

101.Facebook

102.Fédération française des télécoms

103.FiCom ITAS

104.FID – Forum for Information and Democracy

105.Finnish Commerce Federation

106.FreeNow (former My Taxi)

107.FTI Consulting

108.Gant - Lacoste

109.German Association for the Digital Economy (BVDW)

110.GESAC

111.Google

112.Homo Digitalis

113.Human Rights Monitoring Institute (HRMI)

114.IAB Europe

115.IBM

116.ICANN Internet Corporation for Assigned Names and Numbers

117.IFPI

118.IKEA

119.Incopro

120.INTA trademark association

121.Internet Watch Foundation

122.IOGT-NTO

123.IT & Telekomföretagen

124.ITI – The Information Technology Industry Council

125.IVSZ szövetség a digitalis gazdasagert

126.Julian Jaursch, SNV

127.Justitia

128.Kapten

129.Liberty Global

130.LVMH

131.Match group

132.Meetingselect

133.Motion Picture Association

134.Mozilla Foundation

135.MPA Motion Picture Association

136.Netflix

137.News Media Europe (NME)

138.Nielsen

139.Nike

140.NL Digital

141.OLX

142.Online marketplaces

143.Orange

144.OSEPI-BEUC

145.Panoptykon

146.PGEU online pharmacies

147.Pinterest

148.Polish Confederation Lewiatan

149.PubAffairs

150.QVC/Freshfields

151.Rakuten

152.Renaissance numérique

153.Reporters Sans Frontières

154.RIPE - Regional Internet Registries

155.Schibsted Media group

156.Sky

157.Skyscanner

158.Slack

159.Snap inc

160.Spotify

161.Startup Amsterdam

162.Svensk Handel

163.TechLeapNL

164.Telefónica

165.The Digital New Deal Foundation

166.The Marketplace Coalition

167.The Peace Institute

168.TIE Toy Industries of Europe

169.TripAdvisor

170.Twitch

171.Twitter

172.Uber

173.UK Trust

174.UK Business Consumer Coordination group

175.Verbraucherzentrale Bundesverband

176.Verisign

177.Virke

178.Vivendi Group-Canal+ group

179.VNG NL

180.Vodafone

181.Walmart

182.Welfare in Pet Trade

183.Wikimedia

184.World Federation of Advertisers

185.YouTube

186.Zalando

187.ZN consulting

3.2.Targeted consultations and feedbacks

The Commission has been consulting stakeholders in recent years and has been working towards specific due diligence measures, such as notice-and-action procedures, since 2012. This work informed the adoption of the 2017 Communication on tackling illegal content online 17 and the 2018 Recommendation on measures to tackle illegal content online 18 . A detailed account of past events and consultations 19 (until mid-2018) can be found in the Impact Assessment accompanying the document Proposal for a Regulation on preventing the dissemination of terrorist content online 20 .

3.2.1.Feedback to the targeted survey for the Member States, 3 September 2020

During the summer of 2020, the European Commission asked Member States to share their experiences of the overall functioning of the ECD. Altogether, 21 replies were received from 17 Member States (in one Member State, 5 authorities replied).

Although the survey mainly focused on the Member States' experiences with the provisions covered by Article 3 ECD (the replies to this part are described in detail in Annexes 7 and 14), the Commission also inquired about Member States' experiences with other parts of the Directive, as well as about the challenges and opportunities identified at Member State level.

Regarding Member States' experiences with the information requirements provided by the ECD (Art. 5 – 8), nine respondents reported that service providers are fully or mostly compliant with the duties laid down in Art. 5 and 6, while three respondents explained that particular types of service providers (social media, web stores, marketplaces and third-party sellers) usually or often do not comply with these provisions. One Member State reported that service providers comply only with Art. 5. One Member State noted that new means of B2C communication with consumers (chat windows, direct messaging) might be reflected in the new framework.

In their replies to questions covering the conclusion of contracts by electronic means (Art. 9 - 11), six Member States were of the opinion that these provisions could be generally simplified and modernised, e.g. by codifying electronic signatures, introducing technological neutrality, omitting the exceptions in Art. 9(2) that are no longer relevant, aligning with the relevant consumer acquis or addressing smart contracts in the new framework; one respondent noted, though, that a lack of adaptation at national level in this regard might be an issue. Three Member States reported that the respective provisions still serve their purpose well. One Member State expressed the opinion that harmonisation of international private law regarding these provisions is important to facilitate cross-border sales.

In replies to the questions concerning the liability provisions (Art. 12 - 15), Member States' positions covered several aspects. Two Member States reported that the possibility to issue notifications to third countries should be included in the new regulation. One Member State reported using a point of contact to cooperate with service providers outside its territory to report illegal content; one Member State reported that it does not issue removal requests to service providers established outside its territory. One Member State explained that it uses a dedicated notice-and-action procedure, and two Member States issue injunctions within their territory; one Member State reported having no experience with injunctions. Two Member States further reported that they lack the statutory basis to issue injunctions. Two Member States reported that voluntary cooperation with service providers works in practice; another Member State, by contrast, reported bad experiences with administrative cooperation and subsequent content removal. One Member State called for clarification of the term “active” hosting and of the responsibilities of such service providers, and for the introduction of a duty to reply to any notices received. One Member State was of the opinion that alternative or online dispute resolution mechanisms are not appropriate for illegal content. Two Member States reported that cooperation with dedicated authorities, e.g. CERTs (Computer Emergency Response Teams), to signal illegal content works well. One Member State expressed the opinion that non-compliance with legislation concerning the notice-and-action procedure should be strictly prosecuted.

Regarding Member States' experiences with codes of conduct (Art. 16) relevant for complying with the obligations laid down in the ECD, most of the respondents explained that they have not encouraged or developed such a practice. Three replies reported the existence of dedicated codes for the areas covered by Art. 5 – 8 to help deal with particular issues related to consumer protection online. Two Member States issued guidelines to help companies implement the Directive, although one of them discontinued the scheme three years after its introduction. One respondent explained that a dedicated cooperation was introduced in the past to tackle particular types of content online.

As far as out-of-court dispute settlement is concerned (Art. 17 and the 2018 Recommendation), the majority of Member States replied that no dedicated procedures have been introduced or used, although five Member States clarified that ADR and ODR mechanisms work well for consumer protection issues and complement each other appropriately. Two Member States noted in this regard that the implementation of the AVMS Directive might introduce such a scheme for particular types of service providers. One Member State is preparing an update of sectoral legislation that might improve out-of-court dispute settlement via a dedicated procedure.

Three Member States reported on their experience with contact points (Art. 19(4)), explaining that these provide information and advice to both consumers and service providers; one of them added that the contact point also cooperates with LEAs, NRAs, NGOs and consumer centres. Two respondents explained that most of the queries raised with the contact point concern consumer-related issues; one Member State noted that very few queries are posed to the contact point. One Member State replied that manifestly illegal content is handled via CERTs and that the procedure has proven efficient in taking down content. One Member State also runs a dedicated website for notifying illegal content.

Member States did not share information about significant administrative or judicial decisions taken in their territory (Art. 19(5)) in the past five years; only one Member State provided information on relevant court decisions.

Concerning future challenges and opportunities, seven Member States considered the applicability of the new regulation to third-country providers important, while two Member States argued that the EU regulation should remain friendly to third countries. Five Member States saw the lack of digital skills as one of the main challenges. Four Member States felt that internal market principles are endangered, partly due to the fragmentation of rules and the lack of cooperation among Member States; one Member State suggested in this regard that a European body might remedy insufficient cooperation among Member States. Compatibility with other relevant EU laws and the importance of clear rules for companies, consumers and authorities were underlined as well. Three Member States regarded favourable conditions, including sufficient financial resources for SMEs, as a crucial element for the continuous development of the digital single market. One Member State identified enforcement, and another access to public data, as among the biggest challenges.

3.2.2.Feedback to the Inception Impact Assessment on the ‘Digital Services Act - deepening the internal market and clarifying responsibilities for digital services’ 21 , 2 June 2020

A total of 110 contributions were submitted. The replies broadly represent the whole stakeholder spectrum: online intermediaries, associations of businesses and trade organisations, telecom operators, start-ups, civil society, citizens and users of digital services, national authorities and academia. This report identifies the common comments on the most frequently mentioned topics.

In general, it can be concluded that stakeholders are aligned on the general thrust of the IIA. The focus of the different stakeholder groups diverges, however, with each group placing more emphasis on different topics. Generally, however, all stakeholders agree that a horizontal harmonisation for digital services in the EU is necessary and welcome.

As to the scope of the future legislative proposal, most online intermediaries, telecommunication operators, retail and media/audio-visual business associations and civil society organisations are in favour of including the services of third countries into the scope of the upcoming rules. Most start-ups, citizens, academia and national authorities did not say anything specific about the probable future scope of the DSA.

In general, online intermediaries, telecommunication operators, start-ups and national authorities are strong supporters of the Internal Market principle, and none of the other stakeholder groups are against it either. Consumer organisations advocate preserving the consumer contracts derogation of the ECD, and national authorities call for additional areas of deviation to be assessed.

Most stakeholders shared views on liability. Online intermediaries and telecommunication operators in particular are overwhelmingly supportive of maintaining the current liability exemption as defined in the ECD. Most associations of businesses and trade organizations are also in favour of maintaining the regime of the ECD. Consumer organizations strongly call for a special liability regime for online marketplaces, making them directly or jointly liable where they exercise a predominant influence over third parties, fail to properly inform consumers, or fail to remove illegal goods or misleading information. Most stakeholders also ask for clarification of the liability rules and are in favour of duty of care obligations and responsibilities for online intermediaries. Online intermediaries themselves maintain that every party in the online ecosystem should hold some portion of responsibility. Views diverge widely among all stakeholders on whether a Good Samaritan clause should be introduced, and on the current distinction between ‘active’ and ‘passive’ digital services.

Online intermediaries, associations of businesses and trade organisations, civil society organisations and some academics in particular emphasize the need for harmonised notice and action procedures across the EU. Most of them also call for the establishment of minimum information requirements for notices and touch upon the ‘knowledge requirement’ of the ECD, often asking for it to be clarified.

Online intermediaries, telecommunication operators, start-ups and civil society organisations are strong supporters of maintaining the general monitoring prohibition. Whereas civil society and online intermediaries largely consider the protection of fundamental rights the main argument for keeping the prohibition, start-ups emphasized that freedom of speech is not their only reason, as general monitoring would simply be impossible for most start-ups to carry out. Apart from one civil society organisation, none of the other stakeholders indicated that the general monitoring prohibition should be dropped in the DSA.

Most stakeholders did not share particular views on the distinction between or definitions of illegal and harmful content. The majority of the online intermediaries however believe that the DSA should solely focus on illegal content and that harmful content should not be included in the DSA. In all other stakeholder groups, some respondents shared a similar view, and some respondents did not say anything about it. No contributions have strongly called for harmful content to be defined in the DSA.

In general, most stakeholders recognize the problem of the lack of transparency and call for increased transparency. Civil society organisations in particular identified the lack of transparency as one of the major problems and call for more transparency obligations. Most stakeholders that touched upon the issue of transparency are in favour of reporting obligations. Online intermediaries furthermore pointed to the possible risks of far-reaching obligations, such as infringements of trade secrets or intellectual property rights. Some of the telecom operators, associations of businesses, civil society organisations and academics particularly highlighted the need for more transparency in automated and algorithmic systems.

Associations of businesses and trade organisations are firm supporters of introducing KYC obligations on online intermediaries. Online intermediaries themselves also agree that such requirements could be useful, but emphasize that they should be proportionate, privacy-friendly and supported by the right infrastructure in order to be scalable. Most contributions from other stakeholder groups did not specify whether they are in favour of or against introducing KYC obligations in the DSA.

Online intermediaries, civil society organisations, academics and national authorities in particular emphasized the need for fundamental rights safeguards and shared their views on this. The freedom to receive information, the freedom of expression and the freedom to conduct a business are most often mentioned by online intermediaries, whereas civil society organisations place most importance on the preservation of the freedom of expression, the freedom to receive information, the right to a fair trial and the right to an adequate remedy. Civil society organizations and academics are particularly worried that automated tools might not guarantee the protection of fundamental rights, owing to illegitimate takedowns.

The vast majority of stakeholders did not share specific views on online advertising, and the ones that did so touched upon divergent issues about it.

Most stakeholders emphasize the need for strong enforcement and regulatory oversight to hold platforms to their promises. Online intermediaries highlight that any regulatory oversight mechanism should be proportionate, increase legal certainty and be based on the Internal Market principle. National authorities, telecommunication operators and civil society organisations specifically indicate that better cooperation between Member States is essential. Few stakeholders clarified how this regulatory oversight should work in practice or touched upon the question of whether a new EU body/agency should be established, but some of them think this would be a good solution.

3.3.Workshops, events and Expert Group meetings

3.3.1.Workshop on online violence against women, 8 September 2020

In September 2020, DG JUST, in cooperation with DG CNECT, organised an online workshop with a panel of six academics as well as representatives from the Commission to discuss the issue of violence against women in the online environment. The academics agreed that the Digital Services Act could be an opportunity to overcome the existing fragmentation and to agree on more common definitions and standards. The view resonated among the academics that parts of the Digital Services Act package should be perceived as complementary, tackling the issue together with supplementing sectoral initiatives. As the problem is structural, the solution should be based on a comprehensive market approach, so that users can switch to another platform that provides different moderation, should they so wish. Some academics further concluded that an amplification element is important to distinguish harmful content from illegality, and that the horizontal solutions included in the Digital Services Act should cover all users in vulnerable situations, including women users, users with minority backgrounds and children. They also argued that no firm choice should be made between the self- and co-regulatory approach on the one hand and “hard” regulation on the other. At the same time, they acknowledged that there are clear positives and negatives to the self- and co-regulatory approach, and that its success depends a lot on the Member States' as well as on the platforms' approach. In this regard, there was agreement that the new regulation might create scope for existing authorities to develop their role concerning privacy and different forms of online violence. The academics also concluded that there is a need to adapt obligations according to the layers of the internet, as well as to ensure redress and support for individuals when considering illegal acts under the existing rules.

3.3.2.Seminar on the Future of Liability of Platforms under the EU's Digital Services Act: An Academic Perspective, 31 July 2020

On 31 July, Martin Husovec, Assistant Professor at the London School of Economics, organised a small-scale workshop on the future of liability of digital platforms under the EU's upcoming Digital Services Act. The virtual event connected a small group of leading academics researching intermediary liability issues and European Commission officials at DG CNECT working on the Digital Services Act file. The academic participants included Christina Angelopoulos (University of Cambridge), Joris van Hoboken (University of Amsterdam) and Aleksandra Kuczerawy (KU Leuven). The event's goal was to share the latest academic research with the EU officials and discuss potential solutions for the reform of the ECD, including drafting suggestions for the provisions related to intermediary liability and notice-and-action mechanisms.

3.3.3.Workshops on online marketplaces, 8, 10, 13 and 17 July 2020

The workshops were co-organised by DG CNECT and DG JUST, as part of a broader engagement with stakeholders and evidence collection strategy for the Digital Services Act package as well as the revision of the General Product Safety Directive.

The objective of the workshops was to gather up-to-date information on the state of play concerning the main challenges in addressing the sale of illegal goods online. They focused in particular on measures and good practices from marketplaces and on cooperation with authorities and responsible third parties. Panellists and participants – who included online marketplaces, retail associations, consumer organisations, national market surveillance authorities as well as representatives from the European Commission – were invited to share their experiences and engage in a discussion on potential new policy and regulatory measures.

The event consisted of four separate online sessions:

Session 1: Sellers and products identification mechanisms, 8 July 2020 - The first session focused on the information online marketplaces currently gather on their sellers. Online marketplaces started with a short overview of their practices in identifying business sellers and product listings on their platforms. Most of the participating online marketplaces specified that business sellers are required to submit background information (e.g. company name, VAT number, address) before being admitted to sell. Some participating market surveillance authorities stated that while seller identification is key, the essential point for ensuring proper control is the traceability and identification of the dangerous product itself.

Overall, all participants agreed on the importance of transparency as regards business traders. Some participants highlighted that more should be done in this context, especially when it comes to sellers established outside the EU and therefore not always covered by EU rules. Some stakeholders considered that more cooperation with authorities in Member States could also help identify rogue sellers.

Session 2: How to tackle dangerous goods and product safety issues online: notice and action procedures and the role of the Safety Gate/RAPEX, 10 July 2020 - The first part of this session concerned best practices on notice and action procedures to tackle dangerous goods, including notices from authorities, consumer associations, consumers and other actors. Generally, all participants agreed that a harmonised notice and action procedure would facilitate the fight against dangerous products online. Some participants highlighted that notices are often not accurate enough, leaving online marketplaces with difficulties in identifying the notified dangerous products. In this regard, many participants called for minimum information requirements for notices. Online marketplaces also stated that filters are not entirely reliable and that such tools should always be accompanied by human review and notice and action mechanisms.

The second part of the session concerned the Safety Gate/RAPEX. In this regard, a number of investigations carried out by consumer organisations, retail associations and market surveillance authorities were also presented, with results on the number of dangerous products available online raising clear concerns. Marketplaces are taking some action, such as periodically checking the Safety Gate/RAPEX (as they have committed to do under the Product Safety Pledge). Some participants pointed out that the information in the Safety Gate shows only part of the issue and that more needs to be done in this regard. Some remedies were proposed by national authorities, such as establishing an obligation to cooperate with market surveillance and customs authorities. Some participants also suggested creating an API interface to the Safety Gate/RAPEX, which would then be linked to online marketplaces and allow them and consumers to have real-time information on product safety.

Session 3: What other measures and challenges for keeping consumers safe from dangerous goods online? (13 July 2020) - The session focused on other preventive measures that marketplaces can take to ensure that no dangerous product is placed on the market. Three main aspects were mentioned by participants. First, the importance of data, which in many cases is not provided by the seller, making enforcement very difficult. Second, online sales and product safety are global issues, so international cooperation is key to addressing these challenges. Third, many participants mentioned the issue of traceability, and how it needs to be enhanced so that dangerous products sold online can be correctly identified and corrective measures can be enforced by both platforms and authorities. The challenge of the reappearance of dangerous products already removed was also addressed, although no specific measures or solutions were mentioned by participants.

Session 4: Consumer law and online marketplaces, 17 July 2020 - The main focus of this session was to address content that is illegal because it constitutes a violation of applicable EU consumer law.

The session started with a short presentation held by DG JUST on the relevance of EU consumer law for a) online marketplaces regarding their own activities and content; b) the business users of online marketplaces; and c) online marketplaces in their capacity as hosts of their business users.

The discussion then zoomed in on third-party content and the measures that online marketplaces are taking to prevent activities that violate applicable EU consumer law. Online marketplaces specified that their objective is to create trust on the platform, both for consumers and for sellers. They further stated that sellers are in charge of their own compliance, but that the marketplaces are responsible for giving them the means to comply with EU law.

Some participants flagged that the main problem with EU consumer law is the lack of resources and enforcement.

Cooperation was also mentioned by many participants as being the key to ensure a coherent enforcement of EU consumer law. According to many participants, all the actors in the supply chain should work together to raise awareness around consumer rules.

3.3.4.Workshop on Recommender Systems and Online Ranking, 9 July 2020

In July 2020, DG CNECT and the Laboratoire national de métrologie et d'essais (LNE) organised a workshop with six experts on recommendation systems.

Participants recognised that recommender systems have an important impact on the online dissemination of information and can bring serious societal harms. This is exacerbated by the fact that moderation processes are often opaque and there is a clear lack of governance, while recommender systems are a core part of platforms’ business model and value proposition.

Participants emphasised the importance of regular oversight to account for the rapid evolution of these systems and the risks they bring. According to some experts, one of the biggest challenges in observing recommender systems is access to data. In particular, access to the relevant output data needed to observe the effects and prioritisations made by the systems differs from one system to the other: platforms sometimes make interfaces available for download, and are sometimes completely opaque. Web scraping is relatively costly and is not reliable for observing evolution over time. It can also explicitly violate platforms’ terms of service.

The experts also discussed the methodological challenges and emerging research in tools to test and compare outcomes of recommender systems. Experts also pointed to the costs in conducting research in these areas, as well as the pressing need for further insights and protections to users who would need to be meaningfully informed and empowered, but should not be solely responsible for protecting themselves. Experts emphasised the need for further independent oversight.

3.3.5.E-Commerce Experts Group meeting, 26 May 2020

During the meeting of the Expert Group on 26 May 2020 22 , the preparation of the Digital Services Act package was presented in detail and discussed with the Member States. During the discussion, Member States underlined that the new rules should be particularly friendly towards small and medium-sized enterprises, and stressed that some aspects should be regulated and harmonised especially carefully. In particular, Member States stressed that the cornerstones of the ECD – the country of origin principle, the liability exemptions and the prohibition of general monitoring obligations – should be kept, while expressing willingness to modernise them.

3.3.6.Workshop on the liability of Domain Name Systems service providers under the ECD, 14 February 2020

The workshop was co-organised by Units E3 and F2 of DG CNECT as part of a broader engagement with stakeholders and evidence collection strategy for the Digital Services Act package.

The objective of the workshop was to discuss, within a panel of academics, the EU legal framework for Domain Name registries and registrars, and whether further precision is needed in this regard to ensure legal certainty and a fair balance between safety objectives and the protection of fundamental rights. While the panellists did not reach consensus on all aspects discussed during the workshop, they agreed that a clarification of the role of the DNS going forward appeared beneficial. The most prominent aspects concerning the ECD relate to other intermediaries, namely hosting providers or online platforms. Compared to these, the DNS, i.e. the logical layer, plays a less prominent role. In the future, too, the DNS is likely to play a subordinate role in relation to content compared to hosting providers or platforms. Yet, in some instances the clear borders between intermediaries on the infrastructure layer and intermediaries on the application layer are becoming blurrier. Several experts noted the increasing interest in the DNS in content debates.

In conclusion, the panellists suggested several areas of interest in relation to the DNS:

·The clear inclusion of DNS actors amongst service providers enjoying liability exemptions;

·Cascade of intermediary functions: The distinction from other intermediaries based on the nature of the DNS and how different roles impact the balance of interests and fundamental rights concerned;

·Distinction among DNS actors in relation to DNS functions: Whether distinctions should be made between domain name registries, registrars, resellers, actors involved in handing out IP addresses and other service providers;

·Transparency and procedure of domain-name related actions: Consider the framework for actions in relation to transparency and procedure e.g. in the context of Recommendation (EU) 2018/334 and its applicability to voluntary arrangements.

3.3.7.E-Commerce Experts Group meeting, 8 October 2019

During the E-Commerce Experts Group meeting on 8 October 2019 23 , the main principles of the ECD were discussed with Member States, as well as the latest developments at national level. On the ECD principles, some Member States agreed that one of the main difficulties lies in devising a common effective law against harmful or hateful content. Fundamental rights in relation to tackling harmful content online were discussed as well. Managing fragmented rules is often only possible for large platforms; mutual recognition was suggested by some as a possible solution. On liability, the Member States discussed how the exemption from liability fits with the changes in the online environment that have taken place over the last 20 years, as well as with its enforcement. The possibility of introducing a Good Samaritan clause as a legal provision was also discussed. A group of Member States was of the opinion that some services currently covered by the liability regime should not continue to be covered in the future. This could include the provision of services that can no longer be claimed to be provided passively. As a result, large platforms should be subject to stricter rules. During the discussion of the Commission’s Communication and Recommendation on tackling illegal content online, Member States expressed the need to preserve freedom of expression. Some Member States noted a perceived convergence of measures tackling illegal content online and harmful content online and raised concerns that, as illegality is not harmonised, this can cause jurisdiction problems. Member States also reported that increased fragmentation, at both national and EU level, makes it difficult for online service providers, particularly SMEs, to comply with legislation. They also underlined that self- and co-regulatory initiatives should also be considered for particular types of actions. 
On the cooperation mechanisms set up by the ECD, Member States confirmed the usefulness of cooperation, but also highlighted issues requiring more attention. Member States reported different experiences, with some using the IMI mechanisms relatively often and some not at all. Some Member States emphasised the importance of contact points, with some suggesting harmonisation via a single contact point for illegal content, and requested that the cooperation procedure be easy to use. On the sustainability of the framework for SMEs, the Member States stressed that the basic internal market principle and the liability exemptions are crucial for companies to grow. They also explained that SMEs encounter obstacles when they want to expand their business, arising very often from differing national rules and the lack of (full) harmonisation.

3.3.8.Semi-structured interviews with judges across the EU, July 2017 24

In total, five judges were interviewed in June and July 2017, including representatives from the United Kingdom, Germany, Italy, the Court of Justice of the European Union and the European Court of Human Rights. General views collected through these interviews include:

·Levels of experience in cases involving intermediary services differ among Member States. This affects understanding and consistency in applying the liability regime.

·The liability regime is still useful but will require more flexibility as technology evolves. Any replacement of this regime would require a careful balancing of interests.

·Different categories of illegal content should be treated differently.

·Need to decide case-by-case whether an intermediary plays an active or a passive role.

·More clarity and standardisation of minimum requirements of notices would be useful.

·Setting one fixed deadline to take action on notices would not be appropriate.

·Lack of clarity of recital 42 of Directive 2000/31/EC: uncertainty as to whether the use of algorithmic or automated processes to detect illegal content renders service providers active.

·The use of automated processes is pushing in the direction of a general monitoring obligation. The ban on such an obligation is still useful, although for several judges it might become less so in the future.

·Relying on intermediaries to police the Internet is risky. If the Commission wishes to encourage this, it should provide clear guidelines on what content is considered illegal.

·Judges considered that, in principle, judicial oversight was more appropriate with regard to the rule of law than private standards.

·There were calls for new legal processes (such as Internet courts) to allow judges to deal with potentially illegal content online quickly.

Annex 3: Who is affected and how?

1.Practical implications of the initiative

Main positive impacts and the affected stakeholders

The initiative would have a positive effect on the functioning of the single market. In particular, it would support access to the single market for European platform service providers and their ability to scale-up by reducing costs related to the legal fragmentation. Moreover, it would improve legal clarity and predictability regarding the liability of online intermediaries, among others. It would also increase transparency about content moderation, recommender and advertising systems, and the business users of online platforms to the benefit of consumers, regulators, researchers and civil society. The new EU level governance structure would improve trust and cooperation between Member States, facilitate effective enforcement across borders, and reinforce the internal market principle of the E-Commerce Directive.

With regards to competition, the harmonised legal requirements would establish a level playing field across the single market, while the limitation of asymmetric obligations to very large online platforms with a systemic impact in Europe would make sure that smaller, emerging competitors are not captured by disproportionate measures. The initiative is proportionate and would not impose dissuasive requirements for service providers.

With the additional legal certainty, the initiative is expected to have a positive impact on competitiveness, innovation and investment in digital services. The harmonised measures would cut the costs of the evolving legal fragmentation, and the extended scope would create a true regulatory level playing field between European companies and those targeting the single market without being established in the EU. The intervention would preserve the equilibrium set through the conditional liability exemptions for online intermediaries, ensuring that online platforms are not disproportionately incentivised to adopt a risk-averse strategy imposing overly restrictive measures on their users, while still being able to take voluntary measures against illegal activities. The initiative would also have a positive effect on the competitiveness of legitimate business users of online platforms, manufacturers and brand owners, by reducing the availability of illegal offerings such as illegal products or services. Additional profits are expected to largely outweigh the costs of the notice and action mechanism. More transparency would build further resilience into the system, giving more choice and agency to users and stimulating an innovative and competitive environment online.

The initiative is expected to diminish illegal trade into the Union without having an adverse effect on legitimate platforms targeting the single market from third countries.

The initiative would greatly increase online safety for consumers by adding more harmonisation to the tackling of all types of illegal content, goods and services across the Union. It would accelerate cooperation with law enforcement, national authorities and trusted flaggers under EU level supervision, and it would stimulate online platforms to take additional measures, proportionate to their capability, adapted to the issues and illegal content they most likely host, and in full respect of fundamental rights. The reinforced EU level supervision and cooperation would be able to monitor the performance of the notice and action and broader moderation, as well as recommender and advertising systems to protect legitimate users and avoid over-removal of legal content.

The intervention would also tackle systemic risks posed by online platforms particularly through transparency obligations and asymmetric measures imposed on very large platforms. It would correct information asymmetries and empower citizens, consumers in particular, businesses and other organisations to have more agency in the way they interact with the digital environment. Accountability mechanisms would ensure that researchers and competent authorities could assess the appropriateness of measures taken by platforms in co-regulatory processes.

Costs for businesses, SMEs, public authorities and the EU

The costs incurred by online intermediaries would represent a significant reduction compared to those incurred under the present and evolving fragmented and uncertain corpus of rules. At company level, the legal intervention could lead to a cost reduction of around EUR 400.000 per annum for a medium-sized enterprise, and of up to EUR 4-11 million per annum for a larger company. Direct costs for the main due diligence obligations depend to a large extent on the number of notices and counter-notices received by a platform and on the cases escalated to an out-of-court alternative dispute resolution system. The existence of alternative dispute resolution mechanisms is likely to add negligible costs compared to the current system. The additional design, maintenance and reporting costs for the information and transparency obligations are expected to be marginal and absorbed into the general operations and design costs of online platforms and ad intermediaries, respectively. Costs related to information requirements would equally be reduced rather than increased, compared to the baseline, due to streamlining and harmonisation. The only potentially significant increase in costs would result from the enhanced due diligence obligations, which are limited to very large online platforms with a systemic role and a competitive advantage fuelled by network effects. These costs would vary depending on the design of the systems but are expected to be absorbed in the services’ operations in any event.

For SMEs, the costs of the legal fragmentation appear completely prohibitive today. The initiative would make it much more feasible for SMEs to enter the single market and scale up. Moreover, the introduction of standard minimum requirements for notices, procedures and conditions, as well as reporting templates, should further decrease the expected costs for small companies.

For public authorities, additional measures to mutualise resources and expertise and to establish sound IT infrastructures for cooperation can have a net positive effect in assisting all Member States in the medium to long term. Compared to the baseline, the initiative should significantly cut the costs brought about by the inefficiencies and duplication in the existing set-up for the cooperation of public authorities. Net cost reductions, however, are not expected, due to the volume of illegal activities online. Member States where a large number of services are established are likely to need some reinforcement of capabilities, but this will be attenuated through the creation and use of the Digital Clearing House. National Digital Coordinators would incur some costs, but the efficiency gains from the mutualisation of resources, better information flows and straightforward processes are expected to outweigh them in every Member State. The additional cost of the EU level oversight, including the EU Board and Secretariat, would be borne at EU level, creating further efficiency gains in the cooperation across Member States.

2.Summary of costs and benefits

I. Overview of Benefits (total for all provisions) – Preferred Option

Direct benefits

·Reduced costs related to legal fragmentation (i.e. compliance costs) – cost reduction of around EUR 400.000 per annum for a medium enterprise (up to EUR 4-11 million for a company present in more than 10 Member States). Main recipients: all intermediary services, especially small and medium sized hosting services and small and medium sized online platforms.

·Improved legal clarity and predictability. Main recipients: all intermediary services.

·Increased transparency regarding content moderation, recommender and advertising systems – cutting the costs of uncertainty over which reporting system to use; agency based on information, enabling real choices rather than dependence on platforms’ design features. Main recipients: citizens, businesses, regulators, researchers, civil society.

·Stronger and more efficient cooperation between Member States – general cost reduction by streamlining the cooperation mechanisms, cutting inefficiencies and obtaining results. Main recipients: Member States and national authorities, with better results overall for citizens, services and other businesses.

·Increased transparency of potential business wrongdoers (Know Your Business Customer) – dissuasive for the majority of sellers of illicit products. Main recipients: legitimate businesses, national authorities, consumers.

·Reduced information asymmetries and increased accountability – user empowerment to make informed choices. Main recipients: users, including citizens, businesses and society at large.

·Fundamental rights and protection of legitimate users and content. Main recipients: all citizens and businesses, in particular journalists and other content providers.

Indirect benefits

·Increase of cross-border digital trade and a more competitive and innovative environment – 1 to 1.8% (estimated to be the equivalent of an increase in turnover generated cross-border of EUR 8.6 billion and up to EUR 15.5 billion). Main recipients: all digital services and businesses.

·Diminished illegal trade into the Union, increased online safety and reduced systemic risks posed by large online platforms. Main recipients: citizens, businesses, smaller digital services and society at large.

II. Overview of costs – Preferred option

(The original table breaks costs down by Citizens/Consumers, Businesses and Administrations, each into one-off and recurrent costs; indirect costs are not quantified for any item.)

Notice and action

·Direct costs – Citizens/Consumers: minimal time spent on sending a notice; this should not be a significant cost, but rather an overwhelmingly important reduction of costs compared to the current unclear and deeply fragmented system.

·Direct costs – Businesses: 1500 – 50.000 EUR (one-off); recurrent costs depend on the volume of notices and are expected to decrease overall (estimated range: EUR 0 to 16 million).

Complaint and redress mechanism

·Direct costs – Businesses: costs of technical design (one-off, minimal); costs of maintenance (recurrent, absorbed in the costs for notice and action estimated above).

Alternative dispute resolution

·Direct costs: depending on the dispute for Citizens/Consumers and Businesses; negligible for Administrations.

Know Your Business Customer

·Direct costs – Businesses: costs of design (one-off); marginal costs per business customer (recurrent).

Transparency obligations

·Direct costs – Businesses: marginal technical design costs for development and data collection, absorbed in the development of technical systems (one-off); 0.1 and up to 2 FTEs (recurrent).

Legal representative

·Direct costs – Businesses (recurrent): estimated between EUR 50.000 and EUR 550.000 per annum, depending on the FTEs necessary to complete the tasks. These costs can be partially or fully absorbed, for most companies, into existing requirements for legal representatives.

Risk management obligations

·Direct costs – Businesses (recurrent): risk assessments estimated between EUR 40.000 and EUR 86.000 per annum; audits between EUR 55.000 and EUR 545.000 per annum. Risk mitigation measures are variable costs and can range from virtually no costs to significant amounts, in particular when the platforms’ systems are themselves causing or exacerbating severe negative impacts. The duration and level of expenditure for such measures will also vary over time. Similarly, participation in codes of conduct and crisis protocols requires attendance of regular meetings, as a direct cost, but the streamlined targeted measures can vary.

Ad archives

·Direct costs – Businesses: up to EUR 220.000 (one-off) for building APIs to give access to data and quality controls for data completeness, accuracy and integrity, and for system security and availability; marginal maintenance costs (recurrent).

Compliance officer

·Direct costs – Businesses (recurrent): between 1 and 5 FTEs for very large platforms.

Digital Clearing House

·Direct costs – Administrations: EUR 2 million per annum over the first two years for technical development; maintenance and additional development over the next 3 years of approx. EUR 500.000 in total.

EU Board and Secretariat

·Direct costs – Administrations (recurrent): 0.5 – 1 FTE per Member State for participation in the Board; European Commission: 50 FTEs plus a EUR 25 million operational budget.

Supervision and enforcement (Digital Services Coordinator, national level)

·Direct costs – Administrations: for core due diligence obligations on intermediaries, varying from 0.5 up to 25 FTEs, depending on the scale of services hosted 25 ; for supervision of very large platforms, costs are expected to fluctuate depending on the inspections launched; for one inspection/audit, estimates range between EUR 50.000 and EUR 300.000.

Annex 4: Analytical methods

1.Cost of non-Europe: legal fragmentation and cross-border provision of services

The identification of the costs of non-Europe related to legal fragmentation focused in particular on the different approaches of the national rules transposing the E-Commerce Directive (ECD) that govern how services, and in particular intermediaries and platforms, must deal with illegal content pursuant to Article 14 ECD.

An estimation of the costs, carried out by the JRC, draws on the cross-border trade barriers that differences in the applicable laws of different Member States may create. To estimate those barriers, an indicator of the legal distance (i.e. differences) in transposing/applying Article 14 across different pairs of Member States has been constructed, and its correlation with cross-border traffic, as a proxy for cross-border trade, has been verified on the basis of a general trade model. The models and the methodologies applied are described in detail below.

Legal distance

“Legal distance” is a concept that represents differences in laws and regulations across countries. The JRC identified an indicator that quantifies the legal distance between EU MS with regard to the transposition and subsequent implementation of the intermediary liability exemption for hosting services introduced in Article 14 ECD. The process of constructing the indicator had two distinct parts.

First, JRC performed a legal analysis of the ECD and reviewed relevant literature relating to the issue of liability of Intermediary Service Providers (ISPs). This was followed by an analysis of previous studies dealing with the issue of legal fragmentation stemming from the transposition and implementation of the ECD.

Second, the JRC quantified the legal distance between EU MS with respect to the transposition of Article 14 ECD. The indicator builds on the updated results of the Report produced for the European Commission in 2018. 26 In constructing the indicator, the JRC considered the burden of adaptation that ISPs have to face in order to comply with the legal rules that transpose the ECD into national systems. The final values of the indicator convey information on how the different MS transposed Article 14 of the ECD into their national legislation. The “legal distance” between two countries is simply the absolute difference of the values of this indicator, and shows how “close” or “far apart” the legislation of two MS is. The indicator includes the following components:

·Obtaining knowledge – this indicator’s component reflects the coerciveness of a particular way of “obtaining knowledge”. The most coercive option is considered the most costly and is ascribed the highest value. The component ascribes the following values: 1-various ways of obtaining knowledge or no specification, 2-minimum requirements notice, 3-court/authority order or manifestly illegal content (most coercive);

·Existence of a specific and platform-managed N&A procedure – this indicator’s component reflects the cost of adaptation driven by the laws introducing N&A procedures. The component ascribes the following values: 0-no procedure, 1-horizontal procedure laid down in law, including co-regulation, 2-sectorial procedures only (regardless of their legal status). The lowest value indicates that there is no legal requirement to adapt. Horizontal procedures are less costly than sectorial ones, since they introduce uniform compliance mechanisms;

·Specification of information to be provided in a notice – this indicator’s component ascribes two values: 0-not specified, 1-minimum requirements. The more a national legislation regulates the level of information required, the more the platform needs to adapt to each system; therefore, the minimum requirements are more costly for a service provider;

·Timing of the removal – this indicator’s component ascribes the following values: 0-no specification of timing, 1-timing specified > 24h, 2-timing specified < 24h. The shorter the timing, the more an ISP needs to adapt, which incurs costs;

·Existence of the counter-notice procedure – this indicator’s component ascribes two values: 0-No, 1-Yes. The existence of the counter-notice procedure incurs costs for an ISP;

·Abusive notice remedies – the indicator’s component ascribes two values: 0-there are remedies, 1- no remedies. The value “1” is ascribed when there are no remedies, to reflect the burden relating to the increased number of notices;

·Reporting obligation – this indicator’s component ascribes the following values: 0-no reporting obligations, 1-Yes, there are reporting obligations;

·Internal appeal system – this indicator’s component ascribes two values depending on the existence of the obligation of appeal system internal to an ISP: 0-No, 1-Yes.

·Extraterritorial application of the rules on N&A – this indicator summarises whether a Member State requires its rules to be applied also to ISP established in other Member States (including through a legal representative): 0-No, 1-Yes 27 .

All the components of the indicator are valued for each EU MS. The differences in total values of the indicator between MS illustrate a legal distance between the national regimes with respect to the transposition of Article 14 of the ECD.
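The construction described above can be sketched as follows; the component scores used here are purely hypothetical examples, not the values the JRC assigned to actual Member States:

```python
# Hypothetical component scores per Member State (illustrative only).
scores = {
    "MS_A": {"obtaining_knowledge": 1, "na_procedure": 0, "notice_info": 0,
             "removal_timing": 0, "counter_notice": 0, "abusive_notice_remedies": 1,
             "reporting": 0, "internal_appeal": 0, "extraterritorial": 0},
    "MS_B": {"obtaining_knowledge": 3, "na_procedure": 1, "notice_info": 1,
             "removal_timing": 2, "counter_notice": 1, "abusive_notice_remedies": 0,
             "reporting": 1, "internal_appeal": 1, "extraterritorial": 1},
    "MS_C": {"obtaining_knowledge": 2, "na_procedure": 2, "notice_info": 1,
             "removal_timing": 1, "counter_notice": 1, "abusive_notice_remedies": 0,
             "reporting": 0, "internal_appeal": 0, "extraterritorial": 0},
}

def total_score(ms):
    """Sum of all indicator components for one Member State."""
    return sum(scores[ms].values())

def legal_distance(ms1, ms2):
    """Legal distance: absolute difference of the total indicator values."""
    return abs(total_score(ms1) - total_score(ms2))

print(legal_distance("MS_A", "MS_B"))  # 9
```

The higher the distance, the more an ISP established in one of the two Member States must adapt to serve users in the other.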

Description of traffic data and methodology: the gravity equation and trade costs

In order to study trade costs and the barriers to market integration empirically, the standard procedure in economics is to employ the gravity model of trade. This model captures the interactions between country pairs. In this case, the variable of interest is internet traffic, i.e., the set of cross-border visits to websites located in the EU MS originating from visitors located in a different MS, in 24 different categories of activities.

With regard to the traffic data, the top 100 websites in each of the 24 categories of digital activities have been identified for the 20 EU MS 28 for which Similarweb collects data. First, through a DNS 29 lookup, we identified the country to which the different domains correspond. Second, we downloaded the geographic breakdown of the traffic directed to these domains for three different moments in time: April 2018, April 2019 and April 2020. Third, we restricted the analysis to domains that appear in all three periods. In so doing, we are able to build internet traffic origin-destination matrices as the measure of trade in digital services. Accounting for duplicates in the top 100 lists, and for the fact that some domains only appear in one time period, this procedure gives a total of 31,084 different domains used for the empirical analysis. Figure 3 shows the evolution of the total volume of visits, while Figure 4 shows the distribution by category.

Figure 3: Evolution of total internet traffic in the EU (in M visits) - Source: JRC elaboration with data from similarweb.com

Figure 4: Distribution of total internet traffic in the EU, by category (in M visits) - Source: JRC elaboration with data from similarweb.com
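As a rough sketch, the restriction to domains present in all three periods and the aggregation into an origin-destination matrix could look as follows; the traffic records, domains and field names below are made up for illustration:

```python
import pandas as pd

# Made-up traffic records (origin of visitors, domain, country hosting the
# domain, month, visits); all values are illustrative assumptions.
records = pd.DataFrame([
    ("DE", "shop.example-a", "DE", "2018-04", 100),
    ("DE", "shop.example-a", "DE", "2019-04", 120),
    ("DE", "shop.example-a", "DE", "2020-04", 150),
    ("FR", "shop.example-a", "DE", "2018-04", 10),
    ("FR", "shop.example-a", "DE", "2019-04", 12),
    ("FR", "shop.example-a", "DE", "2020-04", 15),
    ("FR", "news.example-b", "FR", "2018-04", 200),  # present in one period only
], columns=["origin", "domain", "dest", "month", "visits"])

# Keep only domains observed in all three reference months
n_months = 3
present = records.groupby("domain")["month"].nunique()
kept = records[records["domain"].isin(present[present == n_months].index)]

# Origin-destination matrix of visits: the measure of trade in digital services
od = kept.pivot_table(index="origin", columns="dest", values="visits",
                      aggfunc="sum", fill_value=0)
print(od)
```

Diagonal cells (origin equals destination) capture local, “home-bias” traffic, while off-diagonal cells capture cross-border visits.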

The majority of traffic to websites comes from local users, i.e., there is relatively little cross-border volume of internet visits, as shown in Figure 5:

Figure 5: Evolution of local vs. cross-border Internet visits in the EU - Source: JRC elaboration with data from similarweb.com

However, there are important differences by sector, driven by how tradable some services are. Figure 6 indicates that services such as Law and Government and News and Media, for instance, tend to be more local than the average, since public services and news tend to be tailored to local tastes, preferences and needs. On the other hand, Games and Tourism show a higher volume of cross-border trade.

Figure 6: Local vs cross-border Internet visits in the EU, by category - Source: JRC elaboration with data from similarweb.com

The gravity model of trade also includes local visits, i.e. visits by residents of one country to websites located in the same country, as a measure of “domestic” trade or “home bias”.

Including domestic trade in gravity estimations is justified by several arguments. First, since consumers face the option to consume both domestic and foreign products, this guarantees consistency with theory and also with stylised facts about consumer behaviour. Second, it allows the identification of the effects of bilateral trade policies in a theoretically-consistent way (Dai et al., 2014). Third, it measures the relative effects of distance on international trade with respect to the effects of distance on internal trade (Yotov, 2012), the so-called “distance puzzle” in trade. Finally, it controls for the effects of globalization on international trade and corrects the potential biases in the estimation of the impact of trade agreements on trade (Bergstrand et al., 2015).

In the literature, the basic log-linearised regression equation is:

ln X_{ij,d,t} = β·D_{ij} + μ_{ij} + γ_d + δ_t + ε_{ij,t}            (1)

The variable X_{ij,d,t} indicates internet traffic from country i to destination j, directed to website d in time t. When i and j differ, X captures international trade, and when i=j, then X reflects intra-national trade, or the so-called home bias. Since we have different websites in each country, we differentiate between domains through the sub-index d, while t is the month.

Additionally, D_{ij} indicates a vector of different bilateral distances that are commonly used in trade studies to capture trade costs, such as contiguity, physical distance, common language or common currency. The term μ_{ij} denotes the set of country-pair fixed effects, which serves one main purpose: it absorbs most of the linkages between the endogenous trade policy variables and the remainder error term ε_{ij,t}, in order to control for potential endogeneity of the former. In principle, it is possible that the error term in gravity equations may carry some systematic information about trade costs. However, due to the rich fixed-effects structure in equation (1), we are more confident to treat and interpret ε_{ij,t} as a true measurement error. Next, the term γ_d is the set of domain fixed effects, to control for the heterogeneity of sizes and categories of the different websites, as well as for additional factors that may influence consumer behaviour such as brand or type of website. Similarly, δ_t represents month fixed effects and controls for time effects due to seasonality or trends in e-commerce interest. Finally, ε_{ij,t} is the error term.
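A minimal sketch of estimating an equation of this form by ordinary least squares with dummy-coded fixed effects, on synthetic data generated from known coefficients; the data and the coefficient values are assumptions, chosen only to resemble the signs reported in the results table below:

```python
import numpy as np
import pandas as pd

# Synthetic panel: country-pair, domain and month indices plus two distance
# regressors (all values are illustrative assumptions).
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "pair":   rng.integers(0, 10, n),   # country-pair fixed effect index
    "domain": rng.integers(0, 50, n),   # domain fixed effect index
    "month":  rng.integers(0, 3, n),    # month fixed effect index
    "log_dist":  rng.normal(size=n),
    "log_legal": rng.normal(size=n),
})
# Data-generating process with negative distance effects, as in the table
y = -0.12 * df["log_dist"] - 0.015 * df["log_legal"] \
    + rng.normal(scale=0.1, size=n)

# Dummy-code the fixed effects and append the continuous regressors
X = pd.get_dummies(df[["pair", "domain", "month"]].astype(str)).astype(float)
X[["log_dist", "log_legal"]] = df[["log_dist", "log_legal"]]

# Least squares (minimum-norm solution handles the collinear dummy blocks)
beta, *_ = np.linalg.lstsq(X.to_numpy(), y.to_numpy(), rcond=None)
coef = dict(zip(X.columns, beta))
print(coef["log_dist"], coef["log_legal"])  # close to -0.12 and -0.015
```

In practice, high-dimensional fixed effects are usually absorbed rather than dummy-coded, but the estimated slopes are the same.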

Results

The results of the trade model identified a negative correlation 30 between the legal distance indicator and cross-border traffic (the higher the legal distance, the lower the cross-border traffic), outlined in the table below. A reduction of legal distance/harmonisation of rules in this regard could improve cross-border trade in terms of traffic between Member States in a range between 1% and 1.5%.

VARIABLES                  (1)          (2)          (3)          (4)          (5)

Physical distance (log)    -0.121***                              -0.122***    -0.121***
                           (0.00188)                              (0.00188)    (0.00189)

Legal distance (log)                                 -0.0107***   -0.0155***
                                                     (0.00133)    (0.00131)

Contiguity                 0.104***     0.209***     0.210***     0.101***     0.104***
                           (0.00252)    (0.00188)    (0.00187)    (0.00252)    (0.00252)

Common language            0.222***     0.233***     0.233***     0.222***     0.222***
                           (0.00304)    (0.00304)    (0.00305)    (0.00304)    (0.00304)

Common currency            0.0517***    0.0498***    0.0516***    0.0496***    0.0517***
                           (0.00188)    (0.00188)    (0.00187)    (0.00189)    (0.00188)

Home bias                  0.780***     0.979***     1.009***     0.752***     0.780***
                           (0.00381)    (0.00222)    (0.00227)    (0.00418)    (0.00428)

Constant                   1.978***     1.203***     1.169***     2.019***     1.978***
                           (0.0145)     (0.00766)    (0.00764)    (0.0148)     (0.0148)

Observations               1,222,164    1,222,164    1,222,164    1,222,164    1,222,164
R-squared                  0.316        0.313        0.314        0.316        0.316

Robust standard errors in parentheses

*** p<0.01, ** p<0.05, * p<0.1

Dependent variable: visits to domains located in the different MS, from users located in the same country and from users from other EU MS (online trade captured by internet traffic –information flows over the internet).

Legal distance: how different the transposition of the ECD has been in the different MS pairs.

2.Estimates for company-level costs

Estimates are based on averages derived from data reported by companies for the notice and action and transparency obligations in the German law over a period of six months. As there are significant differences in the scale of notices received and resources invested by different companies, estimates were corrected based on simulated data from a model built by the JRC for a full content moderation process a company could put in place.

To estimate the duplication of costs across Member States, the indicators for the legal distance 31 were also used to correct coefficients for the duplication of costs in scenarios of the evolving legal fragmentation.

For the additional costs on very large platforms, estimates are based on:

·Average FTE costs of EUR 110 000

·Benchmarks of risk assessments in the financial sector 32 and estimated costs of technical audits 33

·Reported data from stakeholders for maintenance of databases.
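A hypothetical sketch of how such company-level estimates could combine an FTE baseline with a duplication mark-up that grows with legal fragmentation; the scaling rule, the adaptation rate and the worked figures (apart from the average FTE cost listed above) are illustrative assumptions, not the JRC model:

```python
# Hypothetical compliance-cost model; the duplication rule and the adaptation
# rate are illustrative assumptions, not the JRC methodology.
FTE_COST = 110_000  # average annual FTE cost in EUR (from the estimates above)

def compliance_cost(base_ftes, member_states, avg_legal_distance,
                    adapt_rate=0.05):
    """Base staffing cost plus a duplication mark-up per additional Member
    State, growing with the average legal distance between regimes."""
    duplication = (member_states - 1) * adapt_rate * avg_legal_distance
    return base_ftes * FTE_COST * (1 + duplication)

# A provider with 3 compliance FTEs active in 5 Member States
print(compliance_cost(3, 5, avg_legal_distance=2.0))
```

Under full harmonisation (legal distance of zero), the mark-up vanishes and only the baseline staffing cost remains.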

3.Definition of ‘very large platforms’

An important driver of the identified problems is the unique situation of the largest online platforms, such as social networks or online marketplaces. A relatively small number of online platforms concentrate a very high number of users – consumers and traders alike. Very large platforms represent a higher level of societal and economic risk because they have become de facto public spaces, playing a systemic role for millions of citizens and businesses. In other words, they have a significantly higher impact on society and the Single Market than smaller platforms because they reach a large audience.

When designing the definition of very large platforms, it seems therefore that the most important factor is the number of users, as a clear proxy for the levels of risks they pose. This is the key metric that propels rapid growth and leads to significant societal and economic impacts.

A similar methodology of focusing on the number of users as a proxy could be observed in recent policy initiatives regarding online platforms around the world (e.g. NetzDG (DE) – special obligations on online platforms with more than two million registered users (2.5% of DE population); ACCC Digital Platforms Inquiry (AU) – special recommendations for platforms with more than one million monthly active users (4% of AU population); Online Harms White Paper and Furman Report (UK) – significance of the largest platforms). As a different but comparable benchmark, the recent DSM Copyright Directive provides for a lighter liability regime for start-up content sharing platforms as long as their average number of monthly unique visitors does not exceed 5 million (1% of the EU population).

Reaching 10% of the EU population (currently around 45 million people) directly, and many more indirectly through family members for example, represents a significant share of the EU population and can lead to a significant impact, regardless of the risks and harms considered. This value is set as a reasonable estimate for a significant reach in the EU prone to significant negative effects considering all societal risks in scope of this intervention. It is a proxy value, which is not tailored to the impact of a particular risk, such as the dissemination of a given type of illegal content or the manipulation of democratic processes, but reflects a cumulative approach. Its proportionality is considered also in relation to the horizontal measures and corresponding costs on service providers.

The benchmark for the EU-27 population has remained in a +/-5% fluctuation range since the 1990s. However, the legal technique for designing the precise threshold should take into account possibilities of more significant fluctuations.
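The user-count proxy can be expressed as a simple check; the population figure below is approximate and the helper function is a hypothetical illustration of the 10% threshold, not the legal drafting technique ultimately chosen:

```python
# Illustrative check of the 10%-of-EU-population proxy; the population figure
# is an approximation, not a legally fixed value.
EU_POPULATION = 447_000_000  # approximate EU-27 population

def is_very_large_platform(avg_monthly_users, share=0.10):
    """A platform whose average monthly users reach 10% of the EU population
    (currently around 45 million) would fall under the enhanced obligations."""
    return avg_monthly_users >= share * EU_POPULATION

print(is_very_large_platform(50_000_000))  # True
print(is_very_large_platform(5_000_000))   # False
```

Expressing the threshold as a share of the population, rather than a fixed number, would automatically track the demographic fluctuations mentioned above.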

Exploring available data (see below), all considered platforms with at least 45 million users in the EU are present in multiple Member States. Most of the very large platforms would be either social networks, online marketplaces or video-sharing services.

Using the number of users as the only criterion for the definition of very large platforms has clear regulatory advantages. It creates a simple, future-proof system where it is easy to determine whether a platform has a significant reach in the single market, which will ensure legal certainty. Information on the number of users is already widely available, though precise methodology and reporting is necessary for establishing legally reliable measurements.

When designing the threshold for very large platforms, alternative criteria were also considered:

a)Qualitative criterion of significant societal and economic impact – The collected evidence suggests that the largest online platforms all have significant impact on society and the economy. At the same time, this intervention is horizontal and considers different types of societal risks. The obligations imposed are due diligence and procedural obligations, and the proportionality of the intervention in terms of costs on the service provider is considered in relation to the horizontal obligations and a general and cumulative assessment of societal risks, not individual risks for specific types of illegal content or societal harms. Such a case-by-case approach would lead to considerable legal uncertainty, disproportionate costs and long procedures for establishing the scope of the measures. Also, these assessments would necessarily involve subjective elements and could lead to discrimination between service providers. The threshold regarding the number of users has been determined in a way that implies potentially significant societal and economic impact.

b)SME status, turnover, market capitalisation – The reason to add such criteria would be to ensure that the enhanced obligations for very large platforms do not represent a disproportionate burden for a smaller company behind the platform. However, given the business model of large online platforms, it is highly unlikely that a platform with 45 million users would be a micro or small enterprise. In this unlikely and hypothetical case, the public interest objectives pursued by the initiative would outweigh the economic interest of the platform because the risks and harms are determined by the reach and impact of the platform, not the size of the company. In any event, the enhanced obligations for very large platforms have been designed to be proportionate for services of such scale.

The definition of ‘gatekeeper platforms’ in the Digital Markets Act (DMA) initiative is different in nature and scope from the definition of ‘very large platforms’ falling within the scope of the asymmetric obligations under the Digital Services Act (DSA). The DMA seeks to tackle primarily specific economic concerns associated with gatekeeper power, which enables a small number of gatekeeper platforms to undermine fair commercial conditions and the contestability of the digital markets concerned. On the other hand, the DSA seeks to address primarily societal risks, including some economic risks that are, however, very different from the ones related to gatekeeper power, associated with the fact that some very large platforms represent de facto public spaces, playing a systemic role for millions of citizens and traders.

Irrespective of the different objectives pursued by the two sets of rules, there may be an overlap between these two categories. Very large platforms in the DSA are determined based on the number of their users. At the same time, under the DMA a provider of core platform services (i.e. online intermediation services; online search engines; operating systems; cloud computing services; and related advertising services to these core platform services) also needs to have a minimum number of active users to be considered a gatekeeper platform. However, contrary to the determination of a very large platform under the DSA, the number of active users is just one of the criteria determining a gatekeeper platform. As the criteria will be different, not all very large platforms will be gatekeeper platforms.

Preliminary data used to estimate the scale of reach in the Union is based on information extracted from SimilarWeb (measured as average monthly users in 2019). The graph below presents average monthly users for a selection of online platforms based on the top-ranking services.

The graph below shows app users and browser users cumulatively; it is important to note that the two user bases overlap to a certain extent, and that this overlap differs from one platform to another. To contextualise: for Facebook, the actual user base in the EU is reported to be just under 400 million in the same period (2019), while for Snapchat most of the user base would be accurately represented by app users. The graph is not an exhaustive list of platforms; app stores, for example, are not represented here.

This data suggests two important conclusions:

(1)the differences in scale between the user base of platforms are staggering.

(2) there are methodological limitations in establishing with accuracy the scale of users of a digital service or an online platform. Third-party traffic data, such as the SimilarWeb source, cannot accurately address the duplication of measurements, nor reliably distinguish mobile app use from browser use on the basis of publicly available data. To date, the most precise indications are those reported by the services themselves.

4.Macroeconomic impact analysis

At a basic level, economic impact analysis examines the economic effects that relevant business and/or economic events (infrastructure project or governmental policy, for example), have on the economy of a geographic area. At a more detailed level, economic impact models work by modelling two economies: one hypothesised economy where the economic event being examined occurred and a separate (real) economy where the economic event did not occur. By comparing the two economies, it is possible to generate estimates of the economic impact the event under analysis had on the area’s economic output, earnings, and employment. In many cases, sophisticated Computable General Equilibrium (CGE) models are used. In others, a simpler but equally robust analysis comes from an estimation method known as an input-output model. This is the method used in this case.

Input-output models are designed to examine all of the industries in an economy and estimate all of the ways that spending in one sector influences each of the other sectors in that economy. For example, what happens when an e-commerce website faces an increase in demand due to a government policy that addresses consumer protection? To meet the sales increase, the e-commerce website will procure more items from wholesalers or manufacturers. In turn, in order to increase production to meet the e-commerce demand, the manufacturer will need to hire more workers, as will the logistics firms that distribute the items to the final consumers, which indirectly increases total employment. However, the manufacturer will also need to purchase more raw materials and intermediate goods and services that are needed in the manufacturing process. As the manufacturer purchases more intermediate goods and services, the producers of those goods and services respond to the increase in demand by hiring more workers and purchasing more of their own inputs. Overall, the increase in e-commerce sales results in a direct increase in total employment caused by the website hiring more personnel to handle the increase in demand, as well as indirect increases in total employment caused by the other producers of goods and services involved in the value chain. Input-output models generate their estimates by examining three types of economic effects. The first effect is the direct impact of the spending or economic event. When a new business enters a city, it may employ 100 workers and sell €1 million in goods and services each year, which is the direct effect the business has on the local community. The business also has another effect on the community, called the indirect effect. In input-output modelling, the indirect effect is the impact the new business has on other local industries when it purchases goods and services for the operations of the business.
In addition to the indirect effect, the new business or project also creates an induced effect within the regional economy. The induced effect is the result of the new employees and business proprietors spending the new income they are now receiving from the new business within the community. In the end, input-output models estimate the total economic impact new spending has on a local economy by combining the direct, indirect and induced economic effects. In this case, the figures underlying the estimation rely on the assumption that a revised policy for illegal content online will bring more certainty and confidence to users, which in turn will be translated in greater expenditure in e-commerce and more usage of other digital services. These assumptions are then translated to increases in expenditure and investment, a direct impact of the policy, while the total impact comes from the computation of the indirect and induced effects.
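The direct-plus-indirect mechanics described above can be sketched with a Leontief inverse; the three-sector technical-coefficients matrix and the demand shock below are illustrative assumptions, not the figures used in the actual estimation:

```python
import numpy as np

# Illustrative 3-sector technical-coefficients matrix A: entry (i, j) is the
# input from sector i needed per unit of sector j's output (assumed values).
A = np.array([
    [0.10, 0.20, 0.05],   # e-commerce services
    [0.30, 0.10, 0.10],   # manufacturing
    [0.05, 0.15, 0.10],   # logistics
])

# Leontief inverse: total output required per unit of final demand
leontief_inverse = np.linalg.inv(np.eye(3) - A)

# A EUR 100m increase in final demand for e-commerce services
demand_shock = np.array([100.0, 0.0, 0.0])
total_impact = leontief_inverse @ demand_shock
print(total_impact.round(1))       # output increase per sector
print(total_impact.sum().round(1))  # economy-wide output increase
```

The total exceeds the initial EUR 100m shock: the difference is the indirect effect rippling through suppliers. Capturing induced effects as well would require a “closed” model that also endogenises household spending.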

Input-output models, and economic impact analysis in general, are useful tools to estimate the effects that new policy proposals, or changes in spending, will have within an economy. However, input-output models are based on a set of assumptions that need to hold for the results to be valid. One key assumption is that the new spending patterns are the same as the spending patterns made in the past. Another weakness of many input-output models is the assumption that inputs are infinitely available without prices having to increase. Finally, many economic impact analyses that use input-output models assume that the increased spending being modelled comes from outside the area the impact analysis examines, resulting in an increase in total spending. However, if the money is simply a transfer from one type of expenditure to another, total spending and employment in the area may not change at all.

Summary of the computation of the model:

Option    Component    ∆ GDP (B€)    % GDP (2019)    % benefit
1         Consumers     8.9                          23.1
          Providers    29.7                          76.9
          Total        38.6          0.3
2         Consumers    19.1                          30.9
          Providers    42.7                          69.1
          Total        61.8          0.4
3         Consumers    27.7                          33.9
          Providers    54.0                          66.1
          Total        81.7          0.6

Annex 5: Evaluation report for the E-Commerce Directive

1.Introduction

1.1.Purpose of the evaluation

This evaluation concerns Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market 34 (hereinafter “the e-Commerce Directive” or “Directive”).

The e-Commerce Directive, unchanged since its adoption in 2000, provides a horizontal legal framework for digital services 35 in the Internal Market by harmonising the basic principles and thereby allowing the cross-border provision of digital services. The Directive has been a cornerstone for regulating digital services in the EU.

At the same time, the advent of the internet and digital services has revolutionised the everyday lives of Europeans in a way often compared to the industrial revolutions of the previous centuries. Digital technologies are profoundly changing European citizens’ daily life, their way of working and doing business, and the way they travel, consume cultural or entertainment content and communicate with each other.

Yet it does not stop there. Digital technologies, business models and societal challenges are evolving constantly and at an ever-increasing pace. Digital services are the backbone of an increasingly digitised world and incorporate an ever wider range of services, such as cloud infrastructure or content distribution networks. Online platforms like marketplaces, social networks, or media-sharing platforms intermediate a wide spectrum of activities and play a particularly important role in how citizens communicate, share and consume information, how businesses trade online, and which products and digital services are offered to consumers.

Since the entry into force of the Directive, the Commission has gathered evidence indicating that the Directive has removed a series of obstacles to the cross-border provision of digital services. 36 But recent evidence also shows that the Directive may not have fully achieved its objectives and that the issues relevant today, especially given regulatory, market and technological developments, may not all be addressed by the Directive.

The political guidelines of the President of the Commission announced her intention to put forward a Digital Services Act, to ‘upgrade our liability and safety rules for digital platforms services and products, and complete our Digital Single Market’. 37  

In its Strategy on Shaping Europe’s Digital Future 38 , the Commission announced that it intends to propose new and revised rules to deepen the Internal Market for digital services, by increasing and harmonising the responsibilities and obligations of digital services and, in particular, online platforms and reinforce the oversight and supervision of digital services in the EU.

In light of this, it is prudent and necessary to evaluate the provisions regulating digital services in the Internal Market to assess whether they are still fit for purpose given the regulatory, market and technological developments of the last two decades.

In June 2020, the Commission therefore published a combined Evaluation Roadmap/Inception Impact Assessment outlining its plan for the evaluation. 39 The purpose of the evaluation is to gather evidence on the functioning of the e-Commerce Directive, which will serve as a basis for the Commission to further define the problem analysis and the policy options and to compare their impacts in the Impact Assessment.

This evaluation systematically reviews and analyses all available evidence, from a variety of sources, which include information shared by the concerned stakeholders. It builds on detailed evidence gathered over the past years, in particular concerning the legal assessment of the current implementation of the e-Commerce Directive and evidence of emerging legal fragmentation. In addition, it takes into account more granular data that is being collected regularly on specific types of illegal content and goods in the context of the structured dialogues and voluntary cooperation coordinated by the Commission in several policy areas. These areas include unsafe products, illegal hate speech, child sexual abuse material (and related cooperation between law enforcement, hotlines and industry), counterfeit products and the dissemination of terrorist content, amongst others.

Evaluation results will directly inform future policy decisions. They provide a starting point for a possible revision of the e-Commerce Directive.

This evaluation does not deal with the impact of the COVID-19 outbreak, given that these developments are very recent and the evidence gathered in the evaluation could not take them into account. Moreover, the duration and impact of the COVID-19 crisis cannot be predicted at the current stage, and it is therefore not possible to evaluate the effects of the COVID-19 crisis on the rules subject to the evaluation.

1.2.Past evaluations of the e-Commerce Directive 

Since the adoption of the e-Commerce Directive, the Commission adopted several policy documents concerning the evaluation of the e-Commerce Directive and more generally EU rules seeking to facilitate well-functioning internal market for digital services.

In its 2003 Evaluation Report 40 , the Commission concluded that the Directive had had a substantial and positive effect on e-commerce within Europe. Together with the Directive on transparency for information society services 41 , which establishes a mechanism allowing the Commission to assess draft national legislation as to its compatibility with EU law, it created straightforward internal market rules, allowing e-commerce to grow across national borders.

In its 2012 Communication on “a coherent framework to build trust in the digital single market for e-commerce and online services” 42 , the Commission found that the principles and the rules of the e-Commerce Directive continue to be sound, but that some improvements were needed, in particular regarding the functioning of the notice-and-action systems. To this end, the Commission also organised a public consultation concerning procedures for notifying and acting on illegal content hosted by online intermediaries. 43

Finally, in its 2016 Communication on “online platforms and the digital single market opportunities and challenges for Europe”, the Commission found again that the principles and the rules of the Directive were sound. However, the Commission also observed the increasing importance of online platforms and identified several new risks that may lead to further fragmentation of the digital single market. To this end, the Commission adopted in 2017 the Communication on tackling illegal content online 44 , which was followed by the 2018 Recommendation on tackling illegal content online 45 .

1.3.Scope of the evaluation

The substantive scope of the evaluation includes the e-Commerce Directive in its entirety. Within this context, the evaluation specifically focuses on the following areas:

a.Functioning of the Internal Market for digital services, including functioning of the cooperation mechanism between the competent authorities of the Member States.

b.Liability of online intermediaries that manage content provided by third parties that use their services (e.g. internet service providers; cloud services; web hosts; online marketplaces).

c.Other measures setting the basic regulatory requirements for digital services in the Internal Market, in particular the ones concerning commercial communications and online advertising as a subset of it.

The temporal scope of the evaluation covers the period since the adoption of the Directive in 2000.

The geographic scope of the evaluation extends to all EU Member States. 46  

As required by the Commission’s Better Regulation Guidelines, the evaluation examines whether the objectives of the e-Commerce Directive were met during the period of its application (effectiveness) and continue to be appropriate (relevance), and whether the e-Commerce Directive, taking account of the costs and benefits associated with applying it, was efficient in achieving its objectives (efficiency). It also considers whether the e-Commerce Directive, as legislation at EU level, provided added value (EU added value) and is consistent with other pieces of EU legislation relevant for the provision of digital services in the Internal Market (coherence).

2.Background to the intervention

2.1.Grounds for the intervention 

The e-Commerce Directive is the legal framework for information society services in the internal market.

In the 1990s, in the wake of the establishment of a well-functioning internal market, the Commission also considered it important to facilitate the growth of electronic commerce, which provided a unique opportunity to create economic growth, a competitive European industry and new jobs.

Within this context, the Commission identified several obstacles to the potential economic growth that required attention: 47

I.Lack of legal certainty

The preparatory work pointed to significant differences in certain legal provisions applicable to information society services in different Member States. These differences meant that an information society service provider wishing to offer a service throughout the internal market had to comply not just with the legislation of the Member State in which it was established but also with that of all other Member States to which it directed its activity.

In addition, several Member States were in the process of enacting new legislation, analysis of which showed differences in approaches and a risk of fragmentation of the internal market. In particular, national legal interventions on the liability of intermediary services, which were considered instrumental for the exchange of views online, risked hampering the growing use of online services and were seen as detrimental to the free expression of views online.

II.Significant economic costs

The analysis at the time showed that the existing legal framework gave rise to significant costs for operators wishing to develop their activities across borders. The survey undertaken pointed to significant legal costs due to the differences in national legal regimes and the need to comply with often highly divergent national legal requirements.

III.The chilling effect on investment and competitiveness of the European companies

In view of the complexity of the legal framework and the associated economic costs, it was considered that operators, particularly SMEs and microenterprises unable to afford high-quality legal advice, were discouraged from exploiting the opportunities afforded by the internal market and from investing in the European development of their businesses.

This was also considered a disincentive for investment in innovation and a factor that could lead operators to design their services to meet the most stringent national legal requirements. This in turn meant that some SMEs and microenterprises were less competitive than businesses with the funds to invest in assessing the risks of securing access to the new market in electronic commerce while remaining within the law.

IV.The lack of confidence on the part of consumers

Finally, it was also considered that consumers and, more generally, recipients of services might feel that they were in an unclear and vague situation with few guarantees as to the level of protection afforded under different national rules. They might therefore be unwilling to conclude online contracts, exploit new opportunities or express their views online.

Beyond the general objective of establishing an internal market for electronic commerce, there were two further drivers of regulatory change. First, several reports at the time showed that Europe was lagging behind, in particular behind the USA, in the development of e-commerce and digital services. Second, internet penetration and use were growing.

2.2.Description of the intervention 

The approach of the e-Commerce Directive was to interfere as little as possible with national legal rules and to do so only where strictly necessary for the proper functioning of the internal market. It was considered at the time that the Directive did not need to cover entire areas of law and could therefore target specific aspects.

In addition, the Commission considered that, until an international regulatory framework was established, the Directive should only cover service providers established in a Member State. The Directive therefore did not cover information society services provided by a service provider established in a third country.

In practice, this meant that service providers who are not established in the Community could not exploit the opportunities afforded by the internal market. To do so, they would have to establish themselves in one of the Member States.

2.2.1.Information society services

The Directive applies to information society services, which encompass any service normally provided for remuneration, at a distance 48 , by electronic means 49 and at the individual request 50 of a recipient of services. 51  

Such services may include today:

·A general category of information society services: e-commerce websites selling any type of goods, online encyclopaedias, online newspapers, games, payment services, online travel agents, blogs etc.;

·In particular, a subcategory of information society services considered ‘online intermediaries’, ranging from the very backbone of the internet infrastructure, with internet service providers, cloud infrastructure services, content distribution networks, to messaging services, online forums, online platforms (such as app stores, e-commerce marketplaces, video-sharing and media-sharing platforms, social networks, collaborative economy platforms etc.) or ads intermediaries.

Conversely, the e-Commerce Directive itself clarifies that it does not apply to certain areas and activities, the main ones being taxation, data protection, competition law, and gambling 52  activities.

The below figure provides a simplified overview of the taxonomy of information society services.

Figure 7: Information society services and scope of the e-Commerce Directive

Finally, since many digital services are provided to consumers free of charge, it is important to clarify that this in itself does not mean that they would not qualify as information society services (e.g. the service provider could be remunerated by advertising revenue) 53 .

2.2.2.Geographical scope of application of the Directive

The e-Commerce Directive applies to any information society service provider established in the European Union, but does not apply to information society services supplied by service providers established in a third country. 54  

2.2.3.Core elements of the e-Commerce Directive (i.e. core regulatory pillars)

Freedom to provide services (Article 3) and freedom of establishment (Article 4)

One of the core provisions of the e-Commerce Directive is the internal market clause 55 . It establishes that:

I.providers of “information society services” are subject to the law of the Member State of their establishment (“internal market principle”),

II.the Member State of establishment needs to ensure that the services comply with the national provisions applicable in the Member State in question which fall within the “coordinated field” 56 , and

III.other Member States may only restrict information society services in very specific circumstances and pursuant to the procedure laid down in Article 3(4) of the e-Commerce Directive itself.

This means that as regards the rules covered by the “coordinated field”, the provider of information society services can “freely” offer its service across the single market by complying with the rules of the country in which it is established (hereafter “country of establishment”). In parallel, none of its host Member States (i.e. Member States where it provides its service; hereafter “country of destination”) can require the same service provider to comply with additional rules in this Member State. Thus, as a matter of principle the information society service provider cannot face any restriction from another Member State.

Exceptionally, on a case-by-case basis, a Member State of destination can adopt measures to derogate from the internal market principle under strict material (e.g. principle of proportionality; limited list of derogation conditions) and procedural conditions (i.e. notification obligation to the Commission and other Member States). 57

Furthermore, the single market clause does not apply to eight fields mentioned in the Annex of the Directive 58 .

The Directive also ensures the freedom of establishment, by prohibiting so-called prior authorisation regimes specifically and exclusively targeted at information society services in the Member State of establishment.

To address the need for smooth enforcement of the ‘coordinated field’ across jurisdictions, the Directive provides for a basic information and cooperation mechanism across national authorities, including a requirement for the appointment of one or more points of contact in relation to the implementation of the e-Commerce Directive. Additional provisions on court actions, sanctions, and injunctions complement these core clauses.

Liability of intermediary service providers (Section 4 of the e-Commerce Directive)

The e-Commerce Directive harmonises, in Articles 12 and 13, the liability exemptions and, in Article 14, the liability limitations for so-called intermediary services. These range from ‘mere conduits’ like internet service providers ensuring the very backbone of the network, to ‘caching services’, and to ‘hosting services’, which are now understood to cover services such as web hosting, some types of cloud services, and online platforms such as online marketplaces, app stores, video-sharing platforms or social networks.

The conditional liability exemptions and limitations cover all types of illegal activities and content, as defined in EU or national law, and provide intermediaries with a safe harbour for all legal categories of liability, provided they meet certain conditions. The recently adopted Copyright Directive introduces a sector-specific regime in this context.

For hosting services (covered by Article 14 of the Directive), the conditionality is two-fold: the provider can benefit from the exemption if it does not have actual knowledge of the illegal activity or content (or, in the case of claims for damages, awareness of facts or circumstances from which the illegal activity or information is apparent), and if, upon obtaining such knowledge or awareness, it ‘acts expeditiously’ to remove or disable access to the illegal information.

The Directive also clarifies that courts and administrative authorities can require, in accordance with the Member States’ legal systems, a service provider to terminate or prevent an infringement if the law of the Member State concerned provides for such a possibility.

Article 15 of the e-Commerce Directive prohibits Member States from imposing on online intermediaries general monitoring obligations or a general obligation to actively seek facts or circumstances indicating illegal activity.

Measures protecting users of information society services

The e-Commerce Directive lays down several measures that seek to protect users (e.g. consumers, business users, public authorities) by harmonising certain obligations, primarily concerning transparency requirements imposed on providers of information society services. Such examples of transparency obligations are:

·Obligation on the information society service provider to make available its identity, name, geographic address, details enabling rapid contact, relevant registration information (in trade or similar registers) and, where relevant, its VAT number.

·Obligation to clearly identify commercial communications designed to promote directly or indirectly the goods, services or image of a company, organisation or person pursuing a commercial, industrial or craft activity or exercising a regulated profession as well as the natural or legal person on behalf of whom the commercial communication is made.

In addition, the e-Commerce Directive requires that Member States ensure that contracts can be concluded electronically, which means that they must remove legal obstacles which would:

·prevent the use of electronic contracts; and

·deny online contracts legal validity on the ground that they are formed by electronic means.

In this context, the Directive enshrines certain basic principles and transparency requirements as regards the conclusion of contracts by electronic means.

Finally, the Directive encourages the Commission and Member States to facilitate the drawing up of codes of conduct at the Union level, by trade, professional and consumer associations or organisations, designed to contribute to the proper implementation of the e-Commerce Directive.

Mechanisms for effective cooperation between Member States and enforcement of the e-Commerce Directive

The e-Commerce Directive also lays down basic principles seeking to ensure effective cooperation between Member States and effective enforcement of the Directive, which is to be carried out by the Member States.

To this end, the Directive envisages that any sanction for a violation of the e-Commerce Directive should be effective, proportionate and dissuasive. In addition, the available national court actions should be effective, allowing for the rapid adoption of corrective measures, including interim measures.

The Directive also envisages and encourages cooperation and mutual assistance between Member States and with the Commission for the implementation of the Directive, in particular through the establishment of national contact points. Such cooperation is particularly relevant in view of the envisaged close cooperation between the country of origin and the country of destination as regards implementation of the internal market principle laid down in Article 3 of the Directive.

Finally, the Directive encourages the use of alternative enforcement instruments such as codes of conduct at the EU level or out-of-court dispute settlement schemes.

2.3.Objectives of the e-Commerce Directive

The general objectives of the Directive can be summarized as follows:

·Ensuring the freedom of providing digital services in the internal market, leading to growth and competitiveness in the EU; and

·Offering consumers a wide range of choices and opportunities, including by ensuring that the Internet remains safe, trustworthy, fair and open.

At the same time, the specific objectives of the Directive can be summarised as follows:

I.Ensuring a well-functioning internal market for digital services

·Its main objective is the proper functioning of the internal market for information society services. This is an emanation of the principle of free movement of services enshrined in the Treaty. It aims at ensuring the freedom to provide information society services and the freedom of establishment for providers of information society services within the single market. This aims to create a pro-competitive environment for business, also across borders, to enhance choice and the availability of affordable products, services and content online, and to facilitate other opportunities for EU citizens.

This is achieved through the internal market clause 59 , which provides that information society service providers are subject to the law of their home Member State (i.e. the Member State in which they are established), and that other Member States (i.e. host Member States) can restrict their services only in exceptional circumstances 60 . It also establishes a notification and cooperation procedure with the Commission and Member States for those (urgent) cases where host Member States deem it necessary to derogate from the provisions of the Directive.

·The prohibition of prior authorisation requirements 61 and the harmonisation of certain consumer-facing rules 62 throughout the Directive contribute to the objective of a well-functioning internal market for digital services.

II.Ensuring effective removal of illegal content online in full respect of fundamental rights

·For information society services acting as online intermediaries, the liability provisions of the ECD aim to establish a careful balance between the following objectives:

1.to promote innovation on the internet, by shielding intermediaries that transmit or organise third-party content from disproportionate liability for each piece of content transmitted or hosted, and from general monitoring obligations related to the content they transmit or store;

2.to ensure the effective removal of illegal content by making the liability exemption conditional on knowledge, but leaving operators free to design their systems to address this objective;

3.to safeguard fundamental rights online, such as freedom of expression and right to privacy, by avoiding over-removal of (legal) content or surveillance, by limiting the scope of the liability provisions to illegal content and by banning general monitoring obligations.

III.Ensuring an adequate level of information and transparency for consumers

·A number of provisions of the ECD aim to enhance trust in digital services. In particular, they aim to protect consumers and users not only against illegal content and activities, but also against a lack of information and transparency when it comes to the nature and identity of the information society service provider, commercial communications (which are part of, or constitute, an information society service), unsolicited commercial communications, or certain pre-contractual or contractual obligations. The objective of promoting trust in online services is also achieved by acknowledging electronic contracts.

The following chart visualises the intervention logic of the e-Commerce Directive, i.e. the way in which its main legal provisions are meant to contribute to the achievement of the identified policy objectives.

Figure 8: Intervention logic of the e-Commerce Directive

2.4.Baseline

The baseline describes those developments (throughout the evaluation period) that could have been expected in the absence of the Directive. Any actual effective changes, attributable to the Directive, are measured against this hypothetical baseline scenario. This section describes the previous baseline assumptions of the original intervention and discusses whether any policy or market developments that have occurred since then have influenced these assumptions.

General outline

As shown above, before the introduction of the Directive, some Member States had adopted regulatory measures applicable to different aspects of the provision of information society services. However, many of these measures diverged, undermining the well-functioning of the internal market, raising operational costs for service providers, serving as a disincentive for further investment and negatively affecting European competitiveness.

It was expected that, absent the regulatory intervention, the trend towards regulatory fragmentation would continue 63 , which would lead to further increased operational costs for service providers. It was considered that inefficiencies in the digital market would continue, possibly hampering the development of the internal market for information society services, limiting its innovation potential and having a deterrent effect on the competitiveness of information society service providers.

Internal market principle

In the absence of the e-Commerce Directive, the basic principles of the Treaty, in particular the principle of free movement of services enshrined in Article 56 TFEU, would have applied. This means that restrictions could apply in each Member State in which an information society service was received, to the extent that they were justified by an overriding reason of general interest (e.g. public policy, public health, but also tax coherence or consumer protection), proportionate and non-discriminatory. 64 Moreover, no coordination mechanism regarding possible requirements for non-established information society service providers would have applied in those cases where the country of destination decided to restrict the cross-border provision of information society services.

As from 2009, the horizontal rules laid down in the Services Directive 65 would have applied to a wide (although not complete) range of information society services, including Article 16 as regards derogations from the freedom to provide services and the more general rules on administrative cooperation. This would have allowed each Member State of reception to subject service providers established abroad to an open range of possible general requirements (except those explicitly banned) supported by any overriding reason of general interest. Moreover, as regards cooperation between Member States, each Member State where services are received would have been entitled to restrict the services of a specific provider to ensure compliance with applicable national requirements, without any obligation to consult and/or inform the Member State of establishment or the Commission except in specific circumstances. Taking into account the potential accessibility of information society services from any Member State (and often the lack of specific registration to access services and/or limitation 66 of access from other Member States), this could potentially trigger liability for compliance with 27 different legal regimes and enforcement actions.

Liability of intermediary service providers

Before the adoption of the e-Commerce Directive, there were no harmonised EU principles as regards (exemption from) liability for intermediary service providers that provide certain digital services for third parties (e.g. access to the internet infrastructure; storage of information).

In the absence of the e-Commerce Directive - and before the adoption of some sector specific rules concerning liability of intermediary service providers at the EU level (see section 3.3 below and Annex 6 of the Impact Assessment for detailed information about these developments) – the question of possible (exemption from) liability would be governed by each Member States’ own legislation.

Since, at the time of the adoption of the e-Commerce Directive, several Member States had already adopted, or at least considered adopting, legislation in this area, it is very likely that the fragmentation of the rules in this area, and therefore legal uncertainty, would have further increased. In addition, while it could be expected that some of the issues at national level would have been submitted via preliminary ruling references to the CJEU, hence unifying interpretation, this in itself would have been unlikely to significantly counteract the growing trend of regulatory fragmentation.

For the internal market, these divergences could have been the source of further obstacles to the cross-border provision of information society services (e.g. if a country of destination decided to disable access to information stored on the server of a service provider established in another Member State where the applicable liability regime was deemed unsatisfactory). In some Member States, such fragmentation might have hindered activities such as the provision of hosting.

In the 2017 Communication and the 2018 Recommendation, the Commission clarified the e-Commerce Directive by laying down a soft-law framework for tackling illegal content online. The objective of the two policy instruments was to improve the effectiveness and transparency of the notice-and-action process between users and hosting service providers, to incentivise voluntary measures by hosting service providers, and to increase cooperation between providers of hosting services and specific stakeholders, such as trusted flaggers and public authorities.

Finally, this baseline regime of the e-Commerce Directive has over the years been complemented, for particular types of illegal material, by sectoral rules and co- and self-regulatory measures. Such rules and measures have been adopted in areas such as child sexual abuse material online, terrorist-related content, audiovisual media services and copyright (see section 3.3 below and Annex 6 of the Impact Assessment for further details about these developments).

Protection of users of information society services

Before the adoption of the e-Commerce Directive, several user protection measures were already in place at the EU level. 67 That said, the preparatory work for the e-Commerce Directive pointed to several open questions both as regards the possible rights of users and as regards the obligations of information society service providers when providing such services.

These issues in particular concerned the use of commercial communications that may in themselves constitute information society services or form part of them, including in relation to the provision of regulated services, and the ability to conclude contracts by electronic means.

In the absence of the e-Commerce Directive, it could be expected that Member States would have continued with their legislative initiatives in relation to both sets of issues identified above, which would likely have led to further regulatory fragmentation, at least until further harmonisation initiatives were adopted at the EU level.

Within this context, it should also be noted that since the adoption of the e-Commerce Directive in 2000, the EU has been strengthening and further harmonising consumer protection, in particular with the adoption of the Unfair Commercial Practices Directive 68 in 2005 and the Consumer Rights Directive 69 in 2011. In 2019, both of these Directives were revised by the Omnibus Directive 70 to improve their enforcement and better adapt the protection of consumers to the digital age.

These Directives complement the e-Commerce Directive and ensure complementary protection of the users of the information society services when they act as consumers, i.e. for purposes outside their trade, business, craft or profession.

On the other side of the spectrum, the protection of business users has been strengthened through the adoption of the Platform-to-Business Regulation 71 in 2019. This Regulation imposes a series of transparency obligations in favour of business users dealing with providers of information society services offering intermediation or search services. It furthermore requires the establishment of specific enforcement mechanisms such as internal complaint-handling systems, mediation and collective actions.

Finally, the data protection rules, which are also specifically referred to in the e-Commerce Directive, were revised and strengthened in 2016. The new General Data Protection Regulation 72 reconfirms the main rights of data subjects under the previous regulatory framework and creates new ones, in particular the right to be forgotten, the right to data portability and rights in relation to automated decision-making.

Mechanisms for effective cooperation between Member States and enforcement of the e-Commerce Directive

At the time of the adoption of the e-Commerce Directive, which introduced a specific cooperation mechanism in Article 3(4), there were no mechanisms in place to facilitate coordination between Member States when enforcing EU or national rules that may have an impact on the cross-border provision of information society services.

This uncertainty as to “who supervises what” was considered an important factor hindering the development of the internal market and the free movement of information society services. In particular, it was considered necessary to improve the level of mutual confidence between national authorities.

Since the adoption of the e-Commerce Directive and its cooperation mechanism in Article 3(4), several additional sector- and/or issue-specific cooperation mechanisms have been set up. Their main purpose was to facilitate cooperation and mutual assistance between the competent authorities of the Member States in the specific areas concerned (e.g. dangerous goods; consumer protection). The most relevant in the present context are:

·The expert group on electronic commerce, which was set up in 2005 and is composed of the different national contact points and chaired by the Commission;

·The Consumer Protection Cooperation (CPC) Network, which was established in 2006 and is composed of the national consumer protection authorities;

·The rapid alert system for dangerous non-food products (i.e. Safety Gate), which facilitates the rapid exchange of information between national authorities and the European Commission on dangerous products found on the market.

At the same time, the Internal Market Information (IMI) System, which is a multilingual secure online application to facilitate communications and support cooperation between the competent authorities of the Member States, has been set up as an underlying technical facility to support different cooperation mechanisms.

3.Implementation / state of Play

3.1.Market context and developments

Digital services have developed tremendously over the past 20 years since the adoption of the e-Commerce Directive in 2000, becoming an important backbone of the digital economy and supporting fundamental societal digital transformations. The figure below shows, in a simplified manner, that some of today’s most prominent digital services or business models already existed in 2000; however, the scale and impact of both long-standing and newly arrived services have since expanded to all areas of society.

Figure 9: Development of digital services (example)

The landscape of digital services is by no means static: it continues to develop and change rapidly along with the technological transformations and innovations increasingly available. For example, services providing the technical infrastructure for the internet are diverse and important for the development of various sectors, such as e-commerce, connectivity, cloud services or advertising. The Court of Justice has not hesitated to apply the e-Commerce Directive provisions to some services (and business models) that did not exist when it was adopted.

The table below shows how widely different digital services, in particular different forms of online platforms, are used by European citizens.

Table 1: Use of digital services by EU citizens

However, an important trend that differs from the beginning of the century is the increasing “platformisation” of the online space. While the rise of “2.0” services, allowing users to publish, comment, buy and sell directly, led to a dis-intermediation of traditional economic channels, the last decade has witnessed an important re-intermediation of the online economy. These intermediation services, widely known as online platforms, are widely used in Europe: 76% of Europeans said in 2018 that they were regular users of video-sharing or music streaming platforms, 72% shopped online and 70% used social networks. In addition, more than 1 million EU businesses already sell goods and services via online platforms, and more than 50% of SMEs selling through online marketplaces sell cross-border.

3.1.1.Increased exposure to illegal activities online

With such an exponential increase in the use of digital services and in the opportunities for information sharing and electronic commerce came an increasing misuse of intermediary services for various types of illegal activities, such as:

·dissemination of illegal content, such as illegal hate speech, child sexual abuse material, terrorist content or IPR-infringing content;

·illegal sale of goods, such as the sale of dangerous goods, unsafe toys, illegal medicines, counterfeits, scams and other practices infringing consumer protection law, or even wildlife trafficking and the illegal sale of pets or protected species; or

·illegal provision of services, such as non-compliant accommodation services.

For example, for dangerous products, the Rapid Alert System for dangerous non-food products (Safety Gate/RAPEX) registers between 1850 and 2250 notifications by Member States per year 73 . In 2019, around 10% were confirmed to be also related to online sales 74 , while the actual availability of such products online is likely higher. In this regard, the COVID-19 crisis has also cast a spotlight on the proliferation of illegal goods online (e.g. products falsely presented as able to cure or prevent COVID-19 infections, products bearing false conformity certificates, etc.), especially coming from third countries.

To take another example, in the area of child sexual abuse material, the past few years have seen an increase in reports of child sexual abuse online concerning the EU (e.g. images exchanged in the EU, victims in the EU, etc.): from 23 000 in 2010 to more than 725 000 in 2019, the latter including more than 3 million images and videos. 75

To assess the size of the problem, the Commission ran a Flash Eurobarometer survey among a random sample of over 30 000 Internet users in all Member States, testing user perception of the frequency and scale of illegal activities or information online. The figure below shows the most frequently seen types of illegal content per Member State.

Figure 10: Most frequently seen types of illegal content per Member State

In this context, it is important to note that not all types of illegal activities are appropriately addressed. For certain types of illegal activities, legislation adopted after the e-Commerce Directive laid down a series of adapted obligations on online intermediaries, defining specific responsibilities in areas such as:

·child sexual abuse material; 76

·terrorist offences online; 77

·copyrighted content; 78  

·explosive precursors; 79

·other types of illegal products subject to market surveillance 80 ; or

·for the specific case of audiovisual content on video-sharing platforms, the Audiovisual Media Services Directive 81 , currently being transposed by Member States.

The respondents to the open public consultation referred to different types of illegal and harmful activities and information to which they perceive they are increasingly exposed.

The main issues reported by the respondents in relation to goods are: deceptive advertising, especially for food, food supplements and drugs (including COVID-19-related claims); advertising related to pet and wildlife trafficking; and counterfeit, defective or even stolen goods, notably electronics and clothing.

Regarding services, the main issues raised by the respondents are: fake event tickets or cases in which platforms illegally resell tickets at inflated prices; scams involving cryptocurrencies and online trading; and general cases of phishing.

Finally, in relation to content, the respondents report significant issues related to hate speech (e.g. racism, anti-Semitism, white supremacy, calls against migrants and refugees, extremism, far-right propaganda, homophobia, sexism, defamation), general incitement to violence, unwanted pornography and prostitution ads, child sexual abuse material, IP infringement for movies and other copyrighted content, and political disinformation and fake news.

The vast majority of users who replied to the open public consultation are not satisfied with the actions that platforms take to minimise the risk of consumers being exposed to scams and other unfair practices. Users mostly consider that platforms are doing too little to prevent these issues from arising.

For some categories of illegal activities, such as hate speech, dangerous products or counterfeits, the Commission has facilitated self- and co-regulatory efforts, in cooperation with national authorities and/or trusted third parties, to address the concerns identified.

Yet many categories of illegal content, goods or services fall outside the scope of such interventions, and there is no established process for tackling them.

The only instruments addressing all types of illegal activities horizontally are the Commission’s Communication of 2017 and, as a non-binding legal act, the Recommendation of 2018, which sets out guidelines for all hosting services, covering any type of illegal activity, in efforts to curb illegal activities online. However, these instruments and the measures identified therein are only selectively applied by some hosting service providers and by Member States.

3.1.2.Lack of information or awareness for addressing other risks online

Since the adoption of the e-Commerce Directive, the volumes of information and commercial offers available online have increased tremendously, resulting in some information society service providers (e.g. online platforms) becoming important players in the ‘attention economy’. They increasingly not only intermediate access to information and business offers, but also optimise the discoverability of the most relevant information for each of their users individually. Today, there is virtually no online service, website or app that does not make some decisions on what it considers relevant to each of its users, and that does not define criteria for matching the information it presents to its users. This includes ranking systems in embedded search functions (or in search engines), recommender systems and, indeed, more or less complex advertising placement services, including micro-targeting.

Where wide audiences can be reached, the potential negative effects of such information amplification systems are more prominent. These negative effects may be manifold, ranging from the amplification of illegal content through such systems to the amplification of content which is not per se illegal but may be harmful 82 .

During the open public consultation, users expressed mixed views on whether they understand why certain content or products are recommended to them. Some consider that it is hardly possible to understand why a certain product or content is addressed to them, while others consider that what they see is related to other products they bought or to searches done on the platform and on the web (cookies). Users are unhappy that they are not provided with information on how their behaviour is tracked on the web and how their data are used to build recommendation algorithms.

Furthermore, several digital users’ associations have pointed out that, beyond the hosting of illegal content, the actual problem is its dissemination through algorithms predicated on increasing platform engagement rather than on the health, safety and wellbeing of the user. Algorithms seem to promote content with a high level of engagement, often disregarding the fact that this content might incite violence or spread misinformation.

While the reflections on, and evidence of, the extent of the possible issues and harms are evolving, several problems cut across such systems:

·Users lack meaningful information about how these systems function and have no possibility to influence them.

·There are very few ways of researching and testing the effects of such systems. Most of the evidence and information about harms relies on the investigations, and the willingness to cooperate, of the information society service providers themselves.

3.2.Transposition and implementation of the Directive

3.2.1.General outline

The e-Commerce Directive entered into force on 8 June 2000 and the deadline for its transposition was 17 January 2002.

Whilst compliance with the Directive’s requirements is to be controlled primarily by the competent national enforcement authorities, the Commission has monitored on a regular basis the transposition and application of the Directive by individual Member States (see also section 1.2 above for information about past evaluations of the Directive).

The Commission’s experience with the implementation of the e-Commerce Directive shows that the majority of Member States have transposed the provisions of the Directive largely verbatim, and to date there have been only a few cases in which the Commission was required to assess the compliance of national implementing measures with the e-Commerce Directive. None of these cases led to a referral of the Member State in question to the Court of Justice for non-compliance with the e-Commerce Directive.

That said, the experience from the notifications of national legislative measures under the Transparency Directive in particular points to an increasing number of national measures that result in legal fragmentation of the rules applicable to information society service providers in the internal market, raise questions of compliance and hinder the cross-border provision of information society services.

This concerns in particular compliance with the internal market principle laid down in Article 3 of the e-Commerce Directive, which was one of the main subjects of the Commission’s comments on national measures applicable to information society services notified under the Transparency Directive. It also concerns the compliance of an increasing number of national legislative measures with Article 14 of the e-Commerce Directive.

3.2.2.Extraterritorial application of national laws and fragmentation of the internal market for information society services

The Commission has observed over the last few years, in particular through the notifications of national measures applicable to information society services under the Transparency Directive, an increasing trend towards the regulation of information society services in Member States. This is mainly true as regards the duties and obligations of online platforms to address content hosted on their services that would be illegal under national law. 83  

Some of the recent national measures adopted by Member States in this regard aim to apply to any provider of hosting services with a distinctive presence in their national territory, irrespective of its place of establishment. This means that, under these national laws, the country of destination would also be competent to supervise the compliance of the relevant services with the applicable national rules and obligations, including, where provided for by law, by imposing cross-border sanctions.

Member States have justified the adoption of national laws with cross-border application by the need to protect their citizens against the rise of illegal content being intermediated on hosting services. They claim that the regime set out in the e-Commerce Directive, and in particular the available derogations from the internal market principle, does not cover these practices or is not sufficient to ensure the protection of their national users in view of the realities of the online environment. 84

In the absence of harmonized obligations for online platforms to address this issue, this situation is prompting Member States to put forward new initiatives aimed at protecting their citizens from illegal content online. Regardless of the legitimacy of the policy goal, the extraterritorial application of most of these national measures to online platforms established outside the concerned Member States adds to the existing legal fragmentation in the internal market.

Respondents to the open public consultation point to several issues stifling the development of the internal market for digital services, such as legal fragmentation and definitional vagueness, jurisdictional conflicts, and a lack of regulatory consistency.

3.2.3.Cooperation between Member States and lack of clarity on the use of appropriate cooperation mechanism

General outline

The evaluation shows that the competent authorities have difficulties in supervising information society services, in part because they lack the necessary data and information, in part due to a lack of capability and technical resources. The evaluation points to several issues:

·First, the experience points to instances of a lack of cooperation and trust among authorities, and the assistance mechanisms provided by the e-Commerce Directive are underutilised by Member States (see analysis further below). In some instances, Member States preferred the avenue of national legislation, fuelling legal fragmentation, with significant costs for service providers and an unequal and inefficient protection of European citizens depending on the Member State where they reside.

·Second, authorities lack data and information, as well as the means to gather such evidence, and lack the technical capability to process and inspect technically complex services. Similarly, they lack means for supervising the underlying activity intermediated by online platforms. For example, in the area of the collaborative economy in the accommodation sector, complaints from cities mainly relate to data access requests, which often go unanswered by the online platforms that facilitate the interaction between the provider of an accommodation service and the consumer. Such requests are often refused on the basis of the GDPR or are not satisfied due to the inefficient cooperation mechanism with the country of origin. Finally, the aggregate data that these online platforms may provide or publish do not address Member States’ need for specific, individualised data.

·Third, several authorities within each Member State are responsible for supervision of the different aspects of the information society services. In the targeted consultation of Member States, eight of them pointed to the multiple mechanisms for sending and receiving requests for investigation in various areas such as consumer protection or audiovisual content, and the need for clarity and ensuring timely cooperation within and across instruments (see further analysis of the issue below).

·Fourth, the evaluation shows that the competent authorities often have very few means, if any, to intervene when services are established outside the EU, even though such services can easily be used by European consumers.

Functioning of the existing cooperation mechanisms

Compared to the 2012 e-Commerce Directive implementation report 85 , the application of the internal market principle, the features of the cooperation mechanism and the effects of notification have been subject to some developments.

First of all, pursuant to Article 29(3) of the IMI Regulation 86 , a pilot project was launched in 2013 with a view to evaluating the use of the IMI information system as an efficient, cost-effective and user-friendly tool to implement Article 3(4), (5) and (6) of the e-Commerce Directive. Since 2013, requests to take measures addressed by authorities of the country of destination to the country of origin of the service provider, as well as notifications to the Commission and the country of origin of the intention to adopt measures derogating from the internal market principle in view of insufficient or absent measures by the country of origin, are normally channelled through IMI. This pilot project aimed at providing a comprehensive platform for notifications between Member States and the Commission, even if a few individual cases have been reported where notification was done through other means, as this tool is not specifically mandated in the e-Commerce Directive.

Within this context, the trend identified in the 2012 e-Commerce Directive implementation report, showing a very low number of notifications (approximately 30 in the first 9 years), has partially evolved, even if the numbers remain low compared to the extent of cross-border online activities 87 .

Between 2013 and July 2020, 111 notifications have been filed with the Commission, with a request to derogate from the internal market principle 88 .

Figure 11: Number of IMI notifications

Still, the use of the platform appears quite concentrated, with only a handful of Member States having used it and an overwhelming number of notifications originating from only two Member States (Italy and, at the time, the United Kingdom, the latter only concerning value-added phone services).

Figure 12: Number of IMI notifications per Member State

In the majority of cases (57), moreover, the urgency procedure was activated, in spite of the fact that it should be used only in exceptional circumstances. All notifications were justified on the basis of the protection of consumers (in only a couple of cases accompanied by the protection of health), for which another cooperation mechanism is also available for the enforcement of EU consumer protection legislation under the Consumer Protection Cooperation (CPC) network, whose new provisions 89 became applicable as from January 2020 and whose cooperation mechanism has, as from 2020, also been hosted by the IMI platform.

Finally, no decision has been adopted by the Commission so far as regards the measures adopted, taking also into account that these are normally closely linked to the specific facts at stake. It is not clear, however, whether the relatively low number of notifications reflects a very limited number of cross-border issues or rather an under-utilisation of the tool by some or all authorities in different Member States.

Surveys among the competent authorities in the context of the evaluation of the ECD-IMI pilot project show that awareness and utilisation of the tool vary considerably among Member States and, within Member States, among different competent authorities. Out of 26 Member States replying to the survey in 2019, 10 had never used the tool; moreover, a majority of responding Member States (11), while supporting the use of IMI, suggested improving support for and awareness of the system.

More generally, some Member States participating in the survey highlighted issues requiring clarification, in particular the interrelationship with other cooperation systems (notably the CPC network), the kind of measures to be notified, and the interrelationship with other notification systems (such as TRIS). Moreover, a majority of responding Member States (16) expect that the current practice of national authorities will change following the recent ruling of the Court of Justice in the Airbnb case.

The consequences of a lack of notification under Article 3 of the e-Commerce Directive have also recently been clarified by the Court of Justice in the context of the Airbnb case 90 (C-390/18), where the Court stated that “an individual may oppose the application to him or her of measures of a Member State restricting the freedom to provide an information society service which that individual provides from another Member State, where those measures were not notified in accordance with that provision”. Measures that Member States failed to notify in accordance with Article 3(4) of the e-Commerce Directive are hence unenforceable against individuals.

At the same time, the Court, while confirming that the notification obligation also applies to provisions predating the e-Commerce Directive, did not clarify which measures are to be notified and when, nor the interrelationship with other notification systems such as that provided for by the Transparency Directive. While some aspects may be further specified in the forthcoming judgment on online pharmacies 91 , currently the e-Commerce Directive does not provide any indication.

During the meeting of the e-Commerce expert group in October 2019, the issues of cooperation and use of IMI were discussed as well. Despite the differences in the use of IMI, Member States widely expressed the need to have a functioning, strengthened but also simple cooperation mechanism in the future, as this is important to ensure public interests in cross-border issues.

In the context of the evidence gathering for the purposes of the present evaluation the Commission sent also a targeted questionnaire to Member States enquiring about the national experiences on the e-Commerce Directive in the wider framework of challenges and opportunities brought forward by the evolution of digital services.

Overall, 21 replies from 17 Member States were received (in one Member State, 5 authorities replied). Concerning the functioning of the cooperation mechanism and the country-of-origin principle (COO) enshrined in the e-Commerce Directive, different aspects are stressed, taking also into account that a number of authorities (7) did not report direct experience of the system in sending and/or receiving cooperation requests.

A number of Member States (ES, LV, AT, DE) expressed dissatisfaction with the average timing or quality of the feedback received from other authorities. Cooperation was considered to work better on issues harmonised by EU law (consumer protection, transparency requirements). Some Member States reported concerns about the use of the system for the application of national requirements, for which the country of origin might not have corresponding powers to enforce the request.

Eight Member States highlighted the parallel use/existence of specific cooperation systems alongside that of the e-Commerce Directive, for both sending and receiving requests of investigation (e.g. CPC Network).

According to some Member States, the low number of cooperation requests is explained by low awareness of the system (EL), but also by a well-functioning system of injunctions/notice-and-action, ensuring removal of illegal content by the provider directly (LU). A few Member States (DE, AT) nonetheless indicate an increasing trend of cross-border issues, in particular as regards content.

In view of the significant consequences of a failure to notify for the individual acts restricting the provision of information society services, it therefore remains uncertain to what extent the existing cooperation mechanism provided for under the e-Commerce Directive ensures the necessary legal certainty and transparency for all parties involved regarding compliance with that requirement.

During the open public consultations several stakeholders expressed their view on the question of cooperation among national authorities.

Trade associations, digital users’ associations and companies consider that cooperation should be improved significantly both between Member States and between different authorities within each Member State. In addition, the quality of intervention varies greatly between authorities and there is often a need for more capabilities and resources.

Content creators and right holders are concerned with the fact that, while copyright is largely harmonised across the EU, there is no system in place for national authorities to cooperate on the enforcement of those rights. They state that “cooperation mechanism for cross-border cases established in the e-Commerce Directive does not function in practice” and that “the 2004 IPRED Directive, as currently implemented by different EU Member States, varies tremendously and leads to a lack of clarity”.

Several national authorities consider that the quality of cooperation is good (Spain, Malta, Greece, Italy, Finland, Hungary, Portugal, France, Austria). Some point to issues and room for improvement (Sweden, Belgium, the Netherlands, Ireland). The Netherlands Authority for Consumers and Markets points out that cooperation could be improved and considers the new CPC Regulation, applicable from January 2020, a potentially important improvement in the EU enforcement of consumer protection rules. The Belgian government points out the need for better cooperation in tackling and preventing the dissemination of illegal content online.

3.2.4.Fragmented national laws applicable to hosting service providers

National implementing measures range from an almost literal transposition of Article 14 of the e-Commerce Directive, without any further clarification of the obligations for hosting services, to stricter and more detailed rules on the systems to be put in place by such services to remove or disable access to illegal content. The lack of a harmonised system for triggering “actual knowledge” has been understood by some Member States as pointing to a so-called “notice-and-takedown” system.

In this context, nine Member States (Finland, France, Germany, Greece, Hungary, Italy, Lithuania, Spain and Sweden) have implemented a notice-and-action procedure in their legislative frameworks. For five of them (Finland, Greece, Hungary, Italy and Spain) this only applies to copyright infringements and related rights thereof.

Furthermore, in a few Member States (Finland, France, Hungary, Lithuania), minimum requirements for the notice are defined in law, to ensure that it is sufficiently motivated. In Member States without statutory requirements for notices, case law has provided indications concerning the content of the notice and the mechanism.

Recently, the Commission commissioned and published an external study looking into the different regimes adopted by Member States in their transposition of Article 14 of the e-Commerce Directive. 92 The findings of the study point to clear fragmentation in the national legal mechanisms adopted. 93

The information available in the study shows that the majority of Member States have followed an almost verbatim transposition of Article 14 of the e-Commerce Directive. That said, some Member States have provided for specific liability exemptions for information location services (search engine services) and hyperlinking services. For example, Austria, Hungary, Spain and Portugal have adopted specific liability exemptions for search engines, under which a company can benefit if it meets the conditions that hosting service providers are required to meet in order to secure a liability exemption. Similarly, Austria, Spain and Portugal have adopted liability exemptions for hyperlinks, applying the same conditions as the Directive’s liability exemption for hosting activities. 94  

Among the remaining Member States, approaches differ not only as to whether notice-and-action procedures and minimum requirements for notices are provided for, but also with regard to when ‘expeditious removal’ occurs, what is understood by ‘knowledge’, and what specific provisions exist to safeguard freedom of expression.

There are different interpretations in Member States’ national laws as to the exact conditions under which a hosting service provider is deemed to have actual knowledge of the illegal activity or information stored by a third party. Most Member States leave this to be decided by national courts on a case-by-case basis. The open public consultation has shown some uncertainty as to the application of Article 14 ECD to hosting services; national courts have taken divergent stances on whether certain services must be regarded as hosting activities within the meaning of Article 14 ECD.

Also, some Member States require a declaration of illegality from a competent authority or limit the obligation to ‘manifestly illegal content’. For example, in Romania, the hosting service provider is deemed to have ‘knowledge of the fact that the activity or information is illegal’ when its illegal character has been established by a decision of a public authority.

Finally, the removal of, or disabling of access to, certain content can have a negative impact on the exercise of the rights to freedom of expression and information. It is therefore important that content providers, as also stipulated in the 2018 Commission Recommendation on tackling illegal content online, be given an opportunity to submit a counter-notice to defend the legality of the information at issue.

However, the analysis of the existing situation shows that there are again differences between Member States. In 13 Member States, some form of opportunity to dispute the allegation exists. Yet the situations and conditions in which counter-notices are possible differ greatly amongst Member States. For example, a counter-notice in Estonia is only possible when the removal is ordered by a government agency; in Finland, Greece, Hungary, Ireland, Italy and Spain, counter-notices are only possible in the context of copyright; and in Luxembourg, a counter-notice is only possible during the proceedings on the merits.

In eight Member States (Bulgaria, Estonia, France, Germany, Greece, Lithuania, Portugal and Sweden), some form of alternative dispute settlement mechanism exists. For example, in Portugal, an out-of-court preliminary dispute settlement is possible where the illegality of the case is not obvious; in Estonia, a specific alternative dispute regime exists for copyright infringements, under which a dedicated committee can resolve disputes.

3.2.5. Lack of clarity and transparency on content moderation activities

The evaluation also shows that some information society service providers, in particular larger online platforms, set the rules of the game on their services, rules which, however, have a wider societal impact. They not only set their own content and market policies and enforce them, but also choose what to report on and to whom, as well as what information to give to their users.

Only 2% of the respondents to the open public consultation (among those who replied to the relevant question) state that they were informed by the platform before their content/goods/services were removed or blocked. Most of them were not able to follow up on the information. In addition, the vast majority of users were not informed after they provided a notice to a digital service asking for the removal or disabling of access to content/goods/services (only 13% were informed, 21% were informed on some occasions and 66% were not informed at all).

There are several aspects to the opacity and lack of accountability of online platforms:

First, users lack effective ability to:

·Report illegal activities they witness or are subject to on a particular service, and follow up on the actions taken.

·Seek redress when their content is taken down, and be appropriately informed of the rules and of the measures taken by the service provider.

·Clearly understand how information, services and goods are prioritised, on what grounds, and what choices they have at hand.

·Know and understand when they are being presented with ads, in particular when they are being profiled and targeted.

Second, users - citizens, but also small businesses and organisations using very large platforms - cannot be solely responsible for ‘supervising’ such complex and impactful systems. At the core of the matter lie large information asymmetries, and there are only very limited means for researchers, civil society or other third parties to inspect or understand platforms’ systems, in particular where algorithmic tools are used.

Finally, authorities very often lack sufficient information to appropriately supervise information society services.

3.3.The legislative developments outside the e-Commerce Directive

Since the adoption of the e-Commerce Directive in 2000, several new pieces of EU legislation applicable to information society services have been adopted. These legislative measures cover various aspects of the provision of information society services in the internal market.

Some of the most relevant examples of such legislative measures 95 are:

·Directive 2019/790 on copyright and related rights in the Digital Single Market (the “Copyright Directive”), which introduces a new conditional liability regime for online content sharing services.

·Directive 2018/1808 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (“AVMSD”). The AVMSD requires that Member States ensure that video-sharing platform services take appropriate measures to protect minors from harmful content, and to protect the general public from illegal hate speech and content whose dissemination constitutes a criminal offence under Union law as well as measures to ensure compliance with commercial communications requirements under the AVMSD.

·Directive (EU) 2017/541 on combating terrorism, which requires Member States to take the necessary measures to ensure the prompt removal of online content constituting a public provocation to commit a terrorist offence. In addition, a Regulation specifically addressing the obligations of online hosting service providers with regards to terrorist content disseminated by their users was proposed in 2018 and is currently under negotiation between the co-legislators. 96

·Regulation 2019/1020 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011. This Regulation for example requires information society service providers to cooperate with the market surveillance authorities, at the request of the market surveillance authorities and in specific cases, to facilitate any action taken to eliminate or, if that is not possible, to mitigate the risks presented by a product that is or was offered for sale online through their services.

·Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography (“CSAM Directive”). This Directive obliges Member States to take the necessary measures to ensure the prompt removal of web pages containing or disseminating child pornography hosted in their territory and to endeavour to obtain the removal of such pages hosted outside of their territory.

·Unfair Commercial Practices Directive (“UCPD”). 97

·Consumer Rights Directive (“CRD”). 98

·Directive (EU) 2019/770 on certain aspects concerning contracts for the supply of digital content and digital services, 99 which lays down common rules on certain requirements concerning contracts between traders and consumers for the supply of digital content or digital services.

While the majority of the legislation adopted after the e-Commerce Directive also lays down the relationship between the different sets of rules, it is important to note that several stakeholders, including Member States, raise questions about the interplay between the different sets of rules or, as shown further below, between the different cooperation mechanisms (see section 3.2.3 below).

3.4.The developments of the case law

The role of the CJEU in interpreting the provisions of the e-Commerce Directive has been instrumental, both in view of the significant developments since its adoption and of the many open questions about the relationship between digital services and the underlying services that some of them facilitate. Preliminary ruling references have given rise to more than 20 judgements concerning the interpretation of the e-Commerce Directive. Conversely, no case has been brought before the Court by the Commission, in its capacity as guardian of EU law, for a possible infringement of the Directive.

As shown above, digital markets and services have developed significantly since the adoption of the e-Commerce Directive, with many services appearing (and disappearing) that did not exist at the time of its adoption. Unsurprisingly, several notions and principles of the e-Commerce Directive have therefore been subject to interpretation by the Court of Justice, as shown further below.

Definition of information society services

Since the e-Commerce Directive applies to information society services, the precise meaning of that notion is essential for the qualification of a specific service.

While an information society service should normally be provided for remuneration, the Court of Justice clarified in the Papasavvas and Mc Fadden cases that an information society service does not have to be paid for by the recipient of the service (and can be free for him or her), but can instead be financed by income generated by advertisements. 100

In Ker-Optika the Court clarified that “activities which, by their very nature, cannot be carried out at a distance or by electronic means, such as medical advice requiring the physical examination of a patient, are not information society services, and consequently, do not fall within the scope of that directive.” 101

Furthermore, the question of the legal qualification of a specific service is becoming increasingly important in the context of the collaborative economy to determine whether the platforms providing collaborative services could be considered as providers of information society services.

For example, in the Elite Taxi 102 case the CJEU held that “an intermediation service that enables the transfer, by means of a smartphone application, of information concerning the booking of a transport service between the passenger and the non-professional driver who will carry out the transportation using his or her own vehicle, meets, in principle, the criteria for classification as an ‘information society service’.”

However, the Court held in that specific context that the intermediation service must be regarded as forming an integral part of an overall service whose main component is a transport service and must therefore be classified as ‘a service in the field of transport’ and not ‘information society service’. The Court reached this conclusion based on the following factors:

·Uberpop 103 provided drivers with an app without which the transport service would not have taken place; and

·Uberpop exerted a decisive influence over the conditions under which the transport service was provided by setting the fare, controlling the quality of the vehicles or setting minimum safety standards.

In another case, concerning the relationship between an accommodation intermediation platform and the providers of accommodation services, the Court reached the conclusion that intermediation services such as those provided by Airbnb cannot be regarded as forming an integral part of an overall service, the main component of which is the provision of accommodation. 104

The Court notably held that Airbnb Ireland did not exercise a decisive influence over the conditions for the provision of the accommodation services to which its intermediation service relates, particularly since it:

·Did not determine, directly or indirectly, the rental price charged; nor

·Did it select the hosts or the accommodation put up for rent on its platform.

On the other hand, in a case about the regulation of short-term letting of furnished premises, the Court subjected the provision of such services to the rules set out in the Services Directive. In this way, the Court clearly distinguished the provision of the offline accommodation services from the online intermediation service. 105

Internal market principle

The internal market principle, as an important pillar of the e-Commerce Directive, has also been subject to several important preliminary ruling judgements since 2000.

In the eDate Advertising case, the Court held that according to the internal market clause Member States must ensure that, “in relation to the ‘coordinated field’ and subject to the derogations authorised, the provider of an information society service is not made subject to stricter requirements than those provided for by the substantive law applicable in the Member State in which that service provider is established”. 106

In the Cornelius de Visser case, the Court held that “Article 3(1) and (2) of the e-Commerce Directive does not apply to a situation where the place of establishment of the information society services provider is unknown, since application of that provision is subject to identification of the Member State in whose territory the service provider in question is actually established.” 107

In a recent case on the online sale of medicines without prescription, the Court held that “a Member State of destination of an online sales service relating to medicinal products not subject to medical prescription may not prohibit pharmacies that are established in another Member State and sell such products from using paid referencing on search engines and price comparison websites.” 108

As regards the scope of the ‘coordinated field’ to which the internal market clause applies, the Court held in the Ker-Optika case that “the coordinated field covers the online selling of contact lenses but does not cover the physical supply of contact lenses as the former is online while the latter is not”. 109 Furthermore, in the Vandenborght case, the Court decided that “the coordinated field covers a national law imposing a general and absolute prohibition of any advertising relating to the provision of dental care services, inasmuch as it prohibits any form of electronic commercial communications, including by means of a website created by a dentist”.

Finally, as far as the derogation clause in Article 3(4) of the Directive is concerned, the Court held in the Airbnb Ireland case that “if a Member State takes measures that derogate from the principle of the freedom to provide information society services without complying with the procedural conditions of the e-Commerce Directive (in particular, the notification to the Commission and the other Member States), those measures cannot be applicable against such provider of an information society service”.

Liability of intermediary services providers

Since the adoption of the e-Commerce Directive, there have been several cases concerning the interpretation of the provisions on the liability safe harbour for intermediary service providers. The large majority of these cases came from the area of intellectual property rights.

Several points in the case law are relevant in this context:

·Services that can benefit from the liability safe harbour: the case law clarified that a number of services can qualify for one of the safe harbours, such as a social network 110 , an online marketplace 111 , a keyword advertising service 112 , internet access providers 113 or the “provider” of a Wi-Fi network 114 . This is a particularly important point bearing in mind that many of the services that exist today did not exist at the time of the adoption of the e-Commerce Directive, or at least not in their current form. Such services include for example content delivery networks (“CDNs”), virtual private networks (“VPNs”), Infrastructure as a Service (“IaaS”) or Platform as a Service (“PaaS”).

·Existence of actual knowledge about illegal information: the case law clarified that the e-Commerce Directive does not harmonise the procedures for acquiring knowledge, but it requires hosting providers to behave as diligent economic operators. 115 It also clarified that Article 14 of the Directive requires knowledge about the illegality of information, and not just its existence. 116 Finally, the Court also clarified that actual knowledge can be obtained by means of a notification that is “sufficiently precise or adequately substantiated”. 117

·Scope of the hosting safe harbour: the case law clarified that the scope of the safe harbour depends on the distinction between the active or passive role that the intermediary service provider may play in relation to the content provided by the third party. An important question is to establish whether “an operator has not played an active role allowing it to have knowledge or control of the data stored”. 118 Such an active role exists, for example, where an operator of an online marketplace “provides assistance which entails, in particular, optimising the presentation of the offers for sale in question or promoting them”. The Court also clarified that “the mere fact that the operator of an online marketplace stores offers for sale on its server, sets the terms of its service, is remunerated for that service and provides general information to its customers cannot have the effect of denying it the exemptions from liability”. 119

·Prohibition of a general monitoring obligation: the case law clarified that abstract, non-targeted filtering, as requested by a court against a social network and an internet access provider, is prohibited under the e-Commerce Directive. 120 On the other hand, the Court also clarified that a national court can impose, within the limits of “specific monitoring obligations”, certain remedies such as:

I.measures against repeated infringers by a trading platform; 121

II.disabling access to a specific website by an internet access provider; 122

III.protecting an open Wi-Fi network with a password; 123

IV.injunction extended to information, the content of which, whilst essentially conveying the same message, is worded slightly differently, because of the words used or their combination, compared with the information whose content was declared to be illegal. 124

4.Evaluation questions

This evaluation assesses the e-Commerce Directive against the five Better Regulation criteria, namely effectiveness, efficiency, relevance, coherence and EU added value, using the specific evaluation questions for each of them.

4.1.Effectiveness

1.Has the e-Commerce Directive attained its initial objectives?

2.What gaps have been identified?

3.To what extent have legislative developments in recent years been able to contribute to the attainment of the objectives of the e-Commerce Directive?

4.2.Efficiency

4.Are the costs of the e-Commerce Directive proportionate to the benefits that the Directive brings for stakeholders?

4.3.Relevance

5.How well, if at all, do the objectives of the e-Commerce Directive still correspond to the needs?

4.4.Coherence

6.Is the e-Commerce Directive coherent with other EU legislative instruments that apply to information society services?

4.5.EU added value

7.What is the added value resulting from the e-Commerce Directive compared to what could be achieved without such intervention?

5.Methodology

5.1.Short description of methodology

The evaluation of the e-Commerce Directive started in June 2020 with the publication of the joint roadmap/impact assessment. The Inter-service Steering Group (details in Annex 1 of the Impact Assessment) was consulted and gave input to this evaluation report.

Open public consultations

The Commission has conducted several open public consultations on the issues related to the present evaluation (see Annex 2 of the Impact Assessment for further details). This evaluation also takes account of the input to the public consultation on the joint roadmap/impact assessment concerning the Digital Services Act.

Workshops with stakeholders

This evaluation also takes account of the information collected through numerous workshops organized with stakeholders on a number of the issues covered, such as national measures for tackling illegal content online or market and technological developments concerning intermediary service providers (see Annex 2 of the Impact Assessment for further details).

E-commerce Expert Group meetings with Member States

In particular, during the e-Commerce Expert Group meeting of 8 October 2019 125 , the main principles of the e-Commerce Directive were discussed with Member States, as well as the latest developments at national level (see Annex 2 of the Impact Assessment for further details).

Targeted consultation of the Member States

As part of the evaluation process, the Commission also sent a targeted questionnaire to all Member States, asking in particular about their experience with the cooperation mechanisms under Article 3 of the e-Commerce Directive, as well as with the implementation and application of other provisions of the Directive.

The information collected from these groups of stakeholders supported the analysis of the take-up and impacts of the measures. A core part of the evaluation relies on legal analysis, not least in light of the coherence of interpretations in case law. This analysis also relies on interviews with a number of judges involved in the headline cases as well as other legal experts (see Annex 2). Further, economic data was exploited to understand the evolution of the digital sector.

5.2.Limitations and robustness of findings

As regards the evaluation criterion of efficiency, it proved difficult to collect quantitative evidence on the costs of applying the e-Commerce Directive. While some data has been obtained through the open public consultation, the assessment of costs has primarily relied on modelling the costs of the specific types of options considered in the Impact Assessment.

6.Analysis and answers to the evaluation questions

6.1.Effectiveness

This section assesses to what extent the initial objectives of the e-Commerce Directive have been met and what gaps have been identified.

6.1.1.Facilitating the internal market for information society services by removing legal fragmentation

The evaluation shows that Articles 3 and 4 of the e-Commerce Directive, in particular, have allowed information society services to be provided and accessed cross-border in the internal market. This has happened across all layers of the internet and the web and has enabled the successful entry and growth of many EU companies in different segments of the market (e.g. Zalando, Spotify, Deezer, Booking, 1&1, Seznam, etc.). At the same time, the evaluation also showed that some very large platforms with global scale have entered the internal market and reshaped several segments of it.

At the same time, there is clear evidence of legal fragmentation 126 and of the differentiated application of the existing rules by Member States, including national courts. There is also an increased tendency of Member States to adopt legislation with extraterritorial effects and to enforce it against service providers not established in their Member State. Such enforcement, in turn, reduces the necessary trust between the competent authorities and undermines the well-functioning of the internal market for information society services.

In particular, Member States have begun to regulate at national level, increasing the fragmentation of the single market, especially for hosting service providers as one form of intermediary service providers, which includes online platforms.

Furthermore, a number of Member States have introduced diverging ‘notice-and-action’ rules. National rules diverge not only in scope, but also in the specific requirements they set – e.g. the minimum content of a notice, limitations to ‘manifestly illegal’ content, the interpretation of the different means for obtaining ‘actual knowledge’ about an illicit activity, and interpretations of ‘expeditious’ removal of content. Furthermore, Member States are starting to adopt legislation which not only sets specific obligations and sanctions, but also allocates competence to the country of destination over some services established elsewhere, by requiring a legal representative within the territory of the country of destination if the service reaches a certain threshold of users on that territory.

In addition, there is also evidence of considerable lack of trust between Member States, as some countries seriously doubt the willingness of the authorities of the country of destination to protect the interests of their citizens. The evaluation also shows that Member States often do not use the cooperation mechanism provided for in the e-Commerce Directive.

As a result, this fragmentation makes it even harder for smaller EU companies to scale up at home, and gives an edge to very large platforms, which can afford legal teams in every country. It also leads to uncertainties in the application and enforcement of law across the internal market.

6.1.2.Removing legal uncertainty in relation to liability of intermediary service providers

The liability safe harbour provisions laid down in Section 4 of the e-Commerce Directive have provided for a necessary minimum of legal certainty for online intermediaries to emerge and scale across the single market and to develop innovative services catering to the needs of consumers. However, conflicting interpretations in national court cases (sometimes even within the same Member State) have introduced a significant level of uncertainty; in addition, an increasing fragmentation of the single market raises barriers for EU scale-ups to emerge.

The liability regime has only partially reached its objective to incentivise the effective removal of illegal content, in particular in the absence of legally binding, common procedural obligations (“notice-and-action”) across the internal market. This has in turn also led to an increased fragmentation of requirements at national level (see also section 6.1.1 above).

In addition, several stakeholders referred to the uncertainty created by case law interpretations 127 as to which activities would qualify a hosting service as an ‘active’ host, as opposed to the ‘passive’ role necessary to benefit from the conditional liability limitation. This uncertainty has created disincentives for platforms, in particular SMEs, to apply voluntary, proactive measures against illegal activities.

In addition, in view of the significant developments of the digital economy and services, the question arises under which conditions new types of intermediary services (e.g. virtual private networks; content delivery networks) could benefit from the liability safe harbour.

Other factors equally contribute to the lack of effectiveness: the exclusion of operators established outside the EU, the emergence of mega-platforms that, by their sheer reach, aggravate the extent of harm caused by the dissemination of illegal content, and a lack of transparency and reliability of results when platforms do take measures.

Similarly, the provisions have only partially achieved the balancing objective of protecting fundamental rights. They provide stronger incentives for the removal of content than for the protection of legal content, and also lack appropriate oversight and due process mechanisms, especially in situations of so-called ‘privatised law enforcement’.

Finally, the current system tends to ask platforms in particular to take decisions on the legality and removal of content, often also without meaningful transparency on processes and outcomes.

6.1.3.Removing disincentives for consumers going online and concluding transactions online

The e-Commerce Directive seeks to harmonize all steps in the provision of an information society service (from the establishment of the information society service provider and information about its services, to provisions on contracts, the advertising of the service, etc.). The objective of the e-Commerce Directive was to ensure that consumers know what guarantees they have when going online and that the provision of digital services is ensured across the internal market.

To this end, the e-Commerce Directive primarily laid down a set of embryonic transparency obligations concerning general information about the information society service provider (Article 5), (unsolicited) commercial communications (Articles 6-8), the recognition and treatment of contracts (Article 9) and information prior to placing orders (Articles 10-11).

The evaluation shows that while the provisions have set the minimum conditions for consumer trust and the provision of digital services and are still valid today, they have been largely complemented by a rich corpus of further rules and harmonisation measures in areas such as consumer protection and the conclusion of contracts at a distance, including by online means. Furthermore, as shown through several enforcement actions by the CPC Network, some provisions, such as the information requirements applicable to information society service providers, suffer from patchy and widely varying compliance by information society service providers.

Furthermore, the fundamental changes in the variety and scale of information society services, as well as of the technologies deployed and online behaviour, have led to the emergence of new challenges, not least in terms of the transparency of online advertising and of the algorithmic decision-making to which consumers and businesses are subject.

In addition, there are several new technological developments that raise important questions as to their use and possible legal implications. For example, in relation to electronic contracts, the use of blockchain technology and smart contracts is increasingly gaining traction, which raises certain regulatory questions.

Finally, the evaluation does not allow conclusions to be drawn on the actual extent of the implementation and use by Member States of codes of conduct and out-of-court dispute resolution mechanisms in relation to digital services, whether in general or in more limited fields.

6.1.4.Preliminary conclusion on effectiveness

It can therefore be concluded that while the e-Commerce Directive, and in particular Articles 3 and 4 thereof, has provided an important incentive for the growth of the internal market for information society services and enabled the entry and scaling up of new service providers, the initial objectives have not been fully achieved.

In particular, the exponential growth of the digital economy and the appearance of new types of service providers raise certain new challenges that require reflection on a possible update of the existing objectives. In addition, these developments and challenges put an additional strain on the achievement of the existing objectives, as the increased legal fragmentation and the undermining of the well-functioning internal market for information society services show.

Several new regulatory instruments (see in particular section 3.3 above) make valuable contributions to the attainment of some of the policy objectives set out in the e-Commerce Directive. Yet, while providing sector-specific solutions for some of the underlying problems (e.g. in addressing the proliferation of a specific type of illegal activity), they do not necessarily address the same issue for the entire digital ecosystem (e.g. because they are limited to certain types of services, certain types of content, or to operators established within the EU territory). Furthermore, while the voluntary measures have generally shown positive results, they cannot be legally enforced, nor do they cover all participants in the digital economy.

In addition, these measures do not adequately address the problem of fragmentation of the single market, due to a growing number of national legal measures, nor do they address the problem of uneven and ill-coordinated enforcement across the internal market (although sector-specific regulators, such as in the area of media, clearly contribute to that objective as well).

Moreover, these measures do not solve the problem of the lack of legal clarity concerning the scope of the liability provisions of the e-Commerce Directive itself (e.g. which types of services would be covered by the relevant provisions of the e-Commerce Directive). As a result, they cannot address the disincentives for intermediary service providers to act more proactively, nor do they provide for overall transparency of, and accountability for, the measures taken by (in particular large) intermediary service providers (concerning the effectiveness of those measures and their impact on freedom of expression and information).

6.2.Efficiency

A core efficiency test for the e-Commerce Directive relates to the costs and frictions in the cooperation across Member States’ authorities, in line with Article 3 of the Directive.

The internal market principle in itself, and the cooperation across Member States, have been fundamental to avoiding a significant duplication of costs across authorities and to ensuring a level of effectiveness in the supervision of digital services (see previous sections).

However, there is a lack of clarity and reliability of response in the cooperation mechanism, which increases the uncertainties for Member States. In addition, with several mechanisms available (see Annex 7), several Member States reported that it is not clear which channel should be used. The duplication of efforts and the lack of procedural clarity generate significant costs. Quantification of these costs was not possible on the basis of the data available and reported by the Member States. Based on the differences in the use of the cooperation mechanisms 128 , it is clear that Member States experience these costs to different extents.

In terms of costs for businesses, the e-Commerce Directive only imposes a limited number of obligations – such as information requirements (Article 5) or disclosure regarding commercial communications (Article 6). Instead, the Directive harmonises measures and provides legal certainty to reduce costs and to ensure a level playing field across the Union.

At the same time, in light of the evolution of the digital sector, evolving case law and, importantly, increasing legal fragmentation, significant costs have emerged for digital services. These are presented in detail in the Impact Assessment report, and relate in particular to the legal uncertainties and fragmentation that have emerged ‘on top’ of the e-Commerce Directive. The internal market loss in this regard is estimated at between 1% and 1.8% of total turnover from the cross-border provision of digital services.

6.3.Relevance

This section assesses to what extent the initial objectives of the e-Commerce Directive are still relevant today.

6.3.1.Facilitating the internal market for information society services by removing legal fragmentation

The evaluation shows that the proper functioning of the internal market for information society services very much remains a valid objective.

The overwhelming majority of the replies to the open public consultation point to the importance of preserving the internal market principle if one is to ensure that any type of digital service provider that aspires to start up and grow in Europe may do so. The evaluation confirms that the internal market principle is instrumental for service providers to grow and expand cross-border. It also shows that only very large, well established information society service providers have the capacity to comply with 27 potentially diverging legal obligations and with 27 ill-coordinated enforcement systems.

In addition, information society services are subject, to a varying extent, to sectoral regulation enforced by a number of national and European regulators – from data protection authorities for personal data protection, to media, telecom, competition or consumer protection authorities. This means that all regulators are confronted with a similar set of challenges in the extremely diverse and technology-savvy environment of digital services. Consequently, stronger means of cooperation, sharing of best practices and technical information, and coordination in enforcing the law remain paramount for the robustness of the internal market, the effectiveness of law enforcement and the protection of all EU citizens.

6.3.2.Removing legal uncertainty in relation to liability of intermediary service providers

The evaluation shows that the clarifications in the e-Commerce Directive concerning the liability of intermediary service providers for third-party information and activities have been an important contributor to the growth of the digital economy and services in the internal market.

The evaluation also showed that the absence of a liability exemption safe harbour could incentivise the over-removal of legal content, and therefore be at odds with the fundamental freedoms, such as freedom of expression and information. The same applies to the existing prohibition in the e-Commerce Directive to impose general monitoring obligations on intermediary service providers, which would disproportionately burden these providers while at the same time incentivising them to "over-remove" (legal) content so as to avoid the risk of fines or litigation.

The evaluation also confirms that the objective of having in place effective tools that ensure the removal of illegal activities or information while safeguarding freedom of expression and information is more relevant than ever, for several reasons:

·First, the volumes of content intermediated by information society service providers continue to grow at an unprecedented pace, and so does their reach into society. This also increases the risk of illegal activity as well as its potential impact.

·Second, increasing endeavours by information society service providers to reduce their users' exposure to illegal or harmful content – triggered by legal requirements and/or the increasing automation of content management systems – also increases the risk that legal content is removed erroneously. As some digital platforms are now one of the main venues for information and expression, including for political expression, any content moderation rules have a direct and immediate impact on fundamental rights, and a careful balance needs to be struck.

6.3.3.Removing disincentives for consumers going online and concluding transactions online

As for the objective to promote trust in the digital ecosystem by providing consumers and users with adequate information, the evaluation shows that the overall objective remains valid, while the underlying problems have evolved in view of the significant developments in the area.

This applies in particular to the area of online advertising, which has evolved considerably from the commercial communications activities existing at the time the e-Commerce Directive was adopted. While a number of public policy concerns such as privacy and data protection (e.g. GDPR, ePrivacy Directive), the content of advertisements and consumer information (e.g. UCPD, AVMSD), or follow-the-money solutions to counter illicit activities (e.g. the Memorandum of Understanding with the online advertising industry against counterfeiting) are being addressed elsewhere, the evaluation shows that several issues deserve further assessment. In particular, as also raised by several stakeholders during the open public consultation, a chain of intermediary services has emerged between publishers and advertisers, and further clarity as to their status and responsibility is needed. In addition, and as regards new developments, the evaluation shows that the ad placement process remains largely opaque, to both consumers and related services, and therefore lacks meaningful transparency.

Furthermore, the evaluation also shows that today's digital services are fundamentally shaped by a series of algorithmic processes designed to optimise the way information flows are intermediated, from ranking algorithms and recommender systems to content detection and filtering technologies. Some aspects of this are already regulated through existing EU legislation. For example, the GDPR sets specific rules for a sub-set of such processes based on the processing of personal data, and the Platform-to-Business Regulation sets obligations on the disclosure of the main parameters of online ranking on platforms intermediating relations between business users and consumers, as well as on search engines.

However, these measures do not cover the entire spectrum of issues emerging when algorithmic decision-making is used at scale.

Furthermore, as shown above (see section 6.1.3), while the e-Commerce Directive supported the use of electronic means for the conclusion of contracts, there have been significant technological developments in this area. In particular, with the increased use of blockchain technology and smart contracts, the question arises whether the existing framework laid down in the Directive remains fully relevant in its current scope.

6.3.4.Preliminary conclusion on relevance

The evaluation shows that the objectives of the e-Commerce Directive continue to remain valid, while at the same time several new developments that have emerged since the adoption of the e-Commerce Directive may not be well reflected in the existing public policy objectives.

In the first place, the open public consultation, targeted submissions by stakeholders, reports 129 prepared for the European Parliament and Council conclusions 130 confirm that the existing principles and objectives of the e-Commerce Directive remain valid today. This is particularly the case for ensuring a well-functioning internal market for information society services built on the internal market principle, and for preserving the liability safe harbour for intermediary service providers while clarifying the application of its conditions to new services that have developed since the adoption of the e-Commerce Directive.

In addition, many of the issues identified at the time of the adoption of the e-Commerce Directive have since been addressed by a series of laws concerning consumer protection, contract law, misleading advertising, and the like. That said, new information asymmetries have arisen in the meantime, such as in the areas of algorithmic decision-making (with an impact on how information flows are intermediated online) or in online advertising systems, that render this objective as relevant as ever. In some cases, the sale of advertising is a core part of platforms' business models and, while a platform may offer distinct services, there is a dependency of incentives across the components of the business model.

6.4.Coherence

This section of the evaluation assesses to what extent the e-Commerce Directive is coherent with the other regulatory interventions applicable and relevant to the provision of information society services in the internal market.

6.4.1.General assessment

Since the adoption of the e-Commerce Directive, not only has the market and technological landscape become significantly different, but the regulatory framework applicable to information society services has also undergone numerous changes.

As shown in section 3.3 above and in Annex 6 of the Impact Assessment, several EU legal acts have been adopted since the adoption of the e-Commerce Directive that deal with specific aspects of the information society services.

Having said that, the present evaluation did not identify any instance of incoherence with the existing rules or other policy initiatives in the areas concerned. There are several reasons for this:

·First, the e-Commerce Directive was adopted at an early stage of the internet and of the development of e-commerce, which allowed the European co-legislators to adopt a horizontal framework applicable to information society services at a time when many of the challenges that appeared later did not yet exist.

·Second, the legislative intervention of the e-Commerce Directive was based on the principle that it should address only what is strictly necessary to ensure a well-functioning internal market for information society services, and it already recognized that several aspects are adequately addressed elsewhere (e.g. consumer protection; data protection).

·Third, subsequent legislative interventions, such as the AVMSD, the Copyright Directive, the CSAM Directive or even some elements of the consumer acquis, clearly recognized that the horizontal principles governing the provision of information society services are laid down in the e-Commerce Directive. In this context, all subsequent legal interventions clarified that they do not replace the principles of the e-Commerce Directive, but build on and complement them, or deal with specific regulatory issues that the e-Commerce Directive, as a horizontal instrument, does not address.

In this context, subsequent EU legal acts did not interfere with the basic horizontal principles of the e-Commerce Directive and preserved a coherent interplay with the rules in place. Annex 6 of the Impact Assessment provides an overview of how some of the most relevant rules adopted after the e-Commerce Directive, but dealing with some of its aspects, interplay with its rules. As shown there, in none of these cases has any incoherence been identified.

Finally, the evaluation of the e-Commerce Directive also did not point to any internal incoherence in the Directive itself.

6.4.2.Preliminary conclusion on coherence

The evaluation showed that the e-Commerce Directive is generally coherent with other EU interventions that have taken place since its adoption. The evaluation also did not identify any internal incoherence in the e-Commerce Directive.

6.5.EU added value

This section of the evaluation assesses to what extent the e-Commerce Directive has added value as opposed to a scenario in which the Directive had never been adopted.

6.5.1.General assessment

Before the e-Commerce Directive came into force some Member States had already made use of regulatory systems for information society services, which however differed in objectives and means. Other Member States had no rules in place. This had resulted in a significant regulatory fragmentation, which consequently led to fragmentation of the internal market and lack of legal certainty for providers and recipients of information society services.

In this context, the adoption and implementation of the e-Commerce Directive established for the first time a common framework applicable to all Member States. There had been no substantial trend towards the coordination of a common framework on information society services by Member States before the evaluation period. Although it cannot be excluded that some rules on e-commerce could also be established at the international level, in particular in the context of multilateral regulatory frameworks (e.g. GATS), there are no indications that either the WTO or any other body had the intention to do so. This is also confirmed by the fact that only recently did some members of the WTO, including the EU, announce their intention to launch talks 131 on e-commerce, which could address some, but not all, of the issues that the e-Commerce Directive deals with.

Based on the above, it does not seem an overly strong assumption that, without EU intervention, Member States would have continued applying their own regulatory systems without any common set of principles during the evaluation period. This assumption was also used as the baseline scenario of the explanatory memorandum of the e-Commerce Directive and of the present evaluation, with the latter also taking into account that some aspects relevant for the provision of information society services in the EU have been subject to further harmonization measures (e.g. consumer protection; measures against specific types of illegal content).

The evaluation confirms that the different and diverging legal regimes applicable to information society services increase compliance costs while also being the source of legal uncertainty as to the applicable obligations across the EU and of unequal protection of EU citizens. In addition, the effects of any action taken under national law are limited to a single Member State and there are no guarantees that in absence of an EU intervention a common set of principles would underpin provision of such services in the internal market.

The principles of the e-Commerce Directive, in particular the country-of-origin principle and the prohibition of prior authorization, as well as the legal certainty deriving from clearly established horizontal rules, enabled the growth of information society services and their cross-border expansion. The latter trend has been further facilitated by sector- and issue-specific rules adopted since the adoption of the e-Commerce Directive.

However, the evaluation also shows that while the initial objectives remain relevant, current regulatory trends in some Member States put significant pressure on their achievement, since an increasing trend of regulatory fragmentation can again be observed. This not only risks undermining the exercise of fundamental freedoms under the Treaty, such as the free movement of services, but also raises risks of legal uncertainty for both service providers and recipients, which in turn leads to a lack of trust between Member States and in the internal market itself.

6.5.2.Preliminary conclusion on EU added value 

At least part of the actual benefits of the e-Commerce Directive that the evaluation identified could be considered as EU added value. It is likely that Member States would have continued applying their own regulatory systems without any common set of principles and that some Member States would have continued to have no horizontal rules in place at all.

In the absence of robust evidence, it is however not possible to draw firm conclusions on the extent of this EU added value.

7.Conclusions

The aim of the e-Commerce Directive was to ensure the freedom to provide digital services in the internal market, leading to growth and competitiveness in the EU and offering consumers a wide range of choices and opportunities, including by ensuring that the Internet remains safe, trustworthy, fair and open.

The specific objectives of the Directive were (i) ensuring a well-functioning internal market for digital services, (ii) ensuring the effective removal of illegal content online in full respect of fundamental rights and (iii) ensuring an adequate level of information and transparency for consumers.

As regards the effectiveness of the e-Commerce Directive, the evaluation shows that while the Directive, and in particular its Articles 3 and 4, has provided an important incentive for the growth of the internal market for information society services and enabled the entry and scaling-up of new service providers, the initial objectives have not been fully achieved.

In particular, the dynamic growth of the digital economy and the appearance of new types of service providers raise certain new challenges that require reflection on a possible update of the existing objectives. In addition, these developments put an additional strain on the achievement of the existing objectives, as shown by the increased legal fragmentation and the undermining of the well-functioning internal market for information society services.

The evaluation showed that while several new regulatory instruments make valuable contributions to the attainment of some of the policy objectives set out in the e-Commerce Directive, they provide sector-specific solutions for some of the underlying problems (e.g. in addressing the proliferation of specific types of illegal activity). They therefore do not necessarily address the same issue for the entire digital ecosystem (e.g. because they are limited to certain types of services, certain types of content, or to operators established within the EU territory). Furthermore, while voluntary measures have generally shown positive results, they cannot be legally enforced, nor do they cover all participants in the digital economy.

In addition, these measures do not adequately address the problem of fragmentation of the single market, due to a growing number of national legal measures, nor do they address the problem of uneven and ill-coordinated enforcement across the internal market (although sector-specific regulators, such as in the area of media, clearly contribute to that objective as well).

Moreover, these measures do not solve the problem of the lack of legal clarity concerning the scope of the liability provisions of the e-Commerce Directive itself (e.g. which types of services are covered by the relevant provisions of the e-Commerce Directive). As a result, they cannot address the disincentives for intermediary service providers to act more proactively, nor do they provide for overall transparency and accountability for the measures taken by (in particular large) intermediary service providers (concerning the effectiveness of those measures and their impact on freedom of expression and information).

As regards the efficiency of the e-Commerce Directive, the Directive imposed only limited additional costs on Member States' administrations and providers of digital services. The evaluation has not revealed particularly high or disproportionate costs, and no substantial concerns have been raised regarding impacts on SMEs. As noted above, the Directive has had a positive impact on the well-functioning internal market for digital services and has contributed to legal certainty in areas such as the liability of intermediary service providers. In the absence of the Directive, it is unlikely that any of these benefits would have materialised.

The Directive’s efficiency has nevertheless been reduced by the limitations to its effectiveness, in particular due to the numerous developments since its adoption, discussed above. The main concern in this regard relates to the lack of clarity in the cooperation mechanism across Member States, creating burdens and duplication of costs, despite the opposite objective of the Directive. This has substantially reduced its efficiency in maintaining the functioning of the internal market.

In relation to the question of the continued relevance of the objectives pursued by the e-Commerce Directive, the evaluation shows that those objectives remain valid, while at the same time there are several new developments that may not be well reflected in the existing public policy objectives.

In the first place, the open public consultation, targeted submissions by stakeholders, reports 132 prepared for the European Parliament and Council conclusions 133 confirm that the existing principles and objectives of the e-Commerce Directive remain valid today. This is particularly the case for ensuring a well-functioning internal market for information society services built on the internal market principle, and for preserving the liability safe harbour for intermediary service providers while clarifying the application of its conditions to new services that have developed since the adoption of the e-Commerce Directive.

In addition, while many of the issues identified at the time of the adoption of the e-Commerce Directive have since been addressed by a series of laws, new information asymmetries have arisen in the meantime. This is for example the case in the areas of algorithmic decision-making (with an impact on how information flows are intermediated online) or in online advertising systems, which render this objective as relevant as ever.

The evaluation showed that the e-Commerce Directive is generally coherent with other EU interventions that have taken place since its adoption. The evaluation also did not identify any internal incoherence in the e-Commerce Directive.

Finally, at least part of the actual benefits of the e-Commerce Directive that the evaluation identified could be considered as EU added value. It is likely that Member States would have continued applying their own regulatory systems without any common set of principles and that some Member States would have continued to have no horizontal rules in place at all. In the absence of robust evidence, it is however not possible to draw firm conclusions on the extent of this EU added value.

Annex 6: Supporting analysis for legal basis and drivers – legal fragmentation

As the Inception Impact Assessment indicated, the intervention addresses the freedoms of establishment and to provide services, and the proper functioning of the Single Market for digital services. As such, the most likely legal basis would be Article 114 of the Treaty on the Functioning of the European Union and, potentially, Articles 49 and 56 (to the extent that the conditions of establishment represent a predominant element of the legal intervention).

The Inception Impact Assessment already identifies the existing and increasing legal fragmentation as a main problem: in response to the increasing role of digital services in the online trade in or dissemination of illegal goods and content, Member States are increasingly passing laws with notable differences in the obligations imposed on digital services, in particular online platforms, and with a variety of different enforcement mechanisms. This creates a fragmentation of the single market that can negatively affect EU citizens and businesses in the absence of harmonised rules and obligations. It also entails a lack of legal clarity and certainty for digital services in the internal market, and is likely to be less effective in achieving the underlying public policy objectives.

The increasing legal fragmentation of the digital single market underpins the need to establish harmonized rules for information society services offered in the EU. The present annex presents the evidence supporting the choice of Article 114 TFEU as the relevant legal basis for the legal instrument.

Article 114 TFEU establishes that the European Parliament and the Council shall, acting in accordance with the ordinary legislative procedure and after consulting the Economic and Social Committee, adopt the measures for the approximation of the provisions laid down by law, regulation or administrative action in Member States which have as their object the establishment and functioning of the internal market.

Following well-established case-law of the CJEU 134 , this Article is the appropriate legal basis where there are differences between Member State provisions which are such as to obstruct the fundamental freedoms and thus have a direct effect on the functioning of the internal market, and a possible legal basis for measures to prevent the emergence of future obstacles to trade resulting from differences in the way national laws have developed.

While Article 114 TFEU is the legal basis for measures improving the Internal Market, and usually only EU service providers can benefit from the EU Internal Market, this Article can also be used to impose obligations on service providers established outside the territory of the EU where their service provision affects the internal market, insofar as this is necessary to achieve the internal market goal pursued. This has already been the case for the Regulation on Geoblocking 135 , the Regulation on promoting fairness and transparency for business users of online intermediation services 136 and the Commission proposal for a Regulation on terrorist content online 137 .

Finally, Article 114 TFEU can also serve as a legal basis for imposing an obligation on third-country companies to appoint a representative within the territory of the Union, insofar as this is merely incidental to the main purpose or component of the act. This is the case, for instance, for the NIS Directive, which is based exclusively on Article 114 TFEU.

In order to consider whether Article 114 TFEU constitutes an appropriate legal basis for the proposed instrument, the following chapters present the existing legal fragmentation in the field of measures targeting online platforms in particular, be it to specify the conditions of secondary liability, or to impose specific duties of care or due diligence obligations vis-à-vis users as regards the way they conduct business.

1.Main drivers leading to regulatory fragmentation

Examination of the current regulatory context for information society services in the EU shows that those services, and especially online intermediaries, are subject to significant regulatory fragmentation across the digital Single Market.

The ECD constitutes the horizontal regulatory framework for information society services established in the EU. It contains the core principles and rules governing digital services across the EU. Despite the wide scope of its coordinated field, the ECD appears to lack a sufficient level of harmonization to ensure a uniform application of its main rules and principles across the EU.

In particular, the ECD creates a limited liability regime for online intermediaries regarding potentially illegal content transmitted or hosted on their services. It does not, however, provide harmonized rules on how online intermediaries are to address such content. As a result, Member States have adopted national rules applicable to the service providers established in their territory, creating specific obligations to tackle illegal content.

According to our research and the available information, this situation is the result of: (i) the diverging ways in which Member States have transposed Articles 12-15 of the ECD; (ii) country-specific notice-and-action procedures or other due diligence obligations imposed on hosting services as regards the content they host 138 ; and (iii) recent national laws increasingly adopted by Member States whose scope also applies to cross-border services.

2.Transposition of Directive 2000/31/EC as regards Articles 12-15

The ECD sets out the liability regime applicable to online intermediaries of information online: mere conduit, caching and hosting services. For the purposes of this report, we will focus on the category of hosting services regulated under Article 14 of the Directive, which are the most affected by the growing legal fragmentation.

Hosting services are defined as those information society services consisting of the storage of information at the request of the recipient of the service. For these services, Article 14 establishes an exemption from liability for third-party content under certain conditions:

-The provider does not have actual knowledge of, and is not aware of, the illegal content; and

-Upon obtaining such knowledge or awareness, the provider acts expeditiously to remove or disable access to the content.

In the context of the transposition of Article 14 into their national legal systems, Member States have adopted various legal regimes applicable to the hosting services established in their territory.

National legal systems range from a quasi-literal transposition of Article 14, without any further clarification of the obligations for hosting services, to stricter and more detailed rules on the systems to be put in place by such services to remove or disable illegal content. In particular, the lack of a harmonized system for triggering “actual knowledge” has been understood by some Member States as pointing to a so-called “notice-and-takedown” system. The Commission has also taken this approach in the 2017 Communication on tackling illegal content online and the subsequent 2018 Recommendation on effective ways to tackle illegal content online, which encourage Member States to establish such notice-and-action obligations for the hosting services under their jurisdiction.

The Commission already pointed to these divergences in its third implementation report on the ECD 139 . Recently, the Commission commissioned and published an external study looking into the different regimes adopted by Member States in their transposition of Article 14. 140 The findings of the study point to a clear fragmentation in the national legal mechanisms adopted. 141

The information available in the study shows that the majority of Member States have followed an almost verbatim transposition of Article 14. Among the remaining Member States, national regimes differ not only as to whether they contemplate notice-and-action procedures and minimum requirements for notices, but also with regard to when ‘expeditious action’ occurs, what is understood by ‘knowledge’ as defined in Article 14, and what specific provisions are in place to safeguard freedom of expression.

“Expeditiously”

Article 14 ECD requires HSPs to act “expeditiously” upon obtaining actual knowledge or awareness of illegal content. However, the exact meaning of this term is unclear, in particular because of the absence of EU case-law and the divergence of national legislation and case-law 142 . It can be concluded that national courts interpret “expeditiously” on a case-by-case basis, taking into account a number of factors such as: the completeness of the notice; the complexity of the assessment of the notice; the language of the notified content or of the notice; whether the notice has been transmitted by electronic means; the necessity for the HSP to consult a public authority, the content provider, the notifier or a third party; and the necessity, in the context of criminal investigations, for law enforcement authorities to assess the content or the traffic to the content before action is taken.

“Actual knowledge”

There are different interpretations amongst Member States’ national laws as to the exact conditions under which a hosting service provider is deemed to have actual knowledge of the illegal activity or information stored by a third party.

As already analyzed in full in the Impact Assessment accompanying the Proposal for a Regulation on preventing the dissemination of terrorist content online 143 , when transposing the ECD or during its application, some Member States have limited to courts or administrative authorities the power to refer illegal content to an online platform or to trigger the platform's liability when doing so. In such national regimes, only a referral by courts or administrative authorities can generate sufficient knowledge on the part of the service provider of the specific illegality.

In addition, mention should be made of the interpretation that national courts have given to the national provisions transposing Article 14 ECD. The resulting national case law provides further detail on how the national measures are to be interpreted in particular cases. Similarly, the CJEU, in particular through the case law resulting from preliminary rulings, has also clarified how some of the notions set out in Article 14 ECD, and thus the contours of the safe harbour for hosting services, are to be understood.

As a result, the current framework set out in the ECD does not lead to harmonized and uniform rules for hosting services as regards illegal activities or information intermediated on their platforms. On the contrary, the current scenario shows a mosaic of varied national regimes applicable to hosting services depending on the Member State of establishment.

Consequently, users and customers in the EU are not offered a minimum level of uniform protection against illegal content and products intermediated on hosting services. The protection of users and customers, as well as their ability to participate in the process of addressing illegal content online (for instance via notice systems), greatly depends on the Member State of establishment of the service provider. This is also true when it comes to the safeguarding of their fundamental freedoms online, mainly freedom of expression. Users accessing service providers established in Member States that have enacted specific provisions to safeguard freedom of expression in the process of content moderation online will benefit from stronger protection of their fundamental freedoms.

The specific duties and obligations regarding the processing of illegal content to which they will be subject are likely to be a factor of consideration for hosting services when deciding where to establish themselves in the EU. In accordance with the internal market principle, by complying with the relevant national rules, hosting services established in a Member State will, in principle, be able to lawfully offer their services across the Single Market. Consequently, hosting service providers are likely to establish themselves in Member States with less strict regimes for the take-down of illegal content. 144 This form of forum shopping has a direct impact on the protection of users and customers in the EU, both as regards illegal content and as regards the safeguarding of their fundamental freedoms.

An additional aspect to be considered is the growing importance of hosting services offered in the EU from third countries. The ECD applies to service providers established in one of the Member States of the EU. Consequently, the regulation of hosting service providers established in third countries is not harmonized at EU level, which leaves a vacuum in the protection of EU citizens from illegal content intermediated on such services and available in the Single Market.

3.Country-specific notice-and-action procedures or other due diligence obligations as regards the content they host

The ECD does not, however, harmonize the duties or procedural obligations for hosting services in addressing illegal information and activities on their services. Article 14(3) expressly recognizes the ability of Member States to adopt national rules in this regard for hosting services established in their territory.

An extensive overview of national initiatives to put in place a notice-and-action procedure was already included in the Impact Assessment accompanying the Proposal for a Regulation on preventing the dissemination of terrorist content online 145 . Furthermore, some Member States have in the meantime legislated in the field (France, Germany) or have notified their intention to do so (Austria). In summary:

-Nine Member States (Finland, France, Germany, Greece, Hungary, Italy, Lithuania, Spain and Sweden) have implemented a notice-and-action procedure in their legislative frameworks. For five of them (Finland, Greece, Hungary, Italy and Spain) this applies only to infringements of copyright and related rights.

- Furthermore, in several Member States (Finland, France, Hungary, Lithuania), minimum requirements for the notice are defined in law, to ensure that it is sufficiently substantiated. In Member States without statutory requirements for notices, case law has provided indications concerning the content of the notice and the mechanism.

Furthermore, the Annex referred to above shows that key elements, such as the minimum content of the notice, the possibility to issue a counter-notice, the timeframe to react to a notice, potential mandatory measures against abusive notices or the possibility to submit contentious cases to an independent third party, diverge greatly from one Member State to another.

As a consequence, an online platform established in one EU Member State and offering, for instance, a video-uploading feature must adapt its reporting functionalities to allow for copyright claims under the specific conditions established by law in Finland, Hungary, Lithuania, the United Kingdom, Spain and Sweden, and by case law in Belgium, the Czech Republic, Germany or Italy (and try to comply with contradictory rulings). For that purpose, it would probably have to hire and maintain in-house specialists and subcontract local legal experts in each and every Member State where it wishes to offer its services. Furthermore, users will see their fundamental rights protected differently when posting content or when signalling illegal content, depending on where the content is hosted or where the user lives.

Specific provisions safeguarding the freedom of expression

The blocking of certain content can have a negative impact on the exercise of the rights to freedom of expression and information. Therefore, in recital 46 the ECD states that the removing or disabling of access ‘has to be undertaken in the observance of the principle of freedom of expression and of procedures established for this purpose at national level’.

In practice, this requirement often translates into obligations on hosting providers to set up and operate content moderation processes that allow affected users to submit counter-notices to defend the legality of the information at issue.

·In 13 Member States, some form of opportunity to dispute the allegation exists. However, the situations in which counter-notices are possible differ greatly among Member States. For example, a counter-notice in Estonia is only possible when the removal is ordered by a government agency; in Finland, Greece, Hungary, Ireland, Italy and Spain counter-notices are only possible in the context of copyright; and in Luxembourg, it is only possible during the proceedings on the merits.

In eight Member States (Bulgaria, Estonia, France, Germany, Greece, Lithuania, Portugal and Sweden), some form of alternative dispute settlement mechanism exists. For example, in Portugal an out-of-court preliminary dispute settlement is possible where the illegality of the case is not obvious; in Estonia, a specific alternative dispute regime exists for copyright infringements, under which a specific committee can resolve disputes.

4.Recent national laws being increasingly adopted by Member States whose scope would also apply to services provided from another Member State 146

Points (i) and (ii) above already present a very fragmented picture of the legal framework with which a hosting service provider has to comply in the EU, and of the different degrees of impact on the content shared by users over their services. This general fragmentation is further exacerbated by a recent trend in some Member States to regulate services regardless of their place of establishment, despite the general prohibition on restricting the cross-border provision of services.

Indeed, in the past few years there has been increasing interest in the regulation of information society services in the Member States. This trend mainly concerns duties and obligations for online platforms to address content hosted on their services that would be illegal under national law, but also other kinds of “duties of care”, transparency and cooperation with national authorities, not least by imposing an obligation to appoint a legal representative in the territory of several Member States.

Some of the recent national measures adopted by Member States in this regard are designed to apply to hosting services with a distinctive presence in their national markets. As such, these national laws would cover hosting services regardless of their place of establishment, including the possibility to impose sanctions on them.

Member States have justified the adoption of national laws with extraterritorial application on the need to protect their citizens against the impact of online platforms when intermediating content, be it illegal content or not. They claim the regime set out in the ECD, and in particular the available derogations from the internal market principle, is not sufficient to ensure the protection of their national users in view of the realities of the online environment.

Regardless of the justification, proportionality or adequacy of the policy goal behind such national initiatives, the extraterritorial application of most of these national measures to online platforms established outside the concerned Member States adds to the existing legal fragmentation in the Single Market.

4.1.Examples 147

-Network Enforcement Act of 2017 (Netzwerkdurchsetzungsgesetz or “NetzDG”)

In 2017, the German authorities adopted the first national law of this kind, in force since 1 January 2018, imposing on social networks certain obligations to allow for the swift detection and removal of content that would constitute a criminal offence under national law. The aim was to improve the enforcement of German criminal law online, notably in terms of deletion of content.

In terms of scope, the obligations set out in the NetzDG apply to social networks with at least two million registered users in the Federal Republic of Germany. The NetzDG lists a set of 22 criminal offences covered by such obligations, including some, such as criminal defamation and hate speech, whose determination is largely contextual.

The NetzDG’s main obligations include a requirement for social networks within its scope to set up a notification system allowing users to report to the platform individual pieces of content that would constitute a criminal offence under German national law. Social networks are also required to implement procedures ensuring that obviously unlawful content is deleted within 24 hours of receiving a complaint. If there is any doubt regarding a takedown decision, the procedure may take up to seven days. After that deadline, a final decision on the lawfulness of a post must be reached and unlawful content must be removed, that is, either blocked or deleted. The fines for a breach of this obligation can reach up to EUR 50 million.

In addition to complying with these operational provisions, social media platforms are also obliged to publish semi-annual transparency reports. In July 2019, the Federal Office of Justice issued an administrative fine of EUR 2 million against Facebook for incomplete reporting. The main argument related to the relatively low number of complaints filed under the NetzDG compared to other social media providers, which the authorities took as an indication that the complaint form was too difficult to find.

-Draft Act combating right-wing extremism and hate crime

On 18 June 2020, Germany enacted a new Act that would, among other things, amend the 2017 NetzDG. The aim of the amendment is to strengthen the fight against illegal content on social networks by facilitating the prosecution of criminal offences by German law enforcement authorities.

With this aim, the amendment would impose new obligations on social networks within the scope of the original NetzDG. In particular, it creates a new requirement for such services to report to German law enforcement authorities certain content which has been removed or disabled and which constitutes sufficient evidence of a serious criminal offence. This reporting obligation also covers user data of the uploader, including IP address and passwords.

In practice, this obligation is likely to require social networks to carry out an additional assessment of the content removed or disabled to determine whether it can be deemed to constitute sufficient evidence of a serious crime and would thus need to be reported. The assessment as to whether there is such evidence is often highly contextual and therefore complex. Moreover, the assessment is not guided by a legal standard that would help the providers determine whether or not there is sufficient evidence to justify reporting the content in question. Failure to comply with the new obligations is subject to the same financial penalties as foreseen in the current NetzDG.

-Draft Act amending the Network Enforcement Act

Separately, at the time of drafting this report, the German authorities are also working on an additional amendment to the 2017 NetzDG. According to the information provided by the German authorities 148 , this amendment aims at further improving the systems set out in the current NetzDG in order to make the fight against illegal content online more effective.

The amendment would impose on social networks within the scope of the NetzDG further and more detailed obligations in terms of the systems allowing users to send notices, the procedure for the removal or disabling of access, and the reporting and transparency requirements. These new duties thus constitute new or stricter obligations for social networks having at least 2 million registered users in Germany, including those providing cross-border services into Germany. Failure to comply with the new obligations is subject to the same financial penalties as foreseen in the current NetzDG.

-Law aimed at combating hate content on the internet (Loi contre la cyberhaine or Avia Law)

In May 2020 the French National Assembly adopted the so-called Avia Law, which imposed strict obligations on online platforms and search engines as regards notice and take-down or disabling of access to illegal content. The Law was aimed at fighting hate speech and other forms of illegal content disseminated via hosting services.

The French authorities argued that the adequate protection of French citizens from content that would be illegal under French law called for strict regulation of online platforms and search engines available in the French territory, regardless of their place of establishment. As such, the Law would apply to those services surpassing a certain threshold of connections from the French territory (to be established at a later stage by decree).

The text adopted by the National Assembly imposed on online platforms and search engines strict obligations in terms of systems to send notices and, especially, of removal or disabling of access to notified content. According to the Law, services would be required to remove or disable access to individual pieces of manifestly illegal content within 24 hours of receiving notification, and within 1 hour for child pornography and terrorist content.

Services under the scope of the Law would also be subject to reporting and transparency obligations on their content moderation activities and technical and human means devoted to it. The French regulatory authority would also be granted broad powers of supervision and enforcement, including the issue of binding guidelines.

The Law would subject individual failures to remove or disable access to individual pieces of content within the prescribed timeframes to significant financial penalties of up to EUR 250 000.

By decision of June 2020, the French Conseil Constitutionnel concluded that the main obligations set out in the Law would have a disproportionate impact on fundamental rights and would thus be contrary to the French Constitution. Consequently, most of the requirements for online platforms and search engines were declared null and void.

Some other Member States are currently working on, or have announced their intention to enact, national laws aimed at imposing obligations on online platforms to tackle illegal content online. From the information available at the time of drafting this report, it is likely that these upcoming laws would also be designed to apply to services available in the territory concerned, regardless of their establishment. As such, they would add to the already increasingly fragmented legal framework for online services in the EU. 149  

4.2.The specific requirement of appointing a legal representative

Under EU law, a service can in general be provided from one Member State to recipients established in a different Member State. Limitations on that free provision of services can only be justified on overriding grounds of general interest, and only insofar as they are proportionate and adequate. Under the ECD, an information society service provider only needs to comply with the rules of its place of establishment. Member States cannot regulate (or impose restrictions on the provision of services by) providers established in a different Member State.

However, as explained above, more and more Member States target services regardless of their place of establishment. This generates an enforcement challenge, as these services, not being established in their territory, fall outside their jurisdiction. In order to be able to enforce those rules, laws enacted in recent years in several Member States and targeting online platforms in different sectors include an obligation, for platforms in scope but not established in their territory, to appoint a legal representative within their territory.

This has been the case, for example and not exhaustively, with the German NetzDG, the French Avia Law, the recently notified Austrian draft law to combat hate speech online, the German draft law on the protection of minors 150 and the Italian “Airbnb” law 151 .

Furthermore, in a case concerning a similar obligation in Spain 152 , the Court of Justice has already established that a national provision imposing an obligation to appoint a tax representative resident in that Member State contravenes Article 56 TFEU as being disproportionate to the objective pursued.

4.3.National laws on data sharing

Member States are increasingly regulating the access of public authorities to data held by online platforms. The majority of these national laws apply to online platforms offering services in the collaborative economy. The rationale behind these laws is that Member States need data from platforms in order to enforce the obligations applicable to the providers of the underlying services (e.g. obligations related to taxation, health and safety, planning, registration and so on).

In March 2020, the European Commission reached an agreement on data sharing with four large platforms in the area of short-term rental accommodation services. This agreement allows Eurostat to publish aggregate data on short-stay accommodation offered via these platforms across the EU. However, cities do not consider aggregate data to be sufficient for the purposes of enforcing local rules.

Some examples of the regulatory fragmentation regarding data reporting are the following:

-Spain: a Royal Decree 153 imposed on platforms intermediating short-term rental accommodation for tourist purposes the obligation to provide the Tax Authorities with data on a quarterly basis as from 31 January 2019 and through the Government platform. The data relate to the identity of the homeowner, the property, the guest, the number of rental days, and the amounts received by the homeowner for the rental of the property.

Another draft Royal Decree, still to be finally adopted, would also set out obligations relating to document registration and information for natural or legal persons offering accommodation and motor-vehicle rental services. The draft Royal Decree establishes the same obligations simultaneously for providers of the underlying services (accommodation or motor-vehicle rental services) and for digital platforms dedicated to intermediation in these activities via the Internet. Digital platforms must collect and register information related to the provider of the underlying service, the place where the service is provided, the user and the transaction itself. The notified draft provides a lighter regime for ‘web portals which act exclusively in the area of publishing classified ads, which do not directly or indirectly provide payment functionalities’, by not obliging them to collect and register data beyond those collected in the normal course of their activities.

- Czech Republic: a recent law 154 imposes data sharing obligations on online short-term accommodation platforms. The data to be communicated to the authorities include the number of tourism service contracts concluded, the total price for tourism services for the period specified, the address of the place where the tourist services are provided, the price for the service or the number of contracts concluded per host, and the designation of the service provider with which the platform has mediated the conclusion of a contract relating to the provision of tourism services to the customer (for a natural person, his or her name and surname, date of birth and permanent address must be provided).

- France: Law N°2018-898, enacted on 23 October 2018 and in force since 31 January 2020, requires short-term rental platforms to share with the French Tax Administration a yearly report on the vacation rental properties advertised through online platforms. The data to be provided relate to the identification of the providers of accommodation services, revenue, etc.

- Austria: several laws have been adopted at regional level regulating data sharing in the area of tourism. The Act on tourism promotion in Vienna (Vienna Tourism Promotion Act – WTFG) 155 provides that platforms must notify to the authorities the contact data of the accommodation providers registered with them, along with all the addresses of the accommodation (accommodation units) registered with them within the territory of the city of Vienna. The provincial Act promoting tourism in Upper Austria (Upper Austrian Tourism Act 2018) 156 sets out obligations for platforms to forward, upon request, data about the service providers to the Upper Austrian authority responsible for collecting tourist tax. Under the Act amending the Styrian Act on Overnight Accommodation and Holiday Home Tax 157 , online platforms are required to forward information on service providers, not upon request, but automatically following a new registration of a host. Platforms are also required to submit an overview of bookings every three months. 

- Italy: Law Decree No. 50/2017 and its implementing measure 158  impose the obligation on online platforms intermediating short-term rental services to transmit data relating to the short lease contracts concluded on their platforms to the Agenzia delle Entrate.

- Greece: a law 159 in Greece introduces data sharing obligations for online platforms. According to the rules, platforms must share specific data related to their sellers with the tax authorities, upon request.

4.4.Impact on hosting services and users in the EU

Without prejudice to the legitimacy of the policy objectives pursued and the capacity to block illegal content, the application of several diverging national laws imposing obligations on the same online platforms as regards intermediated content increases the legal fragmentation in the Single Market. As such, it has considerable repercussions for both digital services and users across the EU.

Contrary to the scenario based on the internal market principle, online platforms wishing to scale up and offer their services across the EU are required to comply with various national legal systems.

This means that online platforms face higher compliance costs. Although complex to quantify, information provided to the Commission in the context of the recent amendments to the NetzDG indicates an additional cost per provider of EUR 2.1 million annually and one-time compliance costs of EUR 300 000 for the first amendment, and a one-time compliance cost of EUR 284 000 for the second amendment. 160

In the specific case of obligations to appoint a legal or tax representative, for instance, and bearing in mind that (without prejudice to the cost estimations made by the Commission in this document) the German NetzDG estimated that such an obligation would imply a cost of EUR 1 million annually, a platform of sufficient size to be covered by the national laws mentioned would need to invest EUR 4 to 5 million yearly only to comply with these obligations.

Aside from higher economic costs, online platforms wishing to offer their services in more than one Member State also face greater legal uncertainty. In fact, service providers would need to closely monitor and follow the legislative processes and case law in all Member States where they are present. They would need to constantly adapt their policies to the various national legislative and judicial developments.

In practice, this fragmented regulatory environment is likely to result in only large online platforms being able to innovate and scale up in the EU, to the detriment of smaller or emerging services. Regulatory fragmentation thus endangers the full completion of the digital Single Market.

The extraterritorial application of these national rules aimed at counteracting illegal content online does not ensure adequate and uniform protection of all EU citizens. Users residing in Member States that have enacted stricter rules are likely to be afforded a higher level of protection against illegal content in those Member States. This level of protection does not extend to other EU citizens. An indirect effect of this uneven protection may be additional pressure on other Member States to enact similar rules, which would in turn add to the already increasing legal fragmentation in the EU.

5.Concluding remarks

The current ECD does not harmonize the rules applicable to online intermediaries as regards third-party illegal information (content or products) disseminated on their services, or other due diligence obligations such as transparency reporting. Especially in the context of hosting services, this lack of an EU-wide harmonized framework has resulted in increasing regulatory fragmentation in the EU.

The recent regulatory trends existing at national level create clear risks for the digital Single Market and prevent both businesses and users from reaping all its potential benefits.

In this context, in order to complete the Single Market for online platforms while ensuring an adequate and uniform level of protection of all EU citizens, it seems necessary to create harmonized rules for online platforms available in the EU.



Annex 7: Regulatory coherence

a)Initiatives taken to address the problem of illegal content and suspicious activities online

Initiatives

Purpose and Scope

Relationship with ECD; assessment

Commission Recommendation of 2018 on measures to effectively tackle illegal content online - C(2018) 1177

Hosting service providers to exercise a greater responsibility in content governance to swiftly detect, remove and prevent the re-appearance of illegal content online, based on:

·Clearer 'notice and action' procedures.

·More efficient tools and proactive technologies, where appropriate.

·Stronger safeguards to ensure fundamental rights.

·Special attention and support to small companies.

·Closer cooperation with authorities.

As a non-binding instrument, the Recommendation cannot be enforced, and it does not reach “bad-faith” operators or operators established in third countries

Directive 2019/790 on copyright and related rights in the Digital Single Market

(the “Copyright Directive”)

The Directive covers online content-sharing services (services giving public access to large amounts of copyright-protected content uploaded by their users).

Art 17 introduces a new conditional liability regime for online content sharing services.

Article 14(1) of the ECD does not apply to the situations covered by Article 17 of the Copyright Directive

The obligation for online content-sharing services to make their best efforts to ensure the unavailability of specific works and to prevent their future re-uploads, which should be carried out in cooperation with right holders, “shall not amount to a general monitoring obligation” as provided for by Article 15 of the ECD

The Copyright Directive does not cover any hosting service providers other than those captured by the definition of “online content sharing services”

Directive 2018/1808 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (“AVMSD”)

The Directive covers video-sharing platform services providing programmes or user-generated videos to the general public

Art 28(b) requires that MS ensure that video-sharing platform services take appropriate measures to protect minors from harmful audiovisual content, and to protect the general public from audiovisual content constituting illegal hate speech and audiovisual content whose dissemination constitutes a criminal offence under Union law (i.e. terrorist content, CSAM), as well as appropriate measures to ensure compliance with audiovisual commercial communications requirements under the AVMSD

In the event of a conflict between the ECD and the AVMSD, the AVMSD shall prevail, unless otherwise provided for in the AVMSD

Article 28(b) is “without prejudice to Art 14 ECD” and “shall not lead to ex-ante control or upload filtering which do not comply with Article 15 ECD

Does not cover any hosting service providers other than those captured by the definition of “video-sharing platform services”. It only covers certain categories of illegal audiovisual content and harmful content for minors.

Art 28 provides for a “notice” mechanism for VSP, but not for a general notice and action system, e.g. for hateful text comments.

Does not cover types of illegal audiovisual content other than Illegal hate speech, terrorist content, CSAM, as well as content which is harmful for children (i.e. may impair their physical, mental or moral development) and content infringing audiovisual commercial communications rules set by the AVMSD

Directive (EU) 2017/541 on combating terrorism

Article 21 of the Terrorism Directive requires Member States to take the necessary measures to ensure the prompt removal of online content constituting a public provocation to commit a terrorist offence, as referred to in Article 5 that is hosted in their territory.

Article 21 also stipulates that measures of removal and blocking must be set following transparent procedures and provide adequate safeguards, in particular to ensure that those measures are limited to what is necessary and proportionate and that users are informed of the reason for those measures. Safeguards relating to removal or blocking shall also include the possibility of judicial redress.

The Directive should be without prejudice to the rules laid down in the ECD, in particular to the prohibition of general monitoring and the limited liability regime.

Proposal for a Regulation on preventing the dissemination of terrorist content online COM/2018/640 (*negotiations between the co-legislators are ongoing, hence some of the provisions in the Commission’s proposal can be modified)

The Regulation covers hosting service providers (which make information available to third parties).

The proposal requires such providers to:

-Remove or disable access to content within 1h of receiving a legal removal order from a competent authority in any MS. Give feedback to the competent authority.

-Assess as a matter of priority the content identified in referrals from competent authorities in any MS and give feedback to them.

-Report on the proactive measures taken, if they are exposed to terrorist content. These may include measures to detect, remove and prevent reappearance of terrorist content, following a removal order by a competent authority. When putting in place proactive measures, providers should ensure that users’ right to freedom of expression and information is preserved. If the measures are not considered sufficient, the authority in the place of establishment can impose appropriate, effective and proportionate proactive measures.

-Comply with a set of transparency and information obligations

-Establish complaint mechanisms and adopt other safeguards to ensure that decisions taken concerning content are accurate and well-founded.

-Inform national authorities, when they become aware of evidence of terrorist offences.

Have a legal representative established within the Union, if they are not established within the Union. All providers should appoint a contact point with authorities.

The Regulation is “without prejudice to Art 14 ECD”; a recital introduces “Good Samaritan” elements.

The decision to impose specific proactive measures does not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) ECD. In exceptional circumstances, the authorities can derogate from Article 15 ECD by imposing specific, targeted measures, the adoption of which is necessary for overriding public security reasons. A fair balance should be struck between the public interest objectives and the fundamental rights involved, in particular the freedom of expression and information and the freedom to conduct a business, and appropriate justification should be provided.

Only addresses the relationship between providers and public authorities (including Europol), setting out procedures for legal removal orders, as well as for referrals of content sent by the authorities – does not directly address notices coming from users.

Regulation 2019/1020 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011

Regulation (EU) 2017/2394 of the European Parliament and of the Council of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws and repealing Regulation (EC) No 2006/2004

(“CPC Regulation”)

Sets minimum powers for competent authorities to require, ‘where no other effective means are available’, that a hosting service provider remove, disable or restrict access to an online interface (i.e. a website) or, where appropriate, to order domain registries or registrars to delete a domain name infringing rules in the Union laws that protect consumers

The e-Commerce Directive is included in the corpus of laws in scope of the Regu