
    6.7.2018   EN   Official Journal of the European Union   C 237/19


    Opinion of the European Economic and Social Committee on the ‘Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions — Tackling Illegal Content Online — Towards an enhanced responsibility of online platforms’

    (COM(2017) 555 final)

    (2018/C 237/03)

    Rapporteur: Bernardo HERNÁNDEZ BATALLER

    Consultation: European Commission, 17.11.2017

    Legal basis: Article 304 of the Treaty on the Functioning of the European Union

    Section responsible: Section for the Single Market, Production and Consumption

    Adopted in section: 9.3.2018

    Adopted at plenary: 14.3.2018

    Plenary session No: 533

    Outcome of vote (for/against/abstentions): 180/4/5

    1.   Conclusions and recommendations

    1.1.

    Illegal online content is a complex and cross-cutting issue that needs to be tackled from a range of perspectives, both in terms of assessing its impact and harmonising the way it is dealt with in the legal framework of the Member States.

    The European Economic and Social Committee (EESC) emphasises the importance of establishing a suitable and balanced regulatory framework for platforms in the digital single market that could help to establish a climate of trust, both for businesses and for consumers in general, enabling them to use platforms with confidence. Regulatory and self-regulatory policy approaches that are flexible, sustainable and respond directly to challenges should be adopted, in particular for procedures relating to the detection, investigation, notification and removal of illegal content on platforms.

    1.2.

    As regards the adoption of criteria and measures, the EESC considers it necessary to maintain consistency with the recommendations of its previous opinions. The point of departure should be the fact that what is illegal in the real world is also illegal online. The EESC stresses the importance of technology neutrality and of coherence between rules that apply online and offline in equivalent situations, to the extent necessary and possible.

    1.3.

    It is essential to achieve the best possible balance between upholding fundamental rights and the planned restrictions on illegal content. A similar balance is also needed between online platforms of varying sizes and pursuing different activities.

    The EESC calls on the Commission to take appropriate measures against the growing presence of violent and/or discriminatory messages on platforms, stressing the importance of protecting vulnerable people and children and combating all forms of racism, sexism, incitement to terrorism and harassment, including in the digital environment.

    1.4.

    Attention should be paid in particular to the effectiveness of actions taken in relation to those online platforms whose headquarters are located outside EU territory.

    Likewise, the Commission should review and catalogue illegal content to the extent that this is possible, so that other forms of content that are not specifically mentioned in the Communication can be incorporated.

    In any event, the application of the guiding principles for detection, investigation, notification and withdrawal procedures should be encouraged in the following cases:

    (a) to defend rights that are recognised by international conventions, such as those aimed at:

    – protecting children from any digital content which may be contrary to the provisions of the Convention on the Rights of the Child,

    – protecting persons with disabilities from any digital content which may be contrary to the provisions of the Convention on the Rights of Persons with Disabilities;

    (b) to guarantee the absence of gender-based discrimination in digital content, in particular as regards the application of the principle of equal treatment of men and women in accessing and supplying goods and services and guaranteeing gender equality and human dignity in advertising;

    (c) to ensure that digital content complies with the provisions of the Digital Agenda to enhance safety and consumer rights in the digital society.

    The Communication should include a reference to the significance that illegal content can have for the Single Market, so that the necessary preventive measures can be adopted to ensure that the Single Market can continue to operate in accordance with the principles that underpin it.

    1.5.

    Ultimately, the EESC strongly welcomes the European Commission’s initiative in presenting this Communication which, in general, provides a sound approach to tackling the presence of illegal content on online platforms. To this end, consideration should be given to the possibility of reviewing the content of the E-Commerce Directive, the Unfair Commercial Practices Directive and the Directive on Misleading and Comparative Advertising, on the basis, inter alia, of standards that are future-proof, technologically neutral and vital to the development of European platforms, so as not to create uncertainty among economic agents or limit access to digital services.

    2.   Background

    2.1.

    Online platforms are a type of information society service provider that acts as an intermediary in a given digital ecosystem. They include a wide range of actors participating in numerous economic activities, such as e-commerce, the media, search engines, the collaborative economy, non-profit activities, the distribution of cultural content or social networks. There is no clear and precise definition of online platforms and it is difficult to formulate one due to their continually evolving nature. Currently, they play an important role in the internal market, a role that will only increase in the future.

    2.2.

    The Commission has already addressed the issue of online platforms in relation to the Digital Single Market (1), recognising that the most important challenge facing the EU today when it comes to ensuring its future competitiveness in the world is effectively promoting innovation in these economic sectors, while at the same time protecting the legitimate interests of consumers and users in an appropriate way. It therefore intended to revise the directives on telecommunications, privacy and electronic communications in the light of the current situation of over-the-top (OTT) online communication services.

    2.3.

    The EESC (2) has already set out its views on that Communication, highlighting that many online platforms are important elements of the collaborative economy, and has reaffirmed its findings on the collaborative economy, especially with regard to the protection of consumers, workers and self-employed people. It has also stressed the need to address the risk of regulatory fragmentation, which is why it considers it necessary to adopt a consistent approach throughout the EU.

    2.4.

    In this Communication the Commission addresses the fight against illegal online content. It endeavours to increase the accountability of online platforms by laying down a series of principles and guidelines for online platforms to help them intensify their fight against illegal online content in cooperation with national authorities, Member States and other stakeholders.

    2.4.1.

    The aim is to step up the implementation of good practices in the prevention, detection, removal or disabling of access to illegal content, in order to:

    (a) ensure that it is effectively removed;

    (b) increase transparency and the protection of fundamental rights online;

    (c) provide clarification for platforms on their liability when they take proactive steps to detect, remove or disable access to illegal content (the so-called ‘Good Samaritan’ actions).

    2.4.2.

    The EU legal framework consists of binding and non-binding standards, notably the E-Commerce Directive (3) which harmonises the conditions under which certain online platforms can benefit from exemption from liability for illegal content that they host across the Digital Single Market.

    2.4.3.

    A harmonised and consistent approach to removing illegal content does not exist at present in the EU, since what is considered illegal is determined by specific legislation at the EU level, as well as by national law. A more aligned approach would make the fight against illegal content more effective and would also benefit the development of the digital market.

    2.4.4.

    The Communication examines the criteria to be established for the conduct of online platforms, competent authorities and users when it comes to detecting illegal content quickly and efficiently. To this end, the Commission considers that the platforms’ cooperation with the competent authorities of the Member States should be systematically enhanced, and that Member States should ensure that courts can react effectively against illegal online content and should strengthen cross-border cooperation.

    2.4.4.1.

    It also believes that, in order to ensure that illegal online content is removed more quickly and in a more reliable way, mechanisms need to be established to facilitate the work of ‘trusted flaggers’. These are specialised entities with specific expertise in identifying illegal content, and dedicated structures for detecting and identifying such content online. The Commission will explore the potential of agreeing EU-wide criteria for trusted flaggers.

    2.4.4.2.

    As regards communication with users, online platforms should establish an easily accessible and user-friendly mechanism to enable their users to report content hosted by them that they consider to be illegal.

    2.4.4.3.

    With regard to ensuring high quality communications, the Commission believes that effective mechanisms need to be put in place to facilitate the submission of notices that are sufficiently accurate and substantiated.
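    Purely by way of illustration, and not as anything prescribed by the Communication or this opinion, a ‘sufficiently accurate and substantiated’ notice can be pictured as a structured record that a platform validates before acting on it. All field names and the validation thresholds in the following sketch are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical categories; the Communication does not fix a taxonomy.
KNOWN_REASONS = {"terrorism", "hate_speech", "child_abuse", "ip_infringement", "other"}

@dataclass
class Notice:
    """A hypothetical notice-and-action report from a user or trusted flagger."""
    content_url: str        # exact location of the allegedly illegal content
    reason: str             # category of alleged illegality
    explanation: str        # substantiation: why the notifier considers it illegal
    trusted_flagger: bool = False  # accredited entities may be handled with priority
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def is_actionable(notice: Notice) -> bool:
    """Reject notices too vague to be 'sufficiently accurate and substantiated'."""
    return (
        notice.content_url.startswith(("http://", "https://"))
        and notice.reason in KNOWN_REASONS
        and len(notice.explanation.strip()) >= 50  # arbitrary substantiation threshold
    )
```

    A platform applying such a scheme could queue actionable notices for review and prioritise those submitted by accredited trusted flaggers.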

    2.4.5.

    The Communication also assesses the relevance of online platforms taking proactive measures, with regard both to the exemption from liability and to the use of technology for detecting and identifying illegal content.

    2.4.6.

    The removal of illegal content is another of the issues examined in the Communication, which seeks strong safeguards that reduce the risk of removing legal content. The Commission seeks compliance with the requirement to act ‘expeditiously’ when withdrawing content and reporting crimes to law enforcement authorities, and also seeks to foster transparency with regard to platforms’ content policies and ‘notice-and-action’ procedures.

    2.4.7.

    As for the establishment of safeguards against over-removal and abuse of the system, the Commission examines how notices are contested and the measures against bad-faith notices and counter-notices.

    2.4.8.

    Ways of preventing the reappearance of illegal content are also explored: the Communication considers measures to discourage users from repeatedly uploading illegal content of the same type, in order to end its dissemination, and it defends the further use and development of technologies that prevent the reappearance of illegal content online, such as automatic re-upload filters.
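    The Communication names no particular technology, but the principle behind a re-upload filter can be sketched as follows: content removed as illegal is fingerprinted, and new uploads whose fingerprints match a known entry are blocked before publication. The exact-hash approach below is an assumption chosen for brevity; production systems rely on robust perceptual hashes, since a single changed byte defeats an exact digest.

```python
import hashlib

class ReuploadFilter:
    """Minimal exact-match filter: blocks byte-identical copies of removed content."""

    def __init__(self) -> None:
        self._blocked_hashes: set[str] = set()

    def register_removed(self, content: bytes) -> None:
        """Fingerprint an item that has been removed as illegal."""
        self._blocked_hashes.add(hashlib.sha256(content).hexdigest())

    def allows(self, upload: bytes) -> bool:
        """Return False if the upload matches previously removed content."""
        return hashlib.sha256(upload).hexdigest() not in self._blocked_hashes

# Example: a removed file cannot be re-uploaded unchanged.
f = ReuploadFilter()
f.register_removed(b"illegal material")
assert not f.allows(b"illegal material")   # identical re-upload is blocked
assert f.allows(b"unrelated material")     # other content passes
```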

    2.4.9.

    In short, the Communication serves as a guideline but does not change the applicable legal framework or contain legally binding rules. Its objective is, firstly, to provide guidance to online platforms on the best way to live up to their responsibilities when it comes to tackling the illegal content they host. It also aims to mainstream good procedural practices against different forms of illegal content, and to promote closer cooperation between platforms and competent authorities.

    3.   General Comments

    3.1.

    The EESC acknowledges this Communication and calls on the Commission to establish programmes and take effective measures to provide a stable and consistent legal framework for the efficient removal of illegal content. It also considers the Communication to be timely, given the impact that digital platforms have on daily life, the risks posed by their widespread use, and their effect on the digital single market, the purpose of which is to avoid fragmentation between national legislations and to remove technical, legal and fiscal obstacles so that businesses, citizens and consumers can benefit fully from digital tools and services.

    The EESC stresses the need for online platforms to combat illegal content and unfair commercial practices (e.g. the reselling of entertainment tickets at extortionate prices), through regulatory measures complemented by effective self-regulatory measures (e.g. through very clear terms of use and appropriate mechanisms to identify repeat offenders, or by setting up specialised content moderation teams and tracing illegal content) or by adopting hybrid measures.

    3.2.

    The EESC considers that cases of illegal content should be reviewed and catalogued, so that they are not limited to those set out in the Communication (incitement to terrorism, xenophobic speech that publicly incites hatred and violence, child sexual abuse material). Other cases could be included in this regard, such as those related to clearly malicious defamation, the distribution of material that violates human dignity, or sexist content that contributes to gender violence, without going so far as to produce an exhaustive list of such cases, and with the aim of establishing a uniform set of criteria for cataloguing them.

    Therefore, the application of the guiding principles for detection, investigation, notification and withdrawal procedures should be encouraged in the following cases:

    (a) to defend rights that are recognised by international conventions, such as those aimed at:

    – protecting children from any digital content which may be contrary to the provisions of the Convention on the Rights of the Child,

    – protecting persons with disabilities from any digital content which may be contrary to the provisions of the Convention on the Rights of Persons with Disabilities;

    (b) to guarantee the absence of gender-based discrimination in digital content, in particular as regards the application of the principle of equal treatment of men and women in accessing and supplying goods and services and guaranteeing gender equality and human dignity in advertising;

    (c) to ensure that digital content complies with the provisions of the Digital Agenda to enhance safety and consumer rights in the digital society.

    3.3.

    The EESC is in favour of strengthening measures to combat illegal online content, in particular as regards the protection of minors, and the removal of content related to hate speech and incitement to terrorism. It therefore requests that the need to avoid harassment and violence against vulnerable people be taken into consideration.

    3.4.

    This holds true even though the concept of illegal content in digital environments varies from one Member State to another from a legal point of view, and from one person to another from an ethical point of view. There are thus less obvious cases than those cited above, in which deciding whether content is illegal depends on how clashes between fundamental rights, such as freedom of expression, and other recognised rights are interpreted and resolved; these rights must therefore be balanced as far as possible in order to prevent such clashes. Nevertheless, the importance of taking action against the dissemination of fake news should be highlighted. The EESC therefore feels that online platforms should provide users with tools to report fake news, so that other users can be made aware that the veracity of the content has been called into question. In addition, online platforms could develop partnerships with trusted flaggers, i.e. certified fact-checking sites, in order to enhance their users’ trust in the validity of online content.
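    As a purely hypothetical sketch of the reporting tool described above (none of the names or thresholds come from the opinion): user reports accumulate against a content item, and once a threshold is crossed, or a partner fact-checker returns a negative verdict, the item is labelled as disputed for other users rather than removed.

```python
from collections import defaultdict

REPORT_THRESHOLD = 10  # assumed number of reports before a 'disputed' label appears

class VeracityFlags:
    """Tracks user reports and fact-checker verdicts; content is labelled, not removed."""

    def __init__(self) -> None:
        self._reports: dict[str, int] = defaultdict(int)
        self._fact_checked_false: set[str] = set()

    def report_as_fake(self, content_id: str) -> None:
        """Record one user report questioning the content's veracity."""
        self._reports[content_id] += 1

    def record_fact_check(self, content_id: str, verdict_false: bool) -> None:
        """Record the verdict of a partner fact-checking organisation."""
        if verdict_false:
            self._fact_checked_false.add(content_id)

    def is_disputed(self, content_id: str) -> bool:
        """Other users see a 'veracity disputed' label when this returns True."""
        return (
            content_id in self._fact_checked_false
            or self._reports[content_id] >= REPORT_THRESHOLD
        )
```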

    3.5.

    For illustrative purposes, cases of illegal content ought to be set out in such a way that, as far as possible, a common understanding may be established among Member States, thus enabling them to prioritise and determine the limits of these cases. In this regard, we propose including cases relating to the following:

    national security (terrorism, corruption, drug trafficking, arms trafficking, tax evasion and money laundering),

    protection of minors (pornography, violence, etc.),

    human trafficking, prostitution and gender-based violence, including sexist advertising,

    protection of human dignity (incitement to hatred or discrimination based on race, gender or ideology, or as regards sexual orientation),

    economic security (frauds and scams, piracy and counterfeiting, etc.),

    security of information (criminal hacking, collection of data for commercial reasons, avoidance of competition, disinformation, etc.),

    protection of privacy (cyber-bullying, leaking and use of personal data, interception of personal communication, interception of people’s location, etc.),

    protection of reputation (defamation, illegitimate comparative advertising, etc.),

    intellectual property.

    3.5.1.

    It is also necessary to spell out more clearly the concepts of ‘illegal content’ and ‘dangerous content’, in order to avoid biased interpretations of these concepts.

    3.6.

    Due to its potential consequences, particular attention must be paid to the concentration of economic power in some digital platforms, as well as to the development, processing and distribution of content that appears to be purely informative and legal but in fact conceals illegal or even dangerous elements.

    This should also extend to anything relating to big data and the benefits that online platforms obtain by exploiting such data.

    3.7.

    Moreover, given the global nature of the problem in question, the relevant cooperation and reciprocity initiatives should be analysed and considered in an optimal and efficient manner, based on principles such as information, choice, onward transfer, security, data integrity, access and enforcement.

    4.   Specific comments

    4.1.

    With regard to the general context, this is a good moment to consider revising the E-Commerce Directive, adopted in 2000, as well as the directives on unfair commercial practices (2005) and on misleading and comparative advertising (2006). In particular, elements relating to newly emerging economic models should be considered, as well as other situations where no conventional commercial relationship exists. In any case, the liability regime for content on platforms should be strengthened in a systematic way throughout the EU, and gaps in compliance should be eliminated. All this is needed in order to strengthen legal certainty and increase the confidence of businesses and consumers.

    4.1.1.

    In any case, measures should be introduced to deal with websites that violate the provisions of these directives, including the possibility of blocking access to websites by means of transparent procedures. Moreover, adequate safeguards should be provided to ensure that restrictions are necessary and proportionate and that users are informed of the reasons for the restriction. These safeguards will also include the possibility of judicial redress.

    4.1.2.

    With regard to detection and reporting of illegal content, the Communication states that national courts and authorities can adopt protective and other measures to remove or block access to illegal content and that this should be taken into consideration. These steps should be accompanied by measures drafted using the wording set out by the EESC for the measures laid down in the regulation on cooperation between consumer authorities (4).

    4.2.

    Likewise, mechanisms would need to be established for identifying who was responsible, along with response procedures enabling ex ante and ex post authorisations to be revoked. The measures to be taken in each case would also need to be set out, bearing in mind the context and available information.

    4.3.

    Those aspects relating to the authorisation of content in connection with previous notices could also be made more specific. For example, lists could be drawn up of online platforms that harbour illegal content, as well as of those developing officially recognised best practices. This would help foster competition based on reputation and improve trust in the internet.

    Innovation fosters investment in research, development and the upskilling of workers, and is of crucial importance when it comes to generating new ideas and developments. Technological innovation, such as information processing, digital intelligence, and automatic detection and filtering technologies, should be used in procedures to detect, identify and remove illegal content and to prevent it from being republished; ultimately, however, final decisions must be made, and actions taken, by people, in a way that guarantees fundamental rights and democratic values.

    Reaffirming that a balance must be found between upholding fundamental rights and restricting illegal content, the EESC stresses that the use of current automated filter technology places a disproportionate burden on intermediaries’ freedom to conduct their business, end-users’ right to freedom of expression and the right to protection of personal data. One-size-fits-all solutions, such as automatic re-upload filters, should not be forced upon the industry without considering the specific needs of SMEs in the IT sector. Current best practice in automated filter technology indicates that the systematic application of the human-in-the-loop principle is necessary: final, contextualised decisions on a smaller number of contestable cases are always made by humans, in order to decrease the likelihood of infringing the fundamental right to freedom of expression. It should be made clear that artificial intelligence must not replace decisions taken by human beings on the basis of ethical assessments.
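    The human-in-the-loop principle described above can be illustrated schematically as a triage rule (the thresholds and names below are assumptions, not drawn from the opinion): an automated classifier disposes only of clear-cut cases, while every contestable case is routed to a human reviewer, whose contextualised decision is final.

```python
from enum import Enum

class Decision(Enum):
    REMOVE = "remove"
    KEEP = "keep"
    HUMAN_REVIEW = "human_review"

# Assumed thresholds: only near-certain cases are decided automatically.
AUTO_REMOVE_AT = 0.98
AUTO_KEEP_AT = 0.02

def triage(illegality_score: float) -> Decision:
    """Route a classifier's confidence score; contestable cases always go to a human."""
    if illegality_score >= AUTO_REMOVE_AT:
        return Decision.REMOVE
    if illegality_score <= AUTO_KEEP_AT:
        return Decision.KEEP
    return Decision.HUMAN_REVIEW  # a human makes the final, contextualised decision

assert triage(0.99) is Decision.REMOVE
assert triage(0.50) is Decision.HUMAN_REVIEW
assert triage(0.01) is Decision.KEEP
```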

    4.4.

    With regard to notices, the accreditation procedures for trusted flaggers should be addressed. Likewise, as regards ‘ensuring the high quality of notices’, the appropriateness of disseminating notices publicly should be noted.

    4.5.

    The Communication does not provide clear proposals for proactive or preventive measures, or for measures related to re-education, that would enable a series of relevant policy initiatives to be rolled out. Such precision is key to fighting illegal digital content in an integrated and efficient way.

    4.6.

    Provision should be made for reviewing decisions, so that they can be reversed and any content that was deleted in error or reported for malicious reasons can be restored. This should include, in particular, out-of-court claims systems, with a code of conduct that sets out penalties in the case of non-compliance.

    The EESC calls for effective systems to be put in place for complaint procedures and dispute resolution, thus simplifying the way SMEs and consumers can exercise their rights.

    4.7.

    As regards the withdrawal of illegal content, the effectiveness of the proposals should be increased by introducing a clearly dissuasive element, such as the public disclosure, within the limits of legal certainty, of the measures adopted. This would also strengthen transparency standards, a necessary condition for the successful and effective implementation of any legislative proposal.

    A high level of protection should be ensured between platforms, consumers and other economic actors. It is important to promote the transparency of the system and to encourage cooperation among the platforms themselves, as well as between the platforms and the authorities, so as to take further steps in the fight against illegal content.

    4.8.

    Finally, the specific proposals aimed at children should be extended to other vulnerable groups in the adult population and could be adapted according to their level of vulnerability.

    Brussels, 14 March 2018.

    The President of the European Economic and Social Committee

    Georges DASSIS


    (1)  COM(2016) 288 final of 25 May 2016 — Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions — Online Platforms and the Digital Single Market, Opportunities and Challenges for Europe.

    (2)  OJ C 75, 10.3.2017, p. 119.

    (3)  Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on electronic commerce (OJ L 178, 17.7.2000, p. 1).

    (4)  OJ C 34, 2.2.2017, p. 100.

