|
Official Journal |
EN C series |
|
C/2024/4052 |
12.7.2024 |
Opinion of the European Economic and Social Committee
Safeguarding Democracy Against Disinformation
(own-initiative opinion)
(C/2024/4052)
Rapporteurs: John COMER
Carlos Manuel TRINDADE
|
Advisors |
Frank ALLEN (for Mr COMER) Paulo PENA (for Mr TRINDADE) |
|
Plenary Assembly decision |
13.12.2023 |
|
Legal basis |
Rule 52(2) of the Rules of Procedure |
|
Section responsible |
Transport, Energy, Infrastructure and the Information Society |
|
Adopted in section |
4.4.2024 |
|
Adopted at plenary session |
24.4.2024 |
|
Plenary session No |
587 |
|
Outcome of vote (for/against/abstentions) |
161/0/2 |
1. Conclusions and recommendations
|
1.1. |
In this opinion, the European Economic and Social Committee (EESC) is concerned with the scope and spread of disinformation campaigns across the European Union before the 2024 elections. This follows on from its previous opinions on the Action Plan against Disinformation (1) (2). |
|
1.2. |
Calls on the European institutions to reflect urgently on the factors that still allow disinformation and misinformation to jeopardise free public speech and to instil a polarised debate in which hate prevails over factual information. |
|
1.3. |
Proposes that the European Commission, the Council and all the Member States engage in a multi-factor strategy that reinforces all the previous measures to fight disinformation: strengthening regulations to prevent the use of digital platforms by organised fake identities; allowing citizens to choose how they use social media and digital platforms without the current monopoly of ‘surveillance capitalism’ (3) methods; and taking common action to ensure the quality and pluralism that journalism needs in order to be the first line of defence against disinformation. |
|
1.4. |
Given that all the technological means to fight disinformation exist, an articulated strategy is needed, covering all the dangers we face online: disinformation, hybrid threats and cybersecurity. This articulated strategy must be pursued with full respect for our core democratic values, such as freedom of expression. Private interests cannot override the public interest. |
|
1.5. |
Asks the European Commission to guarantee that information remains available regardless of the business-model crisis affecting the media. All studies reveal that Europe has a problem with pluralism in the media (4). A necessary first step to guaranteeing that pluralism is not dependent on strict market rules is to classify journalism as a European public good, as suggested by UNESCO (5). Journalists’ editorial freedom, as well as their safety and access to sources of information, are essential. |
|
1.6. |
Requests, after a thorough study of public policies regarding media financing, legislative action that guarantees an adequately funded and independent public service media system. The rights of independent media also need to be protected to ensure pluralism. |
|
1.7. |
Recommends, in this regard, that the Commission study the viability of a public European news channel, available on different platforms and in all national languages, with an independent editorial commitment that allows European citizens to access the information they need to make informed choices. |
|
1.8. |
Requests the European Commission to develop a plan that ensures quality in local and regional information. |
|
1.9. |
Promotes the adoption of instruments that foster cooperation between national, regional and local news media in Europe. |
|
1.10. |
Asks the European Commission to evaluate the effect of the current regulations applied to social media and digital platforms regarding disinformation and to adopt the necessary legislative action to make them effective. |
|
1.11. |
Considers it essential to review the current regulations regarding targeted advertisements, while requesting further protection regarding individual data gathering. The collection of personal data by social media and digital platforms, in many cases without the informed consent of their users, must be addressed in the future revisions of the data protection legislation. |
|
1.12. |
Asks the Commission and the Parliament to call on digital companies to publish a list of all the types of user data gathered on social media/platforms. If this is not obtained through voluntary action by the companies, the EU should legislate to protect citizens. |
|
1.13. |
Proposes that EU legislation include a chapter on freedom of choice regarding algorithm design. Citizens should be able to accept or reject algorithm features. To balance the intellectual property rights of the companies and the fundamental rights of citizens, all social media/platforms should offer diverse algorithmic options to their users. Social interest institutions should be allowed to propose alternative algorithm designs to these networks/platforms that could offer a different model of organising information flows, advertisement and data gathering. |
|
1.14. |
Proposes that the Commission regulate the continuing problem of manipulated identities on digital platforms. It ought to be a legal requirement to identify foreign-origin bots and bots disguised as persons. |
|
1.15. |
Recommends that the Commission propose a set of conditions for access to social media and digital platforms that restricts the use of fake identities, without jeopardising the rights to anonymity and whistleblower protection guaranteed under current European law. Calls for a comprehensive effort to deliver media literacy, culture and democracy tools, not only at all educational levels but also targeted at all age groups and minorities. In Finland, tools to promote critical thinking are an integral part of the education system, from kindergarten to college graduation. This prepares young people to fight disinformation from all sources. Such a system should apply throughout the EU. |
|
1.16. |
Asks for a study that collects and analyses the current level of disinformation monetisation on social media and digital platforms. Turning manipulation and hate speech into profit should not be allowed under EU legislation. |
|
1.17. |
Reminds the European Commission and the Council that disinformation is also a consequence of political neglect. For decades, we have been witnessing a steady increase in inequalities in income, wealth and territory, leading to the breakdown of society and the emergence of separate communities whose convictions are reinforced by the use of their specific digital networks. The vulnerable people who lose out as a result of these developments are easy prey for certain types of disinformation. The EESC recommends that all EU policies be reinforced to deal with this problem. |
|
1.18. |
Warns European authorities of the urgent need to assess the risks of addiction that social media and digital platforms create in vulnerable groups of our societies, and to regulate the commercial use of AI and algorithms so as not to increase that risk. |
2. Introduction
|
2.1. |
Disinformation can be defined as false, inaccurate or misleading information deliberately created and spread to deceive the public and influence public opinion. Misinformation can be defined as false or inaccurate information, including rumours and gossip spread without malicious intent. |
|
2.2. |
Online disinformation poses a substantial threat to democracies. It erodes trust in institutions and in the media, and leads some people to believe in bizarre conspiracy theories. |
Emerging issues
|
2.3. |
The rise of artificial intelligence has driven the development of new forms of disinformation and misinformation through the artificial production, manipulation and modification of data and multimedia by automated means, especially artificial intelligence algorithms, in order to mislead or change the original meaning. |
|
2.4. |
Deepfakes use powerful techniques from machine learning and artificial intelligence to manipulate and generate visual and audio content with the potential to deliberately deceive millions of people by spreading fake news, hoaxes and financial fraud. |
|
2.5. |
Disinformation stories can sometimes be categorised as fake news used for propaganda purposes and intentionally designed to mislead and subvert democratic norms. |
|
2.6. |
Disinformation is frequently used to discredit opposing viewpoints by deliberate misrepresentation and promotion of false conspiracy theories. |
|
2.7. |
The most obvious way that disinformation distorts and undermines democratic debate is by convincing people to believe things that are untrue. |
|
2.8. |
Disinformation and misinformation are not new phenomena, but social media has revolutionised their spread. |
|
2.9. |
Disinformation is used to undermine the integrity and competency of democratic societies, governments and public figures. |
Political effects
|
2.10. |
Disinformation is one of the essential tools for the defenders of autocratic regimes in the current political battle between them and the defenders of democracy and freedom. In the history of warfare, the production of so-called ‘counter-information’ has always been used by one of the contenders to create dissent, weakness or doubt in the social support base of its opponent. |
|
2.11. |
The massive process of disinformation that democratic regimes are currently undergoing, in which the most advanced communications technology is used, is part of the same logic and has the same objective – to win the battle between the defenders of democracy and those of autocracy. |
|
2.12. |
This process against democracy is being carried out by far-right forces and other forms of extremism. The current danger to democracies is represented by those who disrespect liberal democracies, international relations governed by the United Nations Charter, human rights, European integration, the welfare state and the conventions of the International Labour Organisation. In this way, these actors, sometimes supported by autocratic states, have intervened in democratic elections, with negative consequences. |
|
2.13. |
The fight against disinformation is therefore one of the most important fronts in the wider battle currently being waged between the defenders of freedom and democracy and autocratic political forces. In this context, if democracy is to win this fundamental battle, decisive action is needed and very significant resources are required. |
Fighting disinformation
|
2.14. |
To be able to fight this disinformation threat, democratic states must strongly support media literacy strategies that are able to empower citizens, especially young people and seniors, with tools to distinguish between information produced with a method of empirical scepticism and conspiracy theories that undermine public confidence. |
|
2.15. |
Democracies around the world are facing a torrent of disinformation and foreign interference operations, which have the potential to destabilise democratic institutions and exacerbate divisions in society, undermining the trust of citizens in democratic institutions. |
|
2.16. |
Bad actors create fake identities on social media to target particular groups (e.g. particular racial groups or people of a particular sexual orientation) in order to create social divisions and conflict. |
|
2.17. |
Trolls often use verbal aggression and hate speech, using racist or misogynist language to cause deliberate turmoil and polarisation in the democratic process. |
|
2.18. |
In a Eurobarometer survey (2018), 83 % of respondents said that fake news represents a danger to democracy; they were especially concerned about intentional disinformation aimed at influencing elections and immigration policies. |
|
2.19. |
Disinformation concerning all minorities is a serious problem in many Member States. |
|
2.20. |
Much disinformation is targeted at people and groups with the aim of reinforcing their ideological beliefs. The widespread tracking of individuals’ internet search history enables purveyors of disinformation to identify people’s preferences and beliefs. |
|
2.21. |
Governments can also be purveyors of disinformation campaigns. The Russian government waged a disinformation campaign in its war on Ukraine. Conspiracy theories about the COVID-19 pandemic and the manipulation of information during the Brexit referendum have all shown the consequences of disinformation and false narratives, leading to widespread erosion of trust in democratic institutions. |
|
2.22. |
In the USA, the disinformation promoted by Donald Trump for the 2020 elections, when he claimed that the election was stolen from him, caused serious political dysfunction. |
|
2.23. |
Social platforms themselves, and the various regulatory bodies, have not been successful in controlling online disinformation and misinformation. The EESC proposes that the Commission study solutions to address this problem, including legislative action. |
3. General comments
|
3.1. |
Democracy in the EU faces major challenges, from rising extremism and election interference to hybrid threats. |
|
3.2. |
The European Digital Media Observatory (EDMO), which should be reinforced, serves as a hub for fact-checkers, academics and other related stakeholders, and is independent of public authorities, including the Commission. It aims to improve the detection of online disinformation and to empower citizens to respond to it. |
|
3.3. |
In 2020, the EU launched the European Democracy Action Plan and, in 2023, it followed this with a Defence of Democracy package. |
|
3.4. |
The main purpose of these proposals is to build capacity in Member States to address risks to elections, disinformation and cyber-related threats. |
|
3.5. |
The Digital Services Act (DSA) has applied to very large online platforms since August 2023 and to all platforms since February 2024. The DSA protects consumers and their fundamental rights online by setting clear and proportionate rules. It aims to mitigate systemic risks such as data manipulation or disinformation. |
|
3.6. |
The European Media Freedom Act is also a positive attempt, by the Commission, to regulate media pluralism and freedom of information at a time when the media business-model crisis is threatening journalism and its method of fact verification. Defending journalists’ editorial freedom, safeguarding their right of access to sources of information, protecting their integrity and ensuring their safety are essential aspects of the defence of freedom of information that the European Union must always ensure. |
|
3.7. |
Free and independent journalism should be declared a European public good, considering its importance as the enabler of free public debate and individual informed choices. |
|
3.8. |
The EESC considers that all these proposals are positive developments in attempting to deal with disinformation. However, it is doubtful whether these proposals go far enough to deal with the problem. |
4. Specific comments
4.1. About a Public Service Media
|
4.1.1. |
The EU still lacks a common ‘public opinion’ that frames the common debate about policies that affect all Member States. Without that, the European project is vulnerable to nationalistic biases. Building a strong, common, transparent public debate that avoids stereotypes should be one of our goals for the future. Considering support for pan-European, multilingual information channels may be a first step. |
|
4.1.2. |
Public service media is essential. It must be publicly financed and must be independent of the government of the day. This can be problematic in the case of authoritarian regimes. It is also necessary to have an independent media to ensure balance in public commentary and political issues. The legacy media also have a role to play. |
4.2. About the participation of citizens and civil society in the fight against disinformation
|
4.2.1. |
Misinformation and disinformation are particularly targeted at people who feel alienated from society. We need to build a fairer and more balanced society where people feel part of a genuine community, and reduce the social and economic divide in our society. |
|
4.2.2. |
The EESC shares the Commission’s view that a comprehensive response to disinformation also requires active participation by civil society organisations, including the EESC. |
|
4.2.3. |
Due to the devastating impact of misinformation on European civil society, the EESC will engage on a continuous basis in work dedicated to counteracting this influence. The EESC will appeal to all economic and social committees of the Member States to also actively engage in this intervention in their own countries, for democracy and against disinformation. |
4.3. About platforms and algorithms
|
4.3.1. |
Social media platforms are not taking sufficient action to take down disinformation. There are a number of regulatory areas that need to be considered. |
|
4.3.2. |
Social media companies must be compelled to post accurate information about sponsors of advertisements. Targeted advertisement should also be further regulated. |
|
4.3.3. |
Social media and platforms should obtain the informed consent of their users regarding data collection. The collection of personal data by social media and digital platforms, without the informed consent of their users, should be seen as a legal problem by the authorities. A list of all types of user data gathered by social media/platforms should be made public by digital companies. |
|
4.3.4. |
Citizens should also be able to agree, or reject, algorithm features. To balance the rights of intellectual property of the companies and the fundamental rights of citizens, all platforms should allow diverse algorithmic options for their users. Social interest institutions should be allowed to propose alternative algorithm designs to these platforms that could offer a different model of organising information flows and data gathering. |
|
4.3.5. |
It ought to be a legal requirement to identify foreign-origin bots and bots disguised as persons. The removal of inauthentic accounts and impersonators is essential. |
|
4.3.6. |
Anonymous online accounts should not be allowed, except where anonymity guarantees fundamental rights, such as respect for private life and the protection of personal data, in line with the European Court of Justice’s interpretation of European law. |
|
4.3.7. |
Digital platform users should only be entitled to use anonymous profiles in the cases foreseen in current legislation, such as the Whistleblowers Directive. |
|
4.3.8. |
As a transparency measure, consideration ought to be given to an online sign-in system for access to part or all of the Internet. Only identified human beings ought to have access to social media accounts. |
|
4.3.9. |
Every platform should provide safe and transparent login rules so that one person/organisation/institution can have only one profile. |
4.4. About hybrid threats
|
4.4.1. |
Hybrid threats are becoming more sophisticated and more difficult to detect as they use different types of tools and organisational actions. |
|
4.4.2. |
According to the EU Hybrid Fusion Cell (established in 2016 within the EEAS), hybrid threats from Russia pose the greatest danger because they are systematic, well-resourced and on a different scale from those of other countries. |
|
4.4.3. |
The Strategic Compass, adopted by the EU in March 2022, sets out a plan of action for strengthening the EU’s security and defence policy by 2030. One of its aspects is the development of a toolbox to counter foreign information manipulation and interference (FIMI) threats. |
4.5. About the efficiency in the fight against disinformation
|
4.5.1. |
Fighting disinformation is essential for democracies, for the rule of law and for free suffrage. |
|
4.5.2. |
There are, however, differences between the public and the private interests at stake. Both are essential in this fight, but private interests cannot override the general public interest. |
|
4.5.3. |
All the technological means to fight disinformation are already available inside the companies that provide online services and social media. |
|
4.5.4. |
The EESC proposes that the Commission and Member States define a coherent strategy, safeguarding freedom of expression and the rule of law, to articulate resources and efficient approaches regarding the different dimensions of the problem. |
|
4.5.5. |
Disinformation, hybrid threats and cybersecurity each have their own protocols, safeguards and means. However, an articulated strategy among them is necessary at EU level for an efficient result. |
|
4.5.6. |
In Finland, tools to promote critical thinking are an integral part of the education system, from kindergarten to college graduation. This prepares young people to fight disinformation from all sources. Such a system should apply throughout the EU. |
Brussels, 24 April 2024.
The President
of the European Economic and Social Committee
Oliver RÖPKE
(1) OJ C 228, 5.7.2019, p. 89.
(2) OJ C 152, 6.4.2022, p. 72.
(3) Zuboff, Shoshana, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Profile Books Ltd, 2019.
(4) https://cmpf.eui.eu/media-pluralism-monitor-2023/.
(5) ‘Public goods are generally defined as services or commodities available to everyone in society without exclusion. These include health care and education (and supporting institutions), roads, street lighting, and parks. All citizens have access to and/or benefit from public goods. In most cases, public goods are expensive to produce and provide little financial return. While the provision of accessible public goods is normally not financially profitable, society as whole recognizes and values their intrinsic benefits.’ https://unesdoc.unesco.org/ark:/48223/pf0000380618.
ELI: http://data.europa.eu/eli/C/2024/4052/oj
ISSN 1977-091X (electronic edition)