
Document 52011DC0556

REPORT FROM THE COMMISSION TO THE EUROPEAN PARLIAMENT, THE COUNCIL, THE EUROPEAN ECONOMIC AND SOCIAL COMMITTEE AND THE COMMITTEE OF THE REGIONS on the application of the Council Recommendation of 24 September 1998 concerning the protection of minors and human dignity and of the Recommendation of the European Parliament and of the Council of 20 December 2006 on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry-PROTECTING CHILDREN IN THE DIGITAL WORLD-

/* COM/2011/0556 final */


INTRODUCTION

The objective of the 1998 and 2006 Recommendations on the protection of minors[1] was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. Considering that regulation cannot always keep pace with these developments, they were called upon to promote and develop appropriate framework conditions by other than purely legal means, e.g. through stakeholder cooperation and co- or self-regulation[2].

In the meantime, changes in consumers' and particularly minors' use of media have been dramatic and are constantly accelerating. Minors increasingly access media via mobile devices, including (online) video games, and there are ever more on-demand media services on the Internet. Since the last Recommendation, social networking sites have emerged as a new phenomenon of huge importance, both for individual users and in societal terms. Further changes that cannot yet be foreseen are likely to come.

These new developments offer many opportunities for minors, but also bring challenges for their protection, given that parents often find it difficult to carry out their responsibilities in relation to new technology products and services that are usually less familiar to them than to their children. We must therefore ask whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe.

In order to better assess what has already been done and what further steps might be necessary, the present report – responding to the call in Point 6 of the 2006 Recommendation – analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in the Member States.

The report has been prepared on the basis of information supplied by the Member States in response to a questionnaire. It complements several actions in the Digital Agenda for Europe[3].

More detailed information on the responses and on specific examples of measures taken is available in the accompanying Staff Working Paper.

FINDINGS

Tackling illegal or harmful content

Content and service providers are making increasing efforts to tackle discriminatory and other illegal or harmful content, particularly through self-commitments and codes of conduct, which exist in 24 Member States[4]. As far as Internet content is concerned, some of these initiatives foresee that websites may signal their compliance with a code of conduct by displaying an appropriate label.

In addition, efforts are being made in the Member States to develop high-quality, age-appropriate content for minors and to facilitate access to it, for instance through dedicated websites for children and dedicated search engines[5].

While Member States agree that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the levels of protection achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful content should be constantly monitored to ensure their effectiveness. For instance, reporting points for this type of content, provided by content providers for use by children and parents, are being developed and supported by functioning back-office infrastructures, but these initiatives lack the common features and economies of scale that would increase their efficiency.

Hotlines

The Digital Agenda for Europe calls on Member States to fully implement hotlines for reporting offensive or harmful online content by 2013.

Hotlines are now widely established and used in the Member States and in Norway. Co-funding of hotlines by the European Commission's Safer Internet Programme remains an essential support mechanism.

Some Member States also referred to the INHOPE Association of Internet hotlines[6], which was founded in 1999 under the former EC Safer Internet Action Plan and is now funded under the Safer Internet Programme. It covers countries beyond Europe[7] and aims to increase cooperation in this area. INHOPE member hotlines must comply with the INHOPE Code of Practice. Hotlines in 24 Member States are members of INHOPE[8].

A number of hotlines form part of so-called "notice and take-down" procedures[9], in which ISPs commit to immediately removing illegal content that members of the public have reported via the hotline. 19 Member States[10] report that notice and take-down procedures have been developed and are applied.

However, there are considerable differences in the functioning of hotlines and particularly of notice and take-down procedures. These concern the decision that certain content is illegal, the review of such decisions, the tracing of the content's source and of the web hosting provider, and in particular the notification of the competent authorities[11]. Although this was not part of the questionnaire, Bulgaria and Slovenia referred to the monitoring of hotlines[12].

The widespread establishment and networking of hotlines is encouraging, but not sufficient. To foster both their efficiency and greater consistency amongst the Member States (e.g. best practices for interaction with law enforcement authorities), thought should be given to making hotlines better known and more easily accessible for Internet users, including children, to improving their functioning and to developing synergies with other related services (e.g. helplines and awareness centres, the 116 000/116 111 numbers). Moreover, hotlines should be monitored more closely.

Internet Service Providers (ISPs)

In general terms, ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce Directive[13]. This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct.

However, ISP associations generally have no specific mandate regarding the protection of minors. Signing and complying with codes of conduct for the protection of minors is therefore generally only optional for members of such associations[14].

In addition, only eight Member States[15] and Norway reported that consumers or public authorities participated in the development of codes of conduct during the period under review; and only six Member States[16] reported that there are evaluation systems in place to assess the effectiveness of the codes.

Overall, only 11 Member States[17] and Norway deem that the self-regulation system and ISPs' codes of conduct are well-adapted to the new digital environment.

ISPs are encouraged to become even more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world.

Social networking sites

Social networking sites offer huge opportunities for minors. Even though they have gained importance only relatively recently, they have already transformed the way minors interact and communicate with each other.

However, social networking sites also carry considerable risks, which can be summarised under the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct"[18]. One promising way to counter these risks is guidelines addressed to providers of social networking sites and/or their users.

Only 10 Member States[19] referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness[20]; such "soft law" rules therefore currently suffer from rather patchy implementation. This gap could partly be filled by the "Safer Social Networking Principles for the EU"[21], which 21 social networks have subscribed to.

Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising about the risks and ways to mitigate them and through wider use of guidelines, with monitoring of their implementation. In addition, reporting points with a well-functioning back-office infrastructure are increasingly being deployed on social networks to assist children in dealing with grooming, cyber-bullying and similar issues, but these solutions are being developed on a case-by-case basis. Moreover, the use of "privacy by default" settings for children joining social networking sites is not widespread.

Problematic Internet content from other Member States / from outside the EU

Most Member States consider the share of harmful content originating in their own territory to be very low, the share from other EU Member States significantly higher and the share from outside the EU the highest[22]. As regards possible improvements, some Member States consider that further harmonised protection would be easier to achieve, and would be welcome, at European level rather than at international level[23]. Despite this, it is generally considered useful to encourage third countries to take action domestically, and a large majority of Member States and Norway favour concluding further agreements with third countries[24].

Enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although such content mostly originates outside the EU, some Member States consider this approach more realistic at European level than through the involvement of third countries.

Media literacy and awareness-raising

All Member States are committed to promoting media literacy and enhancing awareness of the risks of online media and of existing protection tools, as effective preventive instruments.

In particular, a growing number of relevant initiatives in the Member States take the form of public-private partnerships. According to feedback from Member States, the European Commission's Safer Internet Programme and the EU Kids Online project have proven to be valuable frameworks in these fields[25].

Media literacy and awareness-raising initiatives are partly integrated into formal education and some efforts are being made to sensitise parents and teachers, too. However, an assessment carried out by the Commission in 2009 showed that even though the topic is included in national curricula in 23 European countries, actual delivery of such education is fragmented and inconsistent[26].

While the increasing integration of media literacy and awareness-raising in school education is positive, universal coverage of all children and parents and consistency across schools and Member States remain significant challenges.

Access restrictions to content

Ensuring that minors access only content appropriate for their age requires two things: on the one hand, age-rating and classifying content; on the other, ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems – filtering, age verification systems, parental control systems, etc. – provide valuable support.

Age rating and classification of content

The age-rating and content classification systems in place for audiovisual content are in principle considered sufficient and effective by 12 Member States[27], whereas 13 Member States[28] and Norway deem that they should be improved.

16 Member States[29] and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States[30] and Norway consider this to be a problem. Eight Member States[31] and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field.

Altogether 15 Member States[32] and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States[33], which point to cultural differences.

This is the area of greatest fragmentation – conceptions of what is necessary and useful diverge significantly between and within Member States.

Technical systems (filtering, age verification systems, parental control systems, etc.)

Overall, there seems to be a consensus that technical measures alone cannot protect minors from harmful content and can only be one element in a bundle of measures.

Regarding technical measures aimed at avoiding potentially harmful content by ensuring respect for the relevant ratings and classifications, the Member States are divided over their usefulness, appropriateness (with a view to the right to information and possible misuse for censorship), technical feasibility and reliability[34]. In addition, there was a shared emphasis on the need for transparency as regards the inclusion of certain content on a blacklist and the possibility of its removal.

20 Member States[35] report that efforts have been made by industry or public authorities to develop a filtering and rating system for the Internet. 24 Member States[36] and Norway report that parental control tools are used. They are available free of charge in 15 Member States and against payment in four Member States[37].

Moreover, there are growing efforts to inform subscribers about the available filtering and rating systems and age verification software; this is an obligation – by law or under relevant codes of conduct for ISPs or mobile operators – in 16 Member States[38].

While most Member States see scope for improving their age-rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available, but their link with age-rated and classified content relies on case-by-case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting on innovative rating and content classification systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and respecting the established approaches to the liability of the various Internet actors.

Audiovisual Media Services

As regards co-/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services – where such systems are in place in eight Member States, seven of which have a code of conduct – lag behind television programmes, for which such systems are in place in 14 Member States, 11 of which have a code of conduct[39].

The most common techniques to signal to parents the presence of harmful content and the need for parents to restrict access are on-screen icons and/or acoustic warnings immediately prior to the delivery of potentially harmful content. This is true of both television broadcasts and on-demand audiovisual media services.

Most Member States consider such signals useful; some require them by law, or their use is stipulated by codes of conduct. Technical filtering devices or software, including pre-locking systems and PIN codes, are less used. Age classifications and transmission time restrictions for on-demand audiovisual media services are applied in only a small number of Member States[40].

As regards the reliability of labelling and warning systems, some Member States stressed the importance of parental responsibility and the fact that such systems can only work when parents ensure their effectiveness by controlling what their children are watching.

The variety of actions carried out in this field reflects the distinctions made in the AVMS Directive, but also the difficulty of reaching consensual policy responses to this challenge. Universally available technical means of offering children selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content, are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to the Internet and other on-demand audiovisual media services.

Video games

A total of 17 Member States and Norway consider the functioning of their age-rating systems to be satisfactory[41]. With the exception of Germany, Member States rely on PEGI (the Pan-European Games Information system)[42] and PEGI Online[43].

As regards online games, PEGI Online is considered to be a good solution in principle, but a number of Member States are concerned by the still limited participation of industry in this system.

Evaluation systems for the assessment of possible favourable or adverse effects of video games on minors' development or health are in place in only five Member States[44] and Norway.

As regards possible further measures to protect minors from harmful video games, those mentioned most were media literacy and awareness-raising, in particular to better signal the risks posed by video games and to promote existing protection tools. However, such measures are integrated into school education in only eight Member States and Norway.

The replies given by the Member States furthermore confirm the need for more action on the retail sale of video games in shops in order to address "underage" sales. Relevant awareness-raising measures have been taken in only six Member States and Norway[45], and retailers have implemented relevant codes of conduct in only four Member States[46].

While age-rating systems (notably PEGI) function well in most Member States, the reported challenges include their limited application to online games and "underage" sales of games in the retail market. In addition, more awareness-raising measures (e.g. media literacy at schools) would have useful preventive effects.

Right of reply in online media

16 Member States[47] provide for a right of reply covering online newspapers / periodicals; in 13 Member States[48], it covers Internet-based news services; in 17 Member States[49], it covers online television services; in 15 Member States[50], it covers online radio services and in nine Member States[51], it covers other online services.

Member States' assessments of the level of protection against assertions of facts[52] in online media and of the effectiveness of the system(s) in place are split roughly evenly between "sufficient and effective" and "unsatisfactory".

The introduction of a right of reply covering online media in the Member States is inconsistent and differs for each type of online medium. Moreover, there is scope for improving the effectiveness of the systems in place.

CONCLUSIONS

As a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content.

However, the detailed assessment of the policy responses that Member States have developed reveals a landscape of very diverse – and in a number of cases even diverging – actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes.

Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers trying to identify the "do's" and "don'ts" of protecting and empowering children who go online.

This report and the detailed responses gathered in this survey of Member States[53] demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world.

[1] 1998: Council Recommendation of 24 September 1998 on the development of the competitiveness of the European audiovisual and information services industry by promoting national frameworks aimed at achieving a comparable and effective level of protection of minors and human dignity (98/560/EC, OJ L 270, 7.10.1998, p. 48–55; http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31998H0560:EN:NOT). 2006: Recommendation of the European Parliament and of the Council of 20 December 2006 on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry (2006/952/EC, OJ L 378, 27.12.2006, p. 72–77; http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32006H0952:EN:NOT).

[2] At the same time it should be ensured that all co- or self-regulatory measures taken are in compliance with competition law.

[3] COM(2010) 245 final/2: Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions – A Digital Agenda for Europe (26 August 2010 – corrected version) ( http://ec.europa.eu/information_society/digital-agenda/index_en.htm)

[4] See Staff Working Paper page 7 and footnote 27.

[5] See Staff Working Paper pages 7, 8 and footnotes 31, 32.

[6] http://www.inhope.org/gns/home.aspx

[7] Hotlines from 35 countries worldwide are members of INHOPE.

[8] See Staff Working Paper footnote 35.

[9] See Staff Working Paper pages 8, 9. For the limited liability and responsibility of ISPs according to the E-Commerce-Directive, see footnote 13 of this Report.

[10] See Staff Working Paper footnote 39.

[11] See Staff Working Paper page 9.

[12] See Staff Working Paper page 9.

[13] Under the E-Commerce Directive (Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), OJ L 178, 17.7.2000, p. 1–16; http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:en:NOT), ISPs have no general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity (Art. 15(1)). ISPs benefit from limited liability for the information transmitted (Art. 12(1)), for the automatic, intermediate and temporary storage of that information (Art. 13(1)) and for the information stored at the request of a recipient of the service (Art. 14(1)).

[14] See Staff Working Paper footnote 46.

[15] See Staff Working Paper footnote 48.

[16] See Staff Working Paper footnote 49.

[17] See Staff Working Paper footnote 50.

[18] See Staff Working Paper footnote 52.

[19] See Staff Working Paper footnote 58.

[20] See Staff Working Paper page 12.

[21] http://ec.europa.eu/information_society/activities/social_networking/docs/sn_principles.pdf.

[22] See Staff Working Paper footnote 60.

[23] See Staff Working Paper page 13.

[24] See Staff Working Paper page 13 and footnote 63. In terms of fighting online distribution of child sexual abuse material, the Safer Internet Programme focuses on international and European cooperation, in particular by supporting the INHOPE network of hotlines.

[25] See Staff Working Paper page 14.

[26] See Staff Working Paper footnote 65.

[27] See Staff Working Paper footnote 81.

[28] See Staff Working Paper footnote 82.

[29] See Staff Working Paper footnote 83.

[30] See Staff Working Paper footnote 85.

[31] See Staff Working Paper footnote 86.

[32] See Staff Working Paper footnote 87.

[33] See Staff Working Paper footnote 88.

[34] The Safer Internet Programme has commissioned a benchmarking study of the effectiveness of the filtering solutions available in Europe. The first results were published in January 2011. http://ec.europa.eu/information_society/activities/sip/projects/filter_label/sip_bench2/index_en.htm

[35] See Staff Working Paper page 16.

[36] See Staff Working Paper footnote 77.

[37] See Staff Working Paper page 16 and footnote 78.

[38] See Staff Working Paper footnote 76.

[39] See Staff Working Paper pages 20-22 and footnotes 93, 94, 99, 100.

[40] See Staff Working Paper pages 20-22.

[41] See Staff Working Paper footnote 107.

[42] http://www.pegi.info/en/

[43] http://www.pegionline.eu/en/

[44] See Staff Working Paper footnote 118.

[45] See Staff Working Paper pages 24, 25 and footnote 119.

[46] See Staff Working Paper footnote 120.

[47] See Staff Working Paper footnote 128.

[48] See Staff Working Paper footnote 129.

[49] See Staff Working Paper footnote 130.

[50] See Staff Working Paper footnote 131.

[51] See Staff Working Paper footnote 132.

[52] In the sense of 2006 Recommendation, Annex 1 – Indicative Guidelines for the Implementation, at national level, of measures in domestic law or practice so as to ensure the right of reply or equivalent remedies in relation to on-line media.

[53] Staff Working Paper
