Chapter 3
To Block or Not to Block: European Approaches to Content Regulation, and Implications for Freedom of Expression

Yaman Akdeniz


New media have historically faced suspicion and been liable to excessive regulation, as they spark fear of potential detrimental effects on society. This proved true of the publication and transmission of sexually explicit content1 through the printing press,2 the telegraph, telephone, post, cinema,3 theatre,4 radio, television, satellite and video.5 During the 1990s, as attention turned to the Internet, the widespread availability of sexually explicit content stirred up a moral panic6 shared by law enforcement agencies and large sections of the media.7 Subsequently, during the 2000s, attention turned to the development of peer-to-peer networks and the illegal distribution of digital copyrighted content. During the early 2010s, this was followed by the development and widespread use of Web 2.0 based technologies and services such as YouTube, Facebook and Twitter, and the impact of these user-driven services on political movements and speech. The dissemination of so-called state secrets on influential websites such as Wikileaks caused major state-level embarrassment across the globe.

The regulation of Internet content, whether illegal, harmful, offensive, morally objectionable, desirable or undesirable, consequently became a key focus of governments, supranational bodies and international organizations across the globe.

Today, many European states are reacting to the availability and dissemination of certain types of content through the Internet, and there is major concern about the availability of sexually explicit content including child pornography,8 racist content,9 hate speech and terrorist propaganda on the Internet.10 This chapter will provide an overview of policies and recommendations adopted at the pan-European level, covering both the European Union and the Council of Europe, with regard to illegal and harmful Internet content.

The governance of illegal, as well as harmful (falling short of illegality), Internet content may differ from one country to another, and variations exist within the pan-European region.11 Harm criteria remain distinct within different jurisdictions, albeit within the same European region, with individual states deciding what is legal and illegal on the basis of cultural, moral, religious, historical and legal differences. The Council of Europe member states have a certain margin of appreciation in assessing whether a ‘pressing social need’ exists to introduce speech-based restrictions into their national laws, in accordance with the Article 10 principles of the European Convention on Human Rights as laid down by the jurisprudence of the European Court of Human Rights.12

Furthermore, harmonization efforts to combat illegal Internet content, even for universally condemned content such as child pornography, have been protracted and are ongoing.13 Efforts to harmonize laws to combat racist content on the Internet have proved even more problematic.14 The availability on the Internet of content glorifying violence, of terrorist propaganda,15 and of content which may encourage terrorist activities,16 such as bomb-making instructions, has also triggered policy action at the international level, and new laws and policies are being developed to combat the availability of such content. At the same time, the music industry is pressuring governments and international organizations to address the problem of Internet piracy and the distribution of pirated content through the Internet. User access blocking policies have been developed in several countries, including France and the United Kingdom.

However, the major question faced by international organizations and state-level regulators is how to control the free flow of content deemed illegal or harmful over the Internet. As mentioned above, the question is made even more complex by the fact that different political, moral, cultural, historical and constitutional values exist between different states. For example, while countries such as Germany, France and Austria criminalize denial of the Jewish Holocaust, not all the member states of the European Union or the Council of Europe criminalize such conduct or content. In 2008, Fredrick Töben, a well-known Holocaust denier from Australia, was arrested at Heathrow Airport in London because of a pending European arrest warrant issued by the German authorities. The warrant was issued because of Töben’s writings on his website denying the existence of the Jewish Holocaust.17 Töben was subsequently released after the German government gave up its legal battle to extradite him from Britain.18 This example shows that certain acts may be regarded as morally wrong or offensive by many, or be illegal in some countries, but nevertheless are not criminalized within certain jurisdictions. Evidently, differences exist between states even within the pan-European region.

Content that was once within the strict boundaries and control of individual states, whether through paper-based publications (such as pamphlets, local papers and even books), audio-visual transmissions limited to a particular area (such as local radio), or public demonstrations and talks, now, with its digital transmission and availability through the Internet, respects neither national rules nor boundaries. The increasing popularity of user-driven interactive Web 2.0 applications and services such as YouTube, Facebook and Twitter seems to eliminate virtual Internet borders even further. This undoubtedly complicates efforts to find an appropriate balance between the right to freedom of opinion and expression, including the right to receive and impart information, and the prohibition of certain types of content deemed illegal by state authorities or international organizations.

Typically the stance taken by governments is that what is illegal and punishable in offline form must be treated equally online. There are, however, several features of the Internet which fundamentally affect approaches to its governance, and while rules and boundaries still exist, enforcing the existing rules against digital content is evidently problematic on the Internet.

Given the limited effectiveness of state laws and the lack of harmonization at the international level (despite some efforts at the regional level19), a number of states (including in the pan-European region) started to introduce policies to block access to Internet content and websites deemed illegal which are situated outside their legal jurisdiction. In short, the new trend in Internet regulation seems to be to block access to content if state authorities are not in a position to reach the perpetrators for prosecution, or if their requests for removal or take-down of such content are rejected or ignored by hosting or content providers outside their jurisdiction.

However, blocking policies are not always subject to due process principles, decisions are not necessarily taken by courts of law, and often administrative bodies or Internet hotlines run by the private sector decide which content or websites should be subject to blocking. Blocking policies often lack transparency, and the administrative bodies involved lack accountability. Therefore, the compatibility of blocking measures with the fundamental right to freedom of expression is increasingly being questioned.

This chapter will assess the development and implementation of blocking policies at the pan-European level. The adoption of blocking policies by certain member states of both the EU and the CoE will be used to highlight the above-mentioned problems associated with blocking.

European Union perspectives on content regulation

Content-related problems have been divided into two categories by the European Commission ever since 1996: illegal content and harmful content. The European Commission, in its October 1996 communication on Illegal and Harmful Content on the Internet, stated that:

These different categories of content pose radically different issues of principle, and call for very different legal and technological responses. It would be dangerous to amalgamate separate issues such as children accessing pornographic content for adults, and adults accessing pornography about children.20

The European Commission developed an Action Plan for a safer use of the Internet in 1998,21 and suggested that ‘harmful content needs to be treated differently from illegal content’. Tackling these two distinct categories separately was a welcome move by the European Commission, even though what is illegal, or for that matter harmful, was predominantly a decision for the Member States. That is why these categories were never clearly defined by the Commission in its Action Plan.

Usually, the difference between illegal and harmful content is that the former is criminalized by national laws, while the latter is considered offensive, objectionable, unwanted or disgusting by some people but is generally not criminalized by national laws. While child pornography can be regarded as a clear example of illegal content, Internet content that may be labelled ‘harmful’, especially with regard to children’s access, includes sexually explicit material, graphically violent material, and content advocating illegal activity such as drug use, bomb making, or underage drinking and gambling. Certain strong or extreme political or religious views are also regarded as harmful by many state regulators. Furthermore, states differ in the labelling or categorization of certain types of content: content advocating hate or racist views, and content involving terrorist propaganda, for example, are treated differently by different states.

Therefore, harm is a criterion which depends upon cultural differences and this is accepted within the jurisprudence of the European Court of Human Rights.22 It should however be noted that in the Handyside23 case the European Court of Human Rights confirmed that freedom of expression extends not only to ideas and information generally regarded as inoffensive but even to those that might offend, shock, or disturb. As established in Castells v. Spain, ‘such are the demands of that pluralism, tolerance or broadmindedness without which there is no democratic society’.24 In Editorial Board of Pravoye Delo and Shtekel v. Ukraine, the European Court stated that:

… the Internet is an information and communication tool particularly distinct from the printed media, in particular as regards the capacity to store and transmit information. The electronic network serving billions of users worldwide is not and potentially cannot be subject to the same regulations and control. The risk of harm posed by content and communications on the Internet to the exercise and enjoyment of human rights and freedoms … is certainly higher than that posed by the press.25

Nevertheless, the availability of harmful Internet content is a politically sensitive area and a cause for concern for European regulators.

European Union Perspectives on Blocking Access to Illegal Content

With regard to illegal Internet activity, the European Union developed a separate policy on combating child pornography through a Council Framework Decision,26 and addressed conduct-related criminal activity through the Council Framework Decision on Attacks against Information Systems.27 The EU addressed the liability of Internet Service Providers carrying or providing access to illegal content through its E-Commerce Directive.28

Furthermore, on 28 November 2008, the Framework Decision on combating racism and xenophobia was adopted by the Council of the European Union. The Framework Decision is designed to ensure that racism and xenophobia are punishable in all member states by effective, proportionate and dissuasive criminal penalties. It deals with such crimes as incitement to hatred and violence, and publicly condoning, denying or grossly trivializing crimes of genocide, crimes against humanity and war crimes.29 The crimes covered within the Framework Decision also apply to the Internet, and the Member States of the European Union had until 28 November 2010 to transpose30 the Framework Decision into national law.

Furthermore, the European Union’s May 2006 revised Action Plan on Terrorism31 included the development of policies and measures to detect misuse of the Internet by extremist websites, and to enhance co-operation between States against terrorist use of the Internet. This initiative involved the development of a new strategy for combating radicalization, as it is believed that increasing amounts of terrorist propaganda, and of information which could be useful to terrorists, will be produced and distributed over the Internet.32

Within this context the EU considered ‘adopting legal measures obliging Internet service providers to remove or disable access to the dissemination of terrorist propaganda they host’,33 but this policy option was ruled out during the Impact Assessment work done by the European Commission with regard to the proposal for a Council Framework Decision amending Framework Decision 2002/475/JHA on combating terrorism.34 The Commission also ruled out ‘encouraging blocking through the industry’s self-regulation or through agreements with industry, without the previous adoption of legal measures outlawing the dissemination of terrorist propaganda and terrorist expertise’.35

The European Commission cited ‘the issue of the speedy re-apparition of websites that have been closed down’ as the main reason for not recommending a blocking policy. The Commission argued that blocking policies are ineffective, as in most cases blocked websites reappear under another name outside the jurisdiction of the European Union in order to avoid being closed down or blocked once more.36 The Commission also acknowledged that existing methods of filtering can be circumvented,37 and that they are designed specifically for websites and are not capable of blocking the distribution of objectionable content through other Internet services such as P2P networks.

The European Commission in its assessment concluded that the removal or disablement of access to terrorist propaganda or terrorist expertise by Internet service providers hosting such information, without the possibility of opening an investigation and prosecuting those responsible for such content, appears inefficient. The Commission reached the conclusion that the dissemination of such content would only be hindered rather than eliminated.38 The Commission stated that:

the adoption of blocking measures necessarily implies a restriction of human rights, in particular the freedom of expression and therefore, it can only be imposed by law, subject to the principle of proportionality, with respect to the legitimate aims pursued and to their necessity in a democratic society, excluding any form of arbitrariness or discriminatory or racist treatment.39

The European Commission also expressed concern with regard to the cost of implementing blocking and filtering systems at the ISP level, and concluded that the implementation of such a system would have a direct economic impact not only on ISPs but also on consumers.40 Furthermore, partially declassified documents in relation to the secretive EU Check the Web project, which monitors the use of the Internet for terrorist purposes through Europol, suggest that ‘Member States will not be obliged to monitor, interrupt or shut down specific Internet sites’ in the fight against terrorist use of the Internet.41 Therefore, blocking access to websites and Internet content is not a common policy adopted within the European Union region, and there are no EU policies actively encouraging blocking access to websites.42

Blocking considered by the EU with regard to combating child pornography

However, although no such general policy exists at the EU level, more recently, in amending its policy framework for combating the sexual abuse and sexual exploitation of children and child pornography, the European Commission proposed EU-wide mechanisms to block access from the Union’s territory to Internet websites identified as containing or disseminating child pornography, especially where the original materials are not located within the EU Member States’ jurisdiction.43 The Draft Council Framework Decision, through Article 18, stated that Member States would take the necessary measures to enable the competent judicial or police authorities to order or similarly obtain the blocking of access by Internet users to Internet websites containing or disseminating child pornography, subject to adequate safeguards.

While this blocking measure was proposed, a European Commission Staff Working Document referred to the risks of blocking access to content without a legal basis, and recommended that, in order to respect fundamental rights such as freedom of expression (Article 10 ECHR, Article 11 of the EU Charter), an interference with such an important right needs to be prescribed by law and be necessary in a democratic society.44 The Staff Working Document argued that the ‘proportionality of the measure would be ensured, as the blocking would only apply to specific websites identified by public authorities as containing such material’.45 On the one hand, the requirement of ‘a legal basis’ for blocking is a desired and welcome move from a European Convention on Human Rights point of view. On the other hand, it highlights the fact that many hotlines within the European Union region currently operate and order the blocking of access to content and websites without a legal basis, triggering legitimate accusations of vigilantism. These fundamental concerns needed to be addressed by the European Union before adopting a blocking policy, even one applying only to content involving child pornography.

Having said that, the above-mentioned European Commission Staff Working Document further stated that ‘there is also a risk, depending on the technology used, that the systems in place may occasionally block legitimate content too’,46 which raised further proportionality concerns.

In February 2011, the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (LIBE) adopted a text47 in response to the European Commission’s proposal on Internet blocking.48 According to the amendments made by LIBE, ‘child pornography or child abuse material on the web must be removed at the source in all EU countries’.49 The Committee therefore did not recommend the ‘mandatory blocking’ of websites containing child pornography, but rather took the position that such content should be taken down entirely. However, where removal is impossible, e.g. because websites are hosted outside the EU jurisdiction, because the state that hosts the servers in question is unwilling to co-operate, or because its procedure for removing the material from servers is particularly long, Member States ‘may take the necessary measures in accordance with national legislation to prevent access to such content in their territory’.50 This means that EU Member States may, if necessary, decide to introduce measures involving blocking. National measures preventing access ‘must be set by transparent procedures and provide adequate safeguards, in particular to ensure that the restriction is limited to what is necessary and proportionate, and that users are informed of the reason for the restriction’.51 Content providers and users must also be informed of the possibility of appeal, and of to whom to appeal, under a judicial redress procedure. It is important to mention that, according to the Committee, the EU must also co-operate with third countries to secure the prompt removal of such material from servers hosted in those countries.

The European Parliament voted on 27 October 2011, and adopted a compromise amendment to the initial proposal. The adopted amendment corresponds to what was agreed between the three European institutions (the Council, the European Parliament and the Commission).52 The amended version of the blocking measure is provided below:

Article 25: Measures against websites containing or disseminating child pornography: (1) Member States shall take the necessary measures to ensure the prompt removal of webpages containing or disseminating child pornography hosted in their territory and to endeavour to obtain the removal of such pages hosted outside of their territory. (2) Member States may take measures to block access to webpages containing or disseminating child pornography towards the internet users within their territory. These measures must be set by transparent procedures and provide adequate safeguards, in particular to ensure that the restriction is limited to what is necessary and proportionate, and that users are informed of the reason for the restriction. Those safeguards shall also include the possibility of judicial redress.53

The new Directive 2011/92/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, replacing Council Framework Decision 2004/68/JHA, was published in the Official Journal of the European Union on 17 December 2011.54 Member States must bring into force the laws, regulations and administrative provisions necessary to comply with the Directive by 18 December 2013. The European Commission must, by 18 December 2015, submit a report to the European Parliament and the Council assessing the extent to which the Member States have taken the necessary measures to comply with the Directive, accompanied, if necessary, by a legislative proposal.

EU Member States Blocking Practices

Not only does a general blocking policy not exist at the European Union level; a general monitoring obligation also cannot be imposed upon ISPs, by virtue of Article 15 of the EU E-Commerce Directive.55 In L’Oréal the European Court of Justice stated that:

it follows from Article 15(1) of Directive 2000/31, in conjunction with Article 2(3) of Directive 2004/48, that the measures required of the online service provider concerned cannot consist in an active monitoring of all the data of each of its customers in order to prevent any future infringement of intellectual property rights via that provider’s website.56

Despite there being no EU-wide general blocking policy, nor an obligation to monitor at the ISP level, EU Member States started to develop their own blocking policies and practices. In Germany, a general blocking policy with regard to child pornography was considered by the parliament. However, in April 2011, after almost a year of discussions, the German government decided that removal, rather than access blocking, would be the policy used to tackle the problem of online child pornography.57 During the discussions there were concerns that merely blocking material could open the door to wider censorship of the Internet.58 It should also be noted, however, that during 2002 North Rhine-Westphalia, Germany’s most populous state, issued a blocking order to prevent German-based ISPs from providing access to websites based outside Germany (mainly in the US) that host racist and neo-Nazi content.59 The blocking order affected approximately 76 ISPs within that region.60 Although there have been legal cases and appeals surrounding the blocking orders, a number of administrative courts have ruled that the German authorities can continue to ask ISPs to block such websites. Between 2002 and 2004 the Düsseldorf District Administration issued 90 ordinances against Internet providers in North Rhine-Westphalia, forcing them to block access to certain websites with right-wing extremist content. More recent statistics are not available, but the German authorities continue to issue blocking orders; however, as highlighted above, the effectiveness of such orders was questioned in the European Commission’s assessment of blocking as a policy.

Furthermore, the German authorities, through the hotline Jugendschutz.net, monitor the extent of criminal Internet activity with regard to propaganda crimes.61 In 2004, Jugendschutz.net’s activities resulted in the blocking or removal of 131 extreme right-wing websites.62 During 2007 the organization contacted service and content providers 252 times, and was successful in securing the closure of websites and the take-down of content on 232 occasions.63 In 2008, it encountered 1,707 right-wing extremist websites online and more than 1,500 illegal videos and profiles with extreme right-wing content, and claimed success in the removal of content in 80 percent of all illegal cases.64

Germany is not the only EU Member State where blocking has been used as a measure. In the United Kingdom, British Telecom (‘BT’), in partnership with the Internet Watch Foundation (‘IWF’), developed the CleanFeed Project65 in late 2003. This followed the decision of the IWF to assist its subscribing ISP members in filtering potentially illegal content from their client services through the use of the Child Abuse Images URL service.66 The CleanFeed Project is aimed at blocking access to any images or websites within the IWF database that contain child pornography. Customers of BT (and of other UK ISPs that use the system) are prevented from accessing the blocked content and websites. In 2006, BT claimed that its CleanFeed system was blocking 35,000 attempts every day to access websites known to carry child pornography.67 According to the Register, ‘the announcement caused a flurry of media attention raising fears that thousands of people in the UK are trying to access illegal child abuse websites’.68 Subsequently, the British government attempted to ensure that all British ISPs offering broadband Internet connectivity to the UK public would use the CleanFeed system to restrict access to the sites identified by the IWF by the end of 2007.69 At present, the use of such a blocking system by British ISPs is voluntary and there is no legal requirement to implement such a system, but it is estimated that the ‘biggest UK ISPs who between them provide over 90 percent of domestic broadband connections are either currently blocking or had plans to implement blocking systems by the end of 2006’.70 The government has hinted that it will review ‘the options for stopping UK residents accessing websites on the IWF list’71 if ISPs elect not to use blocking systems.

In December 2008, the Internet Watch Foundation blocked access to Wikipedia from the UK because of a single image (which had been available on the Internet for years) involving the cover of an album called Virgin Killer by the German heavy metal band Scorpions.72 Even though the IWF revoked its decision after five days, following an appeal by the Wikimedia Foundation,73 the incident highlighted problems associated with this kind of privatized, unaccountable regulation. It is often forgotten that the IWF does not have the legal authority to require ISPs to block access to such content, and ISPs were not obliged to comply with the decision to block access to Wikipedia. It is therefore surprising that no British ISP challenged the decision.

Similarly, during January 2009, it was reported that access to the Internet Archive (Wayback Machine),74 which includes around 85 billion pages, had been blocked from the UK following a blocking decision issued by the IWF.75 The incident was described as a technical hitch, but it once again raised questions about the IWF and the lack of due process principles or legal basis for the issuing of blocking decisions at the UK level.

In terms of copyright infringement, in April 2010 the Chancery Division of the High Court in Twentieth Century Fox Film Corp and others v Newzbin Ltd76 held that the claimants would be granted the injunction sought, since the defendant’s website service had infringed the claimants’ copyrights. The service had authorized acts of infringement, had entered into a common design to infringe with its members, and had communicated the claimants’ copyright works to the public. Subsequently, on 28 July 2011, the High Court of Justice decided that77 British Telecom (‘BT’) should block access to the website known as Newzbin2, its domains and sub-domains, and any other IP address or URL whose sole or predominant purpose is to enable or facilitate access to the Newzbin2 website. The Court reached its decision by reference to section 97A of the Copyright, Designs and Patents Act 1988.78 The technical details of the blocking measure were adopted in a subsequent judgment on 26 October 2011:79

(i) IP address re-routing in respect of each and every IP address from which the said website operates and which is notified in writing to the Respondent by the Applicants or their agents; and (ii) DPI-based80 URL blocking utilizing at least summary analysis in respect of each and every URL available at the said website and its domains and sub-domains and which is notified in writing to the Respondent by the Applicants or their agents.81

In February 2012, the High Court in Dramatico Entertainment Ltd & Ors v British Sky Broadcasting Ltd & Ors82 ruled that the operators of the infamous Pirate Bay (TPB) website induce, incite or persuade their users to commit infringements of copyright, and that they and the users act pursuant to a common design to infringe. The Court also ruled in this regard that ‘the operators profit from their activities’.83 Therefore, in May 2012, in Dramatico Entertainment Ltd & Ors v British Sky Broadcasting Ltd & Ors,84 the High Court ordered the blocking of the Pirate Bay website from within the UK through the servers of six ISPs: British Sky Broadcasting Limited, British Telecommunications PLC, Everything Everywhere Limited, Talktalk Telecom Group PLC, Telefónica UK Limited and Virgin Media Limited. The Court also ruled that IP address blocking, in addition to DNS blocking, was appropriate in this case, as the Pirate Bay website’s IP address is not a shared one.

In France, since March 2011, it has been possible to require ISPs to block websites containing child pornographic content under the LOPPSI law (Law on Guidelines and Programming for the Performance of Internal Security).85 A blacklist of websites, which is not made public, is drawn up by the administration; ISPs in turn are required to block access to these sites.

In Finland, the Act on Measures to Restrict the Distribution of Child Pornography,86 in force since January 2007, restricts access to child pornography on the Internet through confidential blacklists prepared by the police. Since early 2008, some ISPs have decided to use the police-maintained blacklist to block access to websites containing child pornography. In June 2011, it was reported that the Helsinki Administrative Court had ruled that domestic websites may not be placed on the blocking blacklist maintained by the police.87 The administrative court action started in February 2008, when a website was included on the police-maintained blacklist, where it has remained ever since. The owner of that website discovered a large part of the blacklist and circulated his findings on the site.88 The findings revealed that the top five Google search results for ‘gay porn’ were all blacklisted, even though there was nothing related to children on those sites. The World Wide Web Consortium’s website89 and the memorial page of a deceased Thai princess were among the blacklisted websites. The police, however, accused the website owner of distributing child pornography and eventually put his website on the blacklist. The website owner therefore lodged an appeal with an administrative court, because his website had been blacklisted without a valid legal basis.

In Italy, the competent judicial authority can order the seizure, either to prevent an offence or to collect evidence, of a website with illegal content or which is used to commit an offence.90 Since 2006, online gambling has been permitted only via state-licensed websites, and ISPs are required to block access to international or unlicensed gambling sites identified on a blacklist compiled by the Autonomous Administration of State Monopolies (AAMS).91 As of May 2012, access to 4,025 gambling sites was blocked from Italy. The authority operates transparently, making the updated list of blocked gambling websites available through its website.92 A similar blacklist of known child pornography websites has been maintained by the National Center for the Fight against Child Pornography on the Internet within the Postal and Communications Police Service93 since February 2006.

In Norway, access to websites depicting child sexual abuse has been blocked at ISP level since 2004, with the establishment of the Child Sexual Abuse Anti Distribution Filter (CSAADF). The law enforcement agency NCIS Norway evaluates and verifies illegal sites and provides a list of domains to the ISPs based on a written agreement between NCIS Norway and the ISPs, developed by the Internet Service Providers Association of Norway.94 The number of websites subject to blocking varies between 800 and 1,200 at any given time. With regard to Internet piracy, a Norwegian District Court ruled in November 2009 that there were no grounds for ordering the Norwegian ISP Telenor to block access to the popular Pirate Bay website. The Court of Appeal rejected an appeal filed by the music and film industry in February 2010, holding that Telenor did not unlawfully contribute to the infringement of copyright by providing access to the Pirate Bay website.95

It should also be noted that secret ‘child pornography’ blocking blacklists from Finland,96 Denmark97 and Italy98 (as well as from China,99 Thailand100 and Australia101), leaked and published on Wikileaks, have demonstrated that the majority of these lists (Denmark’s apart) include non-child-pornographic content, including political content. The secrecy and unaccountability of such private hotlines therefore often lead to the blocking and censorship of content which is not deemed to be illegal. In the absence of openness and transparency in the work of the hotlines, and given the secrecy surrounding the blocking criteria and the lists of blocked websites, concerns about the work of such organizations will persist.

Blocking and Filtering Related Litigation at the European Court of Justice Level

Furthermore, policies and practices adopted in a number of European countries have resulted in legal challenges before the courts.

In Belgium, the courts may, under national legislation, issue an order for any infringement of an intellectual property right to be brought to an end. In particular, the legislation provides that, where a third party uses the services of an intermediary to perpetrate an infringement of that type, the courts are authorized to issue such an order against that intermediary. The Société belge des auteurs compositeurs et éditeurs (Sabam) applied for interim relief against Scarlet Extended SA, an Internet Service Provider.102 Sabam sought first of all a declaration that the copyright in musical works contained in its repertoire had been infringed because of the unauthorized sharing, through the use of Scarlet’s services, of music files – in particular, by means of peer-to-peer software. Sabam also sought an order requiring Scarlet to bring such infringements to an end, on pain of a penalty payment, by blocking or making impossible the sending or receiving by its customers, in any way, of files containing a musical work by means of peer-to-peer software without the permission of the copyright holders. By a judgment of 26 November 2004, such copyright infringements were found to have taken place. After a report had been obtained from a technical expert, Scarlet was ordered, by another judgment delivered on 29 June 2007, to bring those copyright infringements to an end by making it impossible for its customers to send or receive in any way, by means of P2P software in particular, files containing a musical work in Sabam’s repertoire, and to do so within a period of six months, on pain of a penalty payment of €2,500 per day should Scarlet fail to comply with the judgment. Scarlet appealed against that judgment to the Court of Appeal in Brussels, which had to decide whether to uphold the measure adopted against Scarlet.
In that context, the Court of Appeal sought a ruling from the Court of Justice on whether European Union law and, in particular, the fundamental rights guaranteed by the Charter of Fundamental Rights permit a national court to order an ISP to install a system for filtering and blocking electronic communications.103

In its ruling of 24 November 2011,104 the European Court of Justice stated that EU law precludes an injunction imposed by a national court which requires an Internet Service Provider to install a filtering system applying indiscriminately to all its customers, exclusively at the expense of the ISP, with a view to preventing the illegal downloading of files containing a musical, cinematographic or audio-visual work in respect of which the applicant claims to hold intellectual-property rights. According to the Court, an injunction requiring the ISP to install the contested filtering system would oblige it actively to monitor all the data relating to each of its customers in order to prevent any future infringement of intellectual-property rights. It follows that the injunction would require the ISP to carry out general monitoring, something which is prohibited by Article 15(1) of Directive 2000/31.105

Another case that reached the European Court of Justice from Belgium involved Sabam, the Belgian management company which represents authors, composers and publishers of musical works, as in the above-mentioned case, and Netlog NV, which runs an online social networking platform based in Belgium. Users of this service can also share audio-visual files through their own profile pages. Sabam claimed that music files from its repertoire were distributed through Netlog’s platform without its consent and without Netlog NV paying a licence fee. In June 2009, Sabam applied to the President of the Court of First Instance of Brussels (Belgium) for an order requiring Netlog immediately to cease unlawfully making available musical or audio-visual works from Sabam’s repertoire and to pay a penalty of €1,000 for each day of delay in complying with that order. Netlog submitted that granting Sabam’s injunction would be tantamount to imposing on it a general obligation to monitor, which is prohibited by the E-Commerce Directive. In those circumstances, the Court of First Instance of Brussels made a reference for a preliminary ruling to the Court of Justice, asking whether European Union law precludes a national court from issuing an injunction against a hosting service provider, such as the owner of an online social network, requiring it to install a system for filtering information stored on its servers by its service users, which applies indiscriminately to all of those users, as a preventive measure, exclusively at its expense and for an unlimited period.

The European Court of Justice ruled in February 2012 in the case of Sabam v. Netlog NV106 that the owner of an online social network cannot be obliged to install a general filtering system, covering all its users, in order to prevent the unlawful use of musical and audio-visual works. According to the Court, such preventive monitoring would require the owner of the social network actively to observe the files stored by its users. The filtering system would accordingly require the owner to carry out general monitoring of the information stored on its servers, something which is prohibited by the E-Commerce Directive. The European Court of Justice therefore held that the injunction would result in a serious infringement of Netlog’s freedom to conduct its business, since it would require Netlog to install a complicated, costly, permanent computer system at its own expense. The Court was also concerned that the injunction could potentially undermine freedom of information, since the filtering system might not distinguish adequately between unlawful and lawful content, with the result that its introduction could lead to the blocking of lawful communications.

Notice Based Liability System
