Tuesday 25 February 2014

Olivia O'Kane - The curious case of user-generated comment


Online responsibility is governed by the Electronic Commerce (EC Directive) Regulations 2002 (SI 2002 No 2013) and the Defamation Act 1996, which together provide what are commonly known as the safe harbour defences.

[Our original post on the Delfi AS v Estonia judgement here]

LEGAL FRAMEWORK

The following legislation provides that no liability arises unless and until information society service providers (“ISPs”) are put on notice of the material complained of:
1. Pursuant to Regulation 19 of the Electronic Commerce (EC Directive) Regulations 2002, where:
an ‘information society service’ is provided which consists of the storage of information provided by a recipient of the service, the service provider is not liable for damages or for any unlawful activity as a result of that storage where the provider “does not have actual knowledge of unlawful activity or information” and the provider “upon obtaining such knowledge or awareness acts expeditiously to remove or to disable access to the information”.

Directive 2000/31/EC at Article 15 provides that:
“A Member State shall not impose a general obligation on providers… to monitor… nor a general obligation actively to seek facts or circumstances indicating illegal activity”.
Thus, Regulation 19 of the E-Commerce Regulations provides that ISPs are exempt from liability where posts or other information are stored or passed on to users. The exemption applies as long as they are not involved in the creation or editing of the material and as long as the ISP removes the material complained of quickly once notified.

The method of taking down the material complained of is called the “notice and take down procedure”, which, if complied with, will in most cases absolve the ISP of liability.

The Directive defines an information society service as one which is:
“normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”.
Regulation 19 provides a qualified immunity for ISPs in respect of material provided by third parties and which is hosted on the ISP's servers. This immunity, which applies in respect of both civil and criminal liability, is subject to the following conditions:

  • The service provider has no actual knowledge of the content in question. Once the service provider is in receipt of actual knowledge of the illegality, it must act to remove or disable access to the material as quickly as possible;
  • The service provider is not aware of facts or circumstances from which the illegality of the content in question should have been apparent; and
  • The user responsible for providing the content in question was not "acting under the authority or the control of the service provider".

2. Section 1 of the Defamation Act 1996 also provides a safe harbour defence to non-publishers, but again only up to the point of notification.

The defence of 'innocent dissemination' of defamatory material is available to a person who:

  • was not the "author, editor, or publisher" of the defamatory statement;
  • took reasonable care in relation to the publication of the statement in question; and
  • did not know, and had no reason to believe, that the statement in question was defamatory or that he caused or contributed to the publication of a defamatory statement.

Further, under Section 1(3)(e) of the Defamation Act 1996, an intermediary is not considered to be the "author, editor, or publisher" of a defamatory statement:
"if [the intermediary] is only involved as the operator of or provider of access to a communication system by means of which the statement is transmitted, or made available, by a person over whom he has no effective control."
THE COURTS

In Bunt v Tilley and others [2006] EWHC 407 (QB), dealing with a mere conduit, the court held that:
"an ISP which performs no more than a passive role in facilitating postings on the internet cannot be deemed to be a publisher at common law”.
It held that to characterise information as ‘unlawful’, a person would need to know about the strength or weakness of the available defences.

In Google France SARL, Google Inc v Louis Vuitton Malletier SA and others (2010) the Court of Justice of the European Union [“CJEU”], in deciding whether Google's internet referencing service fell within the definition of an ISP, found that:
"an internet referencing service constitutes an information society service consisting in the storage of information supplied by the advertiser".
The CJEU noted that the exemptions from liability established in the Directive:
“cover only cases in which the activity of the information society service provider is ‘of a mere technical, automatic and passive nature’, which implies that that service provider ‘has neither knowledge of nor control over the information which is transmitted or stored’”.
The courts have also held that a mental element (knowing that the material complained of exists pre-notification) is necessary for a person to be fixed with responsibility at common law. This need for “knowing involvement in the process of publication of the relevant words” arises because, without it, the person would not know that the controversial material was unlawful.

In Kaschke –v- Gray & Hilton [2010] EWHC 690 (QB), the court held:
“First there is no reason in principle why the operation of a chat room should be incapable of falling within the definition of the provision of an information society service consisting of the storage of information. Thus in principle there is no reason why it should not be an activity intended to be protected by Article 14 of the E-Commerce Directive and eligible for the exclusion of liability conferred by Regulation 19. 
Second it is not necessarily a bar to entitlement to the protection conferred by Regulation 19 (which Member States were intended to provide by Article 14) that the provider of an information society service consisting of storage of information is also engaged in an activity on the same website which is either not an information society service or if it is which does not consist of the storage of information. 
The law acknowledges that storage of user generated content is capable of attracting the Regulation 19 protection where that element is not moderated.”
Under UK/EU law there is no general obligation on a host or ISP to monitor user-generated content (UGC). But under the notice and take down procedure there is an obligation to comply with court injunctions and to respond to letters of complaint.

This has led to an assumption of liability for those who moderate and monitor online content, since moderation can fix them with actual knowledge of, and therefore responsibility for, the unlawful material.

In AB –v- Facebook Ireland Limited [2013] NIQB the High Court in Belfast recognised that Facebook is not a publisher in law; the case against Facebook was dismissed and libel damages and costs were awarded in favour of the claimants against the Facebook users only.

Liability will therefore depend on who has created the online content. We are now all potential publishers. The author or publisher of online content can be a private individual, a commercial business or a mainstream media outlet.

For those merely hosting user-generated material without moderation or editing, there is no liability before notice of the unlawful material is received; once on notice, however, there is an obligation to take it down within a reasonable period of time.

The Court of Appeal held in Tamiz v Google Inc [2013] EWCA Civ 68 that five weeks between notice and take down was a reasonable period of time. It also held that prior to notification there could not be primary or secondary liability for defamatory content.

In J19 & J20 –v- Facebook Ireland [2013] NIQB 113 the High Court in Belfast chose to depart from the “robust” Strasbourg approach in Delfi AS v Estonia (Application No. 64569/09) to service providers and their liability for comments hosted on their sites. The Belfast judge considered that such liability was not consonant with the EC Directive on E-Commerce.

STRASBOURG DEPARTS FROM THE EUROPEAN UNION

In Delfi the Strasbourg Court “robustly determined” that an internet news portal could be liable for offensive UGC comments which were posted below a news article. The news article reported the decision of an Estonian ferry company to change the route of a ferry line.

The member state courts in Estonia had found that Delfi should have prevented clearly unlawful comments from being published in the portal’s comment section, even though Delfi had taken down the offensive comments as soon as it had been notified about them. The user-generated comments were online for about six weeks before the complaint was made.

When Delfi lodged its complaint with the European Court, the Court concluded unanimously that the domestic courts’ findings were a justified and proportionate restriction of Delfi’s right to freedom of expression.

The Strasbourg court considered that Delfi could have foreseen that its story would spark fierce debate, including offensive UGC, and that Delfi could have been more proactive in ensuring that appropriate monitoring tools would be available if necessary. It held that, by allowing non-registered users to post comments, Delfi assumed a certain responsibility for those users and that it:
“could have realised that it might cause negative reactions against the shipping company and its managers with a higher than average risk that the negative comments could go beyond the boundaries of acceptable criticism and reach the level of gratuitous insult or hate speech.”
The Court therefore found that the prior automatic filtering of certain “vulgar words”, together with the notice and take down system that was in place, was insufficient to protect third party rights.

The decision effectively defines Delfi as a publisher of user-generated content because it had control over the comments section of the website, in spite of the take down procedure in place. The Strasbourg court considered that putting the onus on complainants to identify the authors of the anonymous content was a disproportionate burden, whereas the burden on Delfi was, in its view, more proportionate.

The Strasbourg Court appears to have ignored European Union law and failed to appreciate the purpose and benefit of the E-Commerce Directive which gives an incentive to platforms such as Delfi to remove content upon notice of its illegality in exchange for immunity from liability. If these platforms fail to take action upon notice, then they could face liability for the content. Delfi removed the material on the same day that it had received the complaint and thus should have been protected by the safe harbour defence.

As discussed above, had this case been heard in the UK courts, the safe harbour defence would have afforded Delfi immunity from liability as an information storage service provider and would have exempted it from liability for the six weeks during which it had no knowledge of the material complained of.

The decision in Delfi is not only a blow to freedom of expression but one that begs to be challenged before the Grand Chamber.

In J19 and Another v Facebook Ireland [2013] NIQB 113 the Honourable Mr Justice Gillen considered Delfi and said:
“This case may well be fact sensitive and indeed subject to an appeal to the Grand Chamber… Re-reading this case I experienced a sense of shrinking relevance to the instant case. It is distinguishable from the facts now under consideration if for no other reason than that the parties responsible were identifiable.
It is perhaps also important to appreciate that in the Delfi case the E-Commerce Regulations 2002 did not play a part… The ECHR ruling does not in my view therefore suggest that such an interpretation of the EU Directive is incorrect. It is still up to the national laws of a country to decide if a company is liable in the first place and the European Court of Justice remains an alternative avenue for appeal.”
The decision has generated widespread criticism, and Delfi’s application for a referral to the Grand Chamber has been supported by a coalition of 69 organisations including the MLDI, Google, Forbes, News Corp., Thomson Reuters, the New York Times, Bloomberg News, Guardian News and Media, the World Association of Newspapers and News Publishers, Condé Nast, the European Newspaper Publishers’ Association, the European Publishers Council, Greenpeace, the Center for Democracy and Technology, ARTICLE 19 and national media outlets and journalists’ associations from across Europe.

See letter to the President of the Court here.


Olivia O’Kane (@OliviaOKane1) is a contributor at Media Law NI and a specialist media lawyer at Belfast solicitors Carson McDowell. Earlier posts featuring Olivia include a discussion on the key changes made by the Defamation Act 2013 here. On Justice Horner's comments on Facebook here. On Internet Service Providers and User Generated Content here. Olivia's roundup of media law in 2013 here. Comments on Northern Ireland libel laws in the Ulster Business magazine here. See Olivia in the Inforrm blog here. Read an interview with Olivia on social media for lawyers here. Olivia is also listed on Ireland's legal tweeters on Defero Law here. On Twitter here.
