The Grand Chamber of the Court of Justice of the European Union (‘CJEU’) has clarified whether a search engine or the person making a right to be forgotten request has the burden of proving the inaccuracy of the information in contested Internet search results. The question of whether preview images (so-called ‘thumbnails’) should be removed was also examined by the court.
The individuals involved in the case were anonymised as TU and RE. TU is a member of the board of directors and the shareholder of an investment company. RE was TU’s cohabiting partner and held a role in one of the subsidiary companies. In 2015, three articles were published on a website (anonymised to ‘g-net’ in the ruling). The articles criticised the investment model of TU’s companies and showed photographs of TU driving a luxury car, in a helicopter and in front of an airplane. There was also a photograph of RE in a convertible car.
The website was operated by G-LLC, an organisation registered in the United States, which stated that its aim was to investigate fraud. There were various publications that accused G-LLC of deliberately publishing negative articles about companies and then attempting to blackmail those companies by offering to delete the articles for payment. TU and RE made right to be forgotten requests to Google LLC, asking for the articles to be de-linked from searches against their names, and also for the thumbnail photographs of them to be removed.
Google refused on the basis that the articles and photographs were available in a professional context and that the alleged inaccuracy had not been proven. On appeal, Germany’s Federal Court of Justice made a request for a preliminary ruling on the interpretation of Article 17(3)(a) of the GDPR. Specifically, whether it is for the data subjects to prove the alleged inaccuracy and the extent to which they must provide evidence to make their request for removal.
Article 17(3)(a) of the GDPR provides an ‘exemption’ from a right to be forgotten request where the processing of the personal data is necessary for the exercise of the right of freedom of expression and information. There are a number of Articles in the Charter of Fundamental Rights of the European Union (‘the Charter’) that must be considered. Article 7 provides for the right to respect for private life. Article 8 provides the right to protection of personal data. Article 11 confers a right to freedom of expression and information. Article 16 provides the right to freedom to conduct a business in accordance with Union law.
The Grand Chamber reiterated the findings in Google Spain and Google (C-131/12, EU:C:2014:317) that a search engine is distinguishable from the original publisher, plays a “decisive role in the overall dissemination of those data” and can enable an Internet user to establish a “detailed profile of [a] data subject”. In this respect, a search engine can significantly affect – beyond the effect of the publication by the original publisher – the fundamental rights to privacy and protection of personal data.
The Grand Chamber pointed out that where the controller is not the publisher, the necessary weighing-up exercise is the consideration of Articles 7 and 8 against Articles 11 and 16 of the Charter. It noted that in accordance with Article 52(3) of the Charter, Article 7 is the direct equivalent of Article 8 of the European Convention on Human Rights (‘ECHR’). Article 11 is likewise interpreted in the same way as Article 10 ECHR. As such, the case law of the European Court of Human Rights (‘ECtHR’) is applicable in determining the balance between Articles 7 and 11 of the Charter.
The ECtHR case of Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland was cited as providing a number of relevant criteria: the contribution to a debate of public interest, how well known the data subject is and his or her past reputation, the subject of the news article, the content and form of the publication, the publication’s consequences, the manner and circumstances in which the information was obtained, and its veracity. Of note is that this balancing exercise, and therefore the assessment of these criteria, does not occur at all where the information is inaccurate, because the right to inform and be informed does not include the right to disseminate inaccurate information. This means that where a data subject objects to the indexing of inaccurate information in a search engine’s results, the search engine provider is required to de-link that information without further consideration.
The question is therefore: to what extent, and how, must the data subject prove inaccuracy? First, the Grand Chamber pointed out that there is a distinction between factual assertions and value judgments (i.e. opinions) and that the latter will not fall within the right to removal. The data subject “has to provide only evidence that, in the light of the circumstances of the particular case, can reasonably be required … to try to find in order to establish that manifest inaccuracy”, and a final or interim judicial decision against the publisher of the article is not required.
A search engine provider assessing the freedom of expression exemption in Article 17(3)(a) of the GDPR is not required to actively look for facts “which are not substantiated” by a de-listing request. It is not required to investigate the facts, or to engage with the publisher of the information to establish accuracy. The court was of the view that if a search engine provider were required to do so, it could potentially affect the right to freedom of expression, as the provider might adopt a “quasi-systematic” removal policy in order to avoid the burden of an investigation. The onus is therefore on the data subject to prove inaccuracy.
To meet the threshold, a data subject must provide “relevant and sufficient evidence capable of substantiating [the] request” and of “establishing the manifest inaccuracy of the information found in the referenced content”. This evidence must relate to the main contention that is objected to, not to a minor portion of the contested information. The court equated such evidence with a judicial decision finding that the information “is at least prima facie inaccurate”, and held that if the data subject provides it, the search engine “is required to accede”. If the inaccuracy is “not obvious” from the evidence provided by the data subject, the search engine does not need to comply with the request and, if the information contributes to a debate of public interest, it may place importance on Article 11 of the Charter in its weighing-up exercise.
In the main proceedings, g-net had taken down the photographs but had stated on its website that ‘it is currently impossible to access’ the articles. This led the court to conclude that it could not be ruled out that the photographs could be re-posted in the future and referenced by Google in a search result. Google’s system of referencing places an abbreviated title of an internet page and its web location under each image thumbnail that is listed in its results.
The question referred was whether a search engine must take into account the original context of a publication when deciding whether to de-list the associated thumbnails depicting the image of a data subject. The court stated that the factors raised in the case of Satamedia (listed above) are relevant when considering photographs in this context. Given that the thumbnails also provide a link directly to the information about the data subject, they also “play a decisive role in dissemination of that information” and can significantly interfere with the right to privacy.
Further, a person’s image is considered to be of particular importance, seen to represent a “chief attribute” of a person’s personality and “one of the essential components of personal development” especially in an intimate context. This is a well-established principle – see Von Hannover v Germany (2012) 55 EHRR 15.
The CJEU concluded that a search engine must undertake the balancing of competing rights to decide whether displaying the images is necessary for internet users to exercise their right to access information, and this is a separate consideration from the original article (my emphasis). However, if the search engine operator decides that the original article should be de-listed, then it follows that the thumbnails should be de-listed as well. The court pointed out that the processing that a search engine provider undertakes is distinct from web publishers, and it should bear in mind that the aggregation of images of an individual may result in a “particularly intense” interference with a person’s fundamental rights.
In the case in hand, the court commented that the photographs in the thumbnails provided “little informative value” and that they ought to be removed, notwithstanding that the original article may remain.
While the need to balance the rights of data subjects to privacy and data protection against those of a publisher under Article 17(3)(a) GDPR is nothing new to practitioners, this ruling nevertheless provides useful guidance. The decision is not binding on UK courts; however, it is likely to be reflected upon in any right to be forgotten litigation.
Data subjects relying on inaccuracy need to provide at least prima facie evidence that is relevant to the main contention of the information they seek to de-list. Thumbnails should automatically follow the de-listing of an article, but they are also a separate consideration in themselves. The ruling also suggests that when assessing the publication of photographs on the Internet, not only must the harm, or invasion of privacy, to individuals be considered, but also whether the photographs have any informative value to Internet users that justifies the processing.
If you are interested in any further information or advice, please contact my clerks on 0300 0300 218 or firstname.lastname@example.org