Last week an important case on the so-called ‘Right to be Forgotten’ was decided in the High Court. The Claimants, NT1 and NT2, are businessmen who asked Google to remove links to search results that revealed their past convictions. The High Court judgment is lengthy and comprehensive: for those who wish to analyse the case in more detail, be warned, it runs to 65 pages and 230 paragraphs. This article is simply a summary of the pertinent points of the judgment.
The ‘Right to be Forgotten’ emerged from Google Spain SL, Google Inc v Agencia Española de Protección de Datos (AEPD) and Mario Costeja Gonzalez, Case C-131/12 (‘Costeja’) in 2014. Although labelled and officially referred to as ‘the Right to be Forgotten’, the principle does not in fact cause information to be forgotten. If a request is successful, the link to the information is removed (‘de-listed’), making the information much more difficult to find, but the information itself is not removed from the underlying source.
NT1 and NT2
NT1 was involved in a controversial property business in the 1980s and was convicted of conspiracy to evade tax related to that business in the 1990s. He was sentenced to a term of imprisonment. He was accused of – but never tried for – a separate conspiracy that involved deceiving customers. He requested that Google ‘de-list’ search results related to his name that revealed his conviction and the accusation.
NT2 had been involved in a business at the beginning of the decade that was publicly criticized for its environmental practices. Various protestors targeted the business and NT2 received death threats. He authorised an investigations firm to use phone tapping and computer hacking to find the culprits. For this conduct he pleaded guilty to two counts of conspiracy. He served 6 weeks in custody and was released on licence. He requested that Google ‘de-list’ links revealing this information, including interviews he had given after his release.
The Claimants argued that Google’s refusal to de-list the search results contravened six of the data protection principles listed in Schedule 1 of the Data Protection Act 1998 (‘DPA’). In particular, they argued that none of the conditions in Schedules 2 and 3 that would allow Google lawfully to process sensitive data were met, that the personal data was inaccurate, and that it was kept for longer than necessary for the processing purposes.
Google relied on a number of grounds to defend the claim. Initially it argued that its search function is a form of ‘caching’, which is protected under Article 13 of the E-Commerce Directive 2000/31/EC and Regulation 18 of the Electronic Commerce (EC Directive) Regulations 2002 (SI 2002/2013), but it dropped this argument at trial.
It also argued, unsuccessfully, that the claims were an abuse of process: an attempt to use data protection law and misuse of private information to circumvent a claim in defamation. Warby J disagreed, finding that causes of action arise from a particular set of facts and that “it would be wrong to draw too sharp a distinction between the protection of reputation on the one hand and private life on the other”.
Confidentiality, Privacy and Criminal Convictions
Of importance in this case is that, for the purposes of the Rehabilitation of Offenders Act 1974 (‘ROA’), the Claimants’ sentences were ‘spent’ at the time of trial. The ROA’s aim is to rehabilitate offenders who have not been reconvicted of any serious offences, by protecting them from prejudicial treatment related to their convictions. This is subject to Section 8 ROA, which carves out actions in defamation and prevents a claimant from recovering for reputational harm caused by communication of the truth of a conviction.
An issue that arose was whether information about a conviction can be confidential information, or information that can be considered ‘private’ for the purposes of Article 8 ECHR. The courts have so far held (and Warby J did not conclude otherwise) that a conviction, whether spent or not, is not confidential. However, whether a conviction may become an aspect of an individual’s private life, and so fall within the ambit of Article 8, will depend on the amount of time that has passed. In this respect, whether or not the conviction is spent becomes relevant: it is a “weighty factor”, but not determinative, in the balancing exercise applied in deciding upon its disclosure.
Section 70(2) DPA contains a supplementary definition of inaccuracy for the purposes of the Fourth Data Protection Principle: “data are inaccurate if they are incorrect or misleading as to any matter of fact”. The Article 29 Data Protection Working Party (‘A29WP’), in its guidelines on the Costeja case, highlights the difference between a search result that relates to opinion and one that contains factual information. Warby J added another source of guidance to his decision on this issue: the domestic law of defamation.
Just as in a claim for libel, where words cannot be read in isolation from the publication as a whole, so too could it be legitimate to assess factual statements in context when considering inaccuracy for the purposes of the DPA. A parallel was drawn between defamation principles and the A29WP criterion of not giving an “inadequate or misleading impression of an individual”. Warby J also cited the legitimate need to provide coherence in the law.
Google’s Processing and the Journalism Exception
The distinction made in Costeja between processing by an Internet Search Engine (‘ISE’) and other forms of processing was noted. The ubiquitous use of search engines today, and their ability to connect vast amounts of information, creates the potential for substantial intrusion into a person’s private life. In Costeja the Court did not find that ISEs were processing information “solely for the purposes of journalism”.
Warby J agreed. Google’s function as a facilitator of communication cannot be equated to that of journalistic publication. Further, the required subjective element of the test in Section 32 DPA is absent: Google does not assess whether content is in the public interest; it operates an automated system of indexing using computer-based algorithms. Warby J concluded that the only processing condition that could apply was condition 5 in Schedule 3: “the information contained in the personal data has been made public as a result of steps deliberately taken by the data subject”. Following Stephens J’s approach in Townsend v Google Inc & Google UK Ltd [2017] NIQB 81, he concluded that, in line with the principle of open justice, a claimant’s criminal conduct is a ‘deliberate step’.
Regulation (EU) 2016/679 (also known as the General Data Protection Regulation, ‘GDPR’) comes into force on 25 May 2018. It replaces Directive 95/46/EC (the ‘DP Directive’) from which the DPA emanates. Article 17 of the GDPR recognizes the Costeja judgment and incorporates a ‘Right to Erasure’, which can be requested if certain grounds are met. However, Warby J declined to engage with the GDPR, stating “this case is being determined in the twilight of the DP Directive regime, with the first light of the GDPR already visible on the horizon. It seems unlikely to me that my decision will have an impact on other cases”.
There was debate about when the ‘Right to be Forgotten’ test ought to be applied in the structure of the decision. Warby J declined to set out a specific juncture, but in essence subsumed the assessment within the consideration of the data protection principles. He applied the 13 criteria set out by the A29WP in its guidelines to NT1’s and NT2’s specific situations.
NT1 lost on all aspects of his claim. Warby J was critical of NT1 for failing to particularize or provide evidence of harm caused to him by the information. NT1 was found to be evasive and untrustworthy at trial, and much of his evidence was rejected. The fact that NT1 had used his own online postings to try to ‘boost’ his image, while avoiding the fact of his conviction and more, was found to have promoted a false and misleading picture of him. Much of NT1’s concern about the listings centred on his business activities, not his private and family life. Since his conviction he had become a lender to various individuals and companies, and Warby J was of the view that he would be likely to provide an inaccurate account of his credentials should de-listing take place.
NT2, by contrast, succeeded. He was found to be a credible witness who acknowledged his guilt and admitted making “a cataclysmic mistake”. NT2 also has a young family, and the impact on his family life was deemed stronger than in NT1’s case. Warby J concluded that NT2 had gained no direct personal benefit from his crime, which was a conviction for invasion of privacy rather than dishonesty, and that he posed little risk to future businesses or customers. Although NT2 had also used the Internet to promote his image after his conviction, the postings had not made inconsistent claims, and NT2 had publicly addressed his conviction in interviews.
Although NT2 succeeded, he was not awarded compensation or damages, on the basis that Google had taken reasonable care. NT1 appealed against the decision.
See Elliott v Chief Constable of Wiltshire (The Times, 5 December 1996); L v Law Society [2008] EWCA Civ 811.
See R (L) v Comr of Police for the Metropolis (Secretary of State for the Home Dept intervening) [2010] 1 AC 410; R (T) v Chief Constable of Greater Manchester Police [2015] AC 49; Gaughran v Chief Constable for the Police Service of Northern Ireland [2016] AC 345; CG v Facebook Ireland Ltd [2017] EMLR 12; R (P) v Secretary of State for the Home Department [2017] 2 Cr App R 12.
If you would like any further information or advice, I can be contacted at: firstname.lastname@example.org