R (Bridges) v The Chief Constable of South Wales Police [2020] EWCA Civ 1058

This is a review of the case R (on the application of Edward Bridges) v the Chief Constable of South Wales Police [2020] EWCA Civ 1058. It was heard in the Court of Appeal before the Master of the Rolls, Sir Terence Etherton; the President of the Queen’s Bench Division, Dame Victoria Sharp; and Lord Justice Singh. The decision was unanimous.

The Appellant was Edward Bridges, a civil liberties campaigner. The Respondent was the Chief Constable of the South Wales Police (‘South Wales Police’).

The Secretary of State for the Home Department was an interested party. The department is responsible for policing nationwide and gave funding to the South Wales Police to develop automated facial recognition (‘AFR’) technology. There were also two interveners: the Information Commissioner and the Surveillance Camera Commissioner.

The Facts

The South Wales Police is the lead police force in the United Kingdom on testing and conducting trials on AFR. It has a licence to use proprietary AFR software called ‘NeoFace Watch’, which was developed by Northgate Public Services (UK) Ltd.

The AFR software works in stages. Using a database of existing facial images, a ‘watchlist’ is created. These are the faces that the system operator wants to find in a crowd. The images are processed so that particular features are extracted and assigned numerical values to create a unique biometric template. Next, a CCTV camera takes digital pictures of facial images in real time. The software detects human faces from the live feed and isolates individual faces. It then automatically extracts unique facial features from those images and creates a unique biometric template for each face. The software then compares the ‘new’ templates created from the live feed against the existing templates in the watchlist. During the comparison, the software generates a ‘similarity score’, which is a numerical value indicating the likelihood that the faces match. A threshold value is fixed to determine when a match has occurred.
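The judgment does not disclose how NeoFace Watch computes its templates or scores, but the matching stage described above can be illustrated with a minimal, purely hypothetical sketch. Here a ‘biometric template’ is modelled as a short vector of numerical values, the ‘similarity score’ as cosine similarity between two such vectors, and the fixed threshold as a cut-off below which no match is reported (all names and figures are illustrative assumptions, not the software’s actual design):

```python
import math

def similarity_score(template_a, template_b):
    """Hypothetical similarity score: cosine similarity between two
    'biometric template' vectors. Higher means more alike."""
    dot = sum(a * b for a, b in zip(template_a, template_b))
    norm_a = math.sqrt(sum(a * a for a in template_a))
    norm_b = math.sqrt(sum(b * b for b in template_b))
    return dot / (norm_a * norm_b)

def find_matches(live_template, watchlist, threshold=0.95):
    """Compare one template from the live feed against every watchlist
    entry. Returns the identities whose similarity score meets the fixed
    threshold; an empty list means no match, in which case the live
    image and template would be deleted immediately."""
    return [
        identity
        for identity, stored in watchlist.items()
        if similarity_score(live_template, stored) >= threshold
    ]

# Illustrative watchlist: one entry is close to the live face, one is not.
watchlist = {
    "wanted-on-warrant": [0.9, 0.1, 0.4],
    "missing-person": [0.1, 0.8, 0.2],
}
live = [0.88, 0.12, 0.41]
print(find_matches(live, watchlist))  # → ['wanted-on-warrant']
```

The choice of threshold is the operationally significant parameter: set too low, it produces the false positive matches discussed under the PSED ground below; set too high, it produces false negatives that no human review of reported matches can catch.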

The South Wales Police used the software around 50 times between May 2017 and April 2019, mostly at large public events. It refers to this use as ‘AFR Locate’. CCTV cameras were mounted on police vehicles, poles or posts. To inform the public about the use of AFR Locate, the South Wales Police put messages on Facebook and Twitter, displayed large notices on the police vehicles, handed out postcard notices to members of the public and put information on its website.

The watchlists were created from images held on the South Wales Police database, mostly custody photographs. The purpose of the watchlists was to find persons wanted on warrants, those who had escaped from custody, criminal suspects, missing persons or people in need of protection, persons of interest for intelligence purposes and individuals whose presence at a particular event might cause concern. If, during a deployment of AFR Locate, the software identifies a possible match, the two images are reviewed by a police officer and action is taken if necessary. The system can scan 50 faces per second and the total number it can scan is unlimited.

If no match is made when comparing two images, the live scanned image and its associated biometric template are immediately and automatically deleted. The CCTV feed is deleted after 31 days. If a possible match is alerted, the image is deleted either immediately or within 24 hours, and the match report is deleted after 31 days. Newly created watchlist images and the associated biometric templates are deleted either immediately, or within 24 hours of deployment.

Mr Bridges contested two occasions on which AFR Locate was used while he was present: on 21 December 2017 on a busy shopping street in Cardiff, and on 27 March 2018 at a Defence Exhibition. On the first occasion, AFR Locate had used three watchlists, comprising some 900 persons in all. There were 10 possible matches, two of which were assessed not to be true matches. Two arrests were made.

On the second occasion, three watchlists were used to seek out suspects and persons wanted on warrants. One of the watchlists comprised a list of persons who had been arrested at the same event the previous year, including one who had made a false report of a suspected bomb. That person was identified by AFR Locate but no action was taken. No arrests were made.

Even though both these events occurred whilst the Data Protection Act 1998 (‘DPA 1998’) was in effect, the parties requested that the court should consider them under the Data Protection Act 2018 (‘DPA 2018’).

The Grounds of Appeal

There were five grounds of appeal. The Appellant argued that the Divisional Court had erred in reaching the following conclusions:

(1) That the use of AFR technology was in accordance with the law for the purposes of Article 8(2) of the European Convention on Human Rights (‘the Convention’).

(2) That the use of AFR technology was a proportionate interference with Article 8 of the Convention, in particular in failing to consider the cumulative interference with the Article 8 Convention rights of all the people whose facial biometrics were captured during the deployments.

(3) That the Data Protection Impact Assessment (‘DPIA’) for the use of AFR technology complied with section 64 DPA 2018. The appellant argued that the DPIA failed to take into account Article 8 of the Convention, and the processing of biometric data of those persons not on the police watchlists.

(4) That it declined to decide whether the South Wales Police had in place an ‘appropriate policy document’ within the meaning of section 42 DPA 2018, taken with section 35(5) DPA 2018. To comply with the first data protection principle – lawful and fair processing – contained in section 35 DPA 2018, the controller must first have ‘an appropriate policy document’ in place.

(5) That the Respondent had complied with the Public Sector Equality Duty (‘PSED’) in section 149 of the Equality Act 2010, in circumstances where the Respondent’s ‘Equality Impact Assessment’ was obviously inadequate because it failed to recognise the risk of indirect discrimination. The Divisional Court had also failed to appreciate that the PSED is a continuing duty.

The Judgment

Ground 1: ‘In accordance with the law’ for Article 8 of the Convention

The underlying question under this ground is whether there is a sufficient legal framework for the use of AFR Locate. The general principles for evaluating the ‘in accordance with the law’ standard are well established (see Lord Sumption in R (Catt) v Association of Chief Police Officers [2015] UKSC 9 and in Re Gallagher [2019] 2 WLR 509) and can be summarised as follows:

  • The measure in question must have some basis in domestic law and must be compatible with the rule of law, which means it should comply with the requirements of ‘accessibility’ and ‘foreseeability’.[1]
  • To be accessible, the legal basis must be published and comprehensible, and it must be possible to find out what the provisions are.[2]
  • To be foreseeable, it must be possible for a person to foresee the measure’s consequences. It should not confer a discretion so broad that, in practice, its scope depends on the will of those who apply it rather than on the law itself.[3]
  • The law must clarify the scope of the discretion, and the manner of its exercise by competent authorities to guard against arbitrariness.[4]
  • Where the measure is a discretionary power, it need not be over-rigid in its application, but there must be safeguards to guard against ‘overbroad discretion resulting in arbitrary, and thus disproportionate, interference with Convention rights’.[5]
  • The rules governing the scope and application of the measure need not be statutory, provided that they operate within a framework of law with an effective means of enforcing them.[6]

The Divisional Court found that there was a clear and sufficient legal framework governing AFR Locate in the provisions of the DPA 1998 and 2018, the Surveillance Camera Code, and the South Wales Police force’s own policy documents. Those policies, according to the Divisional Court, could be ‘altered and improved’ over the course of the trial of AFR Locate, and this did not indicate that the policy documents were deficient. The Court of Appeal disagreed: ‘We find the references by the Court to the possibility of future reconsideration of this issue a little curious. This is because either an interference is in accordance with the law or it is not… The fact that this case involved the trial of a new technology does not alter the need for any interference with Article 8 rights to be in accordance with the law’.[7]

The Court of Appeal highlighted the significance of the fact that AFR is a novel technology and that it involves the collection of ‘sensitive’ personal data within the meaning of the DPA 2018; it goes beyond the taking of a photograph or the use of CCTV. Whilst stating that a ‘large part’ of the Divisional Court’s analysis was correct, the judgment pointed to some ‘fundamental deficiencies’ described as the ‘who question’ and the ‘where question’.[8] The Court of Appeal was of the view that there was too much discretion left to police officers to determine who was placed on the watchlist, and that there were no clear criteria on where AFR Locate could be deployed.

The DPA 2018 was held to provide an ‘important part of the framework in determining whether the interference with the Appellant’s Article 8 rights was in accordance with the law’, but was not by itself sufficient.[9] The Information Commissioner was of the view that the processing was not based on law within the meaning of section 35(2) DPA 2018, interpreted in accordance with the European Union’s Law Enforcement Directive and the Convention, because there was no legal basis for its application that was clear, precise and foreseeable. However, as this particular point was not raised by the Appellant, the Court of Appeal did not reach any conclusions on this issue.

The Court of Appeal noted the inclusion of facial recognition within parts of the ‘Surveillance Camera Code of Practice’ that was issued by the Secretary of State for the Home Department in June 2013. It was of the view that the problematic ‘who’ and ‘where’ questions could in principle be dealt with in a revised Code. 

On the matter of the South Wales Police force’s local policies, the Court of Appeal evaluated the ‘privacy impact assessment’ under the DPA 1998, the DPIA under the DPA 2018 and its ‘Standard Operating Procedure’. All were found to be lacking in relation to the questions of who could be placed on a watchlist and where the deployment of AFR Locate could take place. The appeal was therefore allowed on this ground.

Ground 2: Proportionate interference

Whether an interference with Article 8(1) of the Convention is proportionate is determined by the four-part test in Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700:

  • Whether the objective of the measure pursued is sufficiently important to justify the limitation of a fundamental right;
  • Whether it is rationally connected to the objective;
  • Whether a less intrusive measure could have been used without unacceptably compromising the objective; and
  • Whether, having regard to these matters and to the severity of the consequences, a fair balance has been struck between the rights of the individual and the interests of the community.

The Appellant argued that the Divisional Court had been wrong in its conclusions on the fourth question in two respects. First, that the ‘benefit’ part of the proportionality balance must take into account both ‘actual’ results (in this case, the number of arrests) and ‘anticipated’ benefits. Second, that the ‘cost’ part of the proportionality balance takes into account not only the particular appellant, but all the members of the public whose Article 8 Convention rights are interfered with by a deployment of AFR Locate. The Court of Appeal rejected this argument on the basis that the interference with any other member of the public was analogous to the situation of the Appellant and that, in any event, there was no cumulative effect such that the impact on Article 8 became ‘weightier’.[10]

Ground 3: Compliance with section 64 DPA 2018

Three criticisms were made of the DPIA:

  • It contained no recognition that AFR technology entails the processing of the personal data of persons not on the watchlist;
  • It did not acknowledge that the Article 8 Convention rights of those persons are engaged; and
  • It did not engage with the issue of the risks to other rights that are likely to be affected by the use of AFR technology, such as the right to freedom of assembly under Article 11 of the Convention and to freedom of expression under Article 10 of the Convention.

The Information Commissioner also criticised the DPIA, highlighting that it had failed to acknowledge that AFR Locate involves the collection of data ‘on a blanket and indiscriminate basis’, had not addressed the potential for gender and racial bias, and had not considered the effects of a false positive match.[11] The Court of Appeal found that some of these criticisms were unjustified, in that the DPIA had acknowledged that AFR technology ‘might be perceived as being privacy intrusive in the use of biometrics and facial recognition’ and that Article 8 of the Convention was relevant.[12] However, it agreed that the DPIA had proceeded on the basis that Article 8 of the Convention was not infringed, and, given the Court’s conclusion under Ground 1 that the use was not ‘in accordance with the law’ for the purposes of Article 8(2) of the Convention, the DPIA did not comply with section 64 DPA 2018.

At paragraph 153, the Court of Appeal concluded: ‘notwithstanding the attempt of the DPIA to grapple with the Article 8 issues, the DPIA failed to properly assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found, as required by section 64(3)(b) and (c) of the DPA 2018’.

Ground 4: Compliance with section 42 DPA 2018

Section 42 DPA 2018 sets out what ‘an appropriate policy document’ should contain for the purposes of processing sensitive data for law enforcement purposes under section 35(5) DPA 2018. The Divisional Court had left open the question whether the South Wales Police ‘November 2018 Policy Document’ fully met the required standard, on the basis that the Information Commissioner had since provided further guidance. The Court of Appeal rejected this ground on the basis that the two events under consideration (in December 2017 and March 2018) took place when the DPA 1998, not the DPA 2018, applied.

Ground 5: The Public Sector Equality Duty ‘PSED’

The terms of the PSED are found in section 149(1) of the Equality Act 2010. In this case, the relevant protected characteristics are race and sex. The Appellant criticised the failure of the South Wales Police to have regard to the scientific evidence that bias can be found in facial recognition software such that there is a greater risk of false identifications for people from Black, Asian and other Minority Ethnic (‘BAME’) backgrounds, and for women. Further, it was argued that the PSED is an ongoing obligation, and that the Respondent was in continuing breach of it. 

The Court of Appeal was of the view that this point raises ‘a serious issue of public concern, which ought to be considered properly by [the Respondent]’.[13] It noted that the relevant legal principles in relation to the PSED were set out by McCombe LJ in R (Bracking) v Secretary of State for Work and Pensions [2013] EWCA Civ 1345 at paragraph [60]:

  • The PSED must be fulfilled before and at the time when a particular policy is being considered;
  • The duty must be exercised in substance, with rigour, and with an open mind. It is not a question of ticking boxes;
  • The duty is non-delegable;
  • The duty is a continuing one;
  • If the relevant material is not available, there will be a duty to acquire it and this will frequently mean that some further consultation with appropriate groups is required; and
  • Provided the court is satisfied that there has been a rigorous consideration of the duty, so that there is a proper appreciation of the potential impact of the decision on equality objectives and the desirability of promoting them, then it is for the decision maker to decide how much weight should be given to the various factors informing the decision. 

The Court of Appeal agreed that the PSED is a ‘duty of process and not outcome’ and ‘helps to reassure members of the public, whatever their race or sex, that their interests have been properly taken into account before policies are formulated or brought into effect’.[14] A public authority is required to proactively consider whether to alter its practices and structures in order to meet its statutory duty to promote equality and eliminate discrimination.[15] This positive duty requires a public authority not to inadvertently overlook information that it should take into account. The Court was critical of the failure of the South Wales Police to independently verify that the NeoFace Watch software algorithm is not biased towards a particular group and said that the fact that AFR Locate was in its trial stage did not change the requirement to discharge its duty. 

Case Comment

The Court of Appeal’s final comment in its judgment was that it ‘hopes’ that any police force wanting to use this novel technology in the future would first satisfy itself that the software does not contain racial or gender bias. The human ‘failsafe’ – that is, the fact that a person checks the result before acting upon it – was deemed insufficient. Further, false positive matches are not the only problem: if bias exists within the software, there is also no way of knowing how many false negative results occur, with or without human intervention at the output stage. Assessing bias independently will, however, be difficult, because it requires the police force (or relevant public authority) to have access to the datasets that were used to train the AFR system. The developers of AFR software are likely to consider such data, and the algorithms applied, to be commercially sensitive information, and they may be reluctant to reveal it.

The judgment also illustrates the level of detail and engagement with privacy issues and Article 8 of the Convention that is required in DPIAs. This is relevant not only for public authorities but also for organisations in the private sector, and requires a detailed assessment of the intrusion on privacy, whether it is necessary, and whether it is proportionate in the context of the processing. The Information Commissioner’s Office has published guidance on the use of artificial intelligence and data protection.

I recently reviewed the use of facial recognition technology, and the current legal challenges across the world in an article that will be released next month in Communications Law Journal.[16]

My book on biometric data and new technologies considers the law in the UK (and to some extent Europe), best practice and recent cases. It is available on Amazon.

If you would like any further information or advice, I can be contacted at: privacylawbarrister@proton.me

[1] Sunday Times v United Kingdom (1979) 2 EHRR 245; Silver v United Kingdom (1983) 5 EHRR 347; Malone v United Kingdom (1984) 7 EHRR 14.

[2] Lord Sumption, Re Gallagher [2019] 2 WLR 509 at [17].

[3] Ibid.

[4] S v United Kingdom (2008) 48 EHRR 50 at [95] and [99].

[5] Lord Hughes in Beghal v Director of Public Prosecutions [2016] AC 88 at [31] and [32].

[6] Lord Sumption in R(Catt) v Association of Chief Police Officers [2015] UKSC 9 at [11].

[7] Paragraph 58 of the judgment.

[8] Paragraphs 85 to 91.

[9] Paragraph 104.

[10] Paragraph 143.

[11] Paragraphs 148 and 149.

[12] Paragraph 151.

[13] Paragraph 172.

[14] Paragraph 176. 

[15] Paragraphs 177 to 178.

[16] ‘Facial Recognition and Detection Technology: Developments and Challenges’, Communications Law Journal (2020) Vol 25, No 3.