R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin)
The High Court dismissed an application for judicial review of the decision by South Wales Police to use automated facial recognition technology in a pilot project. The project involved capturing digital images of members of the public and comparing them with biometric data derived from images on a database of ‘persons of interest’. If there was no match, the system immediately and automatically deleted the captured image.
The Court held that, although the use of the technology interfered with individuals’ rights under Article 8 ECHR (the right to respect for private life), the interference was justified, and so no infringement had occurred.
The main points made in the judgment are:
- (Unsurprisingly) the operation of the technology involved the processing of sensitive personal data;
- A data protection impact assessment (DPIA) must be undertaken before facial recognition technology is used, and all steps must be taken to mitigate any high risk to the individuals affected;
- Facial recognition technology must be used transparently, and the public/individuals affected must be informed;
- Data minimisation is important (here, the data was deleted immediately where there was no match); and
- Defining a narrow purpose for the use of such technology makes its deployment easier to justify.
Factors that persuaded the Court that the interference was justified included the clear legal framework governing the police force’s use of facial recognition technology, including codes of practice and police policies. The project had also been transparent, with significant public engagement. Further, there was human intervention in the way the system worked, so that any identification was always reviewed by a police officer.
In respect of the DPIA, the Court set out (at paragraphs 145 to 148 of the judgment) the approach the courts will take where a data controller is accused of failing to fulfil its obligation under section 64 of the Data Protection Act 2018. In assessing whether a data controller has discharged that obligation, the court will consider what compliance requires. The data controller will be expected to “exercise reasonable judgement based on reasonable enquiry and consideration”, but the court will not “necessarily substitute its own view for that of the data controller on all matters”. The court will also have regard to any guidance issued by the ICO.
Whilst this judgment concerns a public body rather than a private organisation, it nevertheless identifies factors that the private sector can incorporate into its approach to using such technology.
The ICO’s particular concern is that a detailed framework of safeguards should be in place before any live facial recognition technology is implemented. In relation to this case, the ICO made a number of recommendations to the police force (which could equally apply to the private sector):
- To carry out a DPIA for each deployment of the technology;
- To submit the DPIA to the ICO for consideration;
- To produce an individualised ‘appropriate policy document’ to cover the deployment that outlines why, where, when and how the technology is being used;
- To ensure that the algorithms within the software do not unfairly discriminate against individuals.
The ICO has published an opinion on the use of live facial recognition technology by law enforcement.
The European Data Protection Supervisor published a blog article on this topic last month and is sceptical about the use of facial recognition technology, doubting whether it can comply with the principles of data minimisation and data protection by design and by default.
If you would like any further information or advice, please contact my clerks on 020 3179 2023 or privacylawbarrister@proton.me.