A recent case in the Court of Justice of the European Union (‘CJEU’) has provided guidance on the interpretation of the automated decision-making provisions in the General Data Protection Regulation (‘GDPR’). The ruling is not binding on any UK court, but it indicates the arguments and reasoning that are likely to apply when considering the relationship between data protection and behavioural scores generated by algorithms.
SCHUFA Holding AG is a private German company that provides credit scoring to banks and financial institutions. Credit scoring involves applying mathematical calculations to a person’s data to produce a probability rating or ranking. This rating is used by financial institutions to make decisions on loans and other financial matters. Credit scoring rests on the assumption that a person assigned to a group of other persons with comparable characteristics will behave in a similar way, so that future behaviour can be predicted.
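Purely by way of illustration, the sketch below shows one simplified way such a probability value could be produced: a handful of characteristics are combined with fixed weights and passed through a logistic function to give a score between 0 and 1. The factor names, weights and figures are invented for the example and are not drawn from SCHUFA’s actual methodology.

```python
import math

# Hypothetical factor weights (purely illustrative; not SCHUFA's actual model).
# Each weight expresses how strongly a characteristic shifts the predicted
# probability that a person in a comparable group will repay a loan.
WEIGHTS = {
    "years_at_current_address": 0.08,
    "existing_credit_accounts": -0.15,
    "previous_defaults": -1.20,
    "net_monthly_income_thousands": 0.30,
}
INTERCEPT = 0.5


def credit_score(person: dict) -> float:
    """Combine the person's characteristics with the fixed weights and map
    the result to a probability between 0 and 1 via the logistic function."""
    linear_term = INTERCEPT + sum(
        WEIGHTS[factor] * person.get(factor, 0.0) for factor in WEIGHTS
    )
    return 1.0 / (1.0 + math.exp(-linear_term))


# Example: a hypothetical applicant.
applicant = {
    "years_at_current_address": 3,
    "existing_credit_accounts": 2,
    "previous_defaults": 1,
    "net_monthly_income_thousands": 2.4,
}
print(f"Predicted repayment probability: {credit_score(applicant):.2f}")
```

On this simplified picture, a lending institution receiving the value would typically refuse credit below a chosen threshold, which is the ‘determining role’ the court focuses on in the ruling discussed below.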
In this case the applicant asked SCHUFA to give her information on how her score was calculated, including what data was used and the weighting given to that data in the calculation. SCHUFA refused, on the basis that the information was its intellectual property and a trade secret and was therefore not disclosable. It also argued that it was the financial institutions, and not SCHUFA, that made the ‘decision’, and that the relevant data protection obligations therefore did not apply to it.
The Relevant Provisions in the GDPR
Under Article 22(1), a data subject has “the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” Recital 71 specifically gives the automatic refusal of an online credit application as an example of such a decision.
Article 15(1) gives a data subject the right to obtain information from a controller on the personal data that is being processed, and under subsection (h) to be informed of the existence of automated decision-making, including profiling. This includes “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.”
The Ruling
The CJEU was asked whether Article 22(1) applies to a ‘probability value’ concerning a data subject’s ability to service a loan in the future, where that value is transmitted to a third-party data controller, which then draws on it in deciding whether to enter into a contract with the data subject.
In Article 22(1), there are three cumulative conditions[1]:
- There must be a ‘decision’;
- The decision must be based ‘solely on automated processing’;
- It must produce ‘legal effects’ or ‘similarly significant’ effects.
The concept of a ‘decision’ is not defined in the GDPR but, given the wording of the provision and Recital 71, was found by the court to have a broad scope. The referring court and the First Chamber took the view that the financial institutions receiving the credit score ‘draw strongly’ on it, such that it plays a ‘determining role’ in the final decision on lending. It was noted that the referring court had made a factual finding that a low credit score leads to the refusal of a loan in almost every case. The establishment of the scoring value was therefore found itself to be a ‘decision’ that produces a legal effect on the data subject within the meaning of Article 22(1).[2]
The court concluded that if SCHUFA were to succeed in its argument that the credit score was merely a preparatory act and not a decision, the protection afforded to the data subject by Article 22(1) would be circumvented and a lacuna created in the law. For example, the financial institution would not be able to provide the data subject with the information behind the algorithm under Article 15(1)(h), because it would not hold that information, despite relying on it to make a decision that impacts the data subject.[3] The data subject would also be unable to examine the accuracy of the personal data used to make a decision that significantly affects him or her.
The Advocate General, in his Opinion, considered the conflict with the right of the credit agency to protect its intellectual property, acknowledging SCHUFA’s objection as a valid issue. He concluded, however, that the right to protect trade secrets does not preclude the provision of some information to a data subject ‘so as not to compromise the essence of the right to protection of personal data’ and could not be used to justify an absolute refusal.[4] The algorithm itself does not need to be disclosed, but ‘sufficiently detailed explanations of the method used to calculate the score and the reasons for a certain result’ are expected.[5] In this case, that meant disclosure of the factors taken into account for the credit score, including their respective aggregate weighting, and the reasons for the final value.[6]
The Court also highlighted Recital 71 and the obligation on a controller to implement appropriate procedures and measures to protect a data subject from the potential risks involved in the processing of personal data. These could include mathematical or statistical procedures, and other measures, to ensure that errors and inaccuracies are minimised.[7]
Case Comment
Given the GDPR’s obligations of fairness, transparency and accuracy, and the additional measures in Article 22(1) and Recital 71, the conclusion of the court is not surprising. For public bodies using algorithms to make decisions, there will be other legal issues to consider in addition to data protection compliance, such as Article 6 of the European Convention on Human Rights (the right to a fair trial) and judicial review principles. Depending on the context, private sector organisations may also need to consider consumer fairness obligations, financial regulation and discrimination law.
This judgment involved a clear link between the algorithmic output and the decision being made, albeit a decision made by a third party. Difficulty is likely to arise, however, where an algorithm is used to create a score that has no such clear influence on a subsequent decision, or where it is used in conjunction with other algorithmic outputs.
If you are interested in any further information or advice, please contact my clerks on 0203 179 2023 or privacylawbarrister@proton.me.
[1] Paragraph 43 of the ruling.
[2] Paragraphs 48 to 50 of the ruling.
[3] Paragraphs 61 to 63 of the ruling.
[4] Paragraph 56 of the Opinion.
[5] Paragraph 57 of the Opinion.
[6] Ibid.
[7] Paragraph 66 of the ruling.