The GDPR’s ‘Data Protection By Design and By Default’

The General Data Protection Regulation 2016/679 (‘GDPR’) has introduced the concept of ‘Data Protection by Design and by Default’ (‘DPbDD’) into the data protection framework.[1] ‘Privacy by Design’ is not a new concept; it was formulated by the Information and Privacy Commissioner of Ontario in the 1990s and is based on seven foundational principles.[2]

Whilst Article 25 reflects these principles, the European Data Protection Board (‘EDPB’) distinguishes the concept of ‘Privacy by Design’ from DPbDD. The former ‘encompasses an ethical dimension consistent with the principles and values of the EU Charter of Fundamental Rights’, whilst Article 25 has ‘specific legal obligations’.[3] Whereas ‘Privacy by Design’ was viewed as recommended good practice, DPbDD is now a legal requirement.

DPbDD applies to the implementation of all of the data protection principles, i.e. lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability.[4] Sanctions for failing to implement DPbDD may therefore come hand in hand with a breach of one of the data protection principles.

In October 2019, Greece’s Data Protection Authority fined Hellenic Telecommunications Organization (‘OTE’) after human error meant that recipients of advertising messages from OTE could not unsubscribe, and for further internal procedural failings. OTE was fined not only for breaching the principles of accuracy and the right to object, but also for failing to implement DPbDD effectively in relation to these principles.

It is easiest to consider DPbDD in its two separate aspects: ‘by Design’ and ‘by Default’.

‘By Design’

Article 25 sets out the elements that need to be considered to determine what measures (technical and organisational) should be applied to data processing operations to ensure that Data Subjects’ rights and freedoms are protected. Organisations need to consider the ‘nature, scope, context and purpose’ of their data processing and the risks to data protection that arise. The entire processing lifecycle needs to be assessed: collection, storage, use, outsourcing, development, maintenance, testing, deletion, etc.

In assessing the risks, it is helpful to start by identifying which Data Subjects come within the remit of the organisation’s data processing activities. Other than the individuals who are the first instance customers/users/clients, are there any other people who need to be included? It is also important to note that some groups of individuals may be more susceptible to privacy violations, for example, children.

The next question to ask is who might, intentionally or unintentionally, violate these identified Data Subjects’ privacy. For example: cyber criminals, social media users, data aggregation companies, or other individuals. The organisation itself should be included in this list, and the potential scenarios in which violations could occur should be considered. The likely capabilities of each of these potential violators will need to be assessed in order to design suitable data protection measures.

The measures applied to safeguard against these risks need to be both appropriate and effective. Privacy enhancing technologies (PETs) are a variety of software or hardware techniques that are designed to minimise the privacy risk to individuals, and are usually applied as part of the privacy by design process. Examples are pseudonymisation, obfuscation, and differential privacy.[5]
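To give a flavour of what one of these PETs looks like in practice, the sketch below illustrates pseudonymisation: replacing a direct identifier with a keyed token so that records can still be linked, but the individual cannot be identified without access to a separately held key. This is a simplified, hypothetical example, not a production-ready implementation; the key name and record fields are invented for illustration.

```python
import hmac
import hashlib

# Hypothetical secret key: in a real system this would be generated
# securely and stored separately from the pseudonymised data
# (e.g. in a key management service), per the GDPR's definition of
# pseudonymisation in Article 4(5).
SECRET_KEY = b"replace-with-a-securely-generated-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, deterministic token.

    An HMAC rather than a plain hash is used so that the mapping cannot
    be reproduced (e.g. by hashing a list of known email addresses)
    without access to the secret key.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The same input always yields the same token, so records can still
# be linked for analysis without exposing the underlying identity.
record = {"email": "jane@example.com", "purchase": "subscription"}
safe_record = {**record, "email": pseudonymise(record["email"])}
```

Note that pseudonymised data is still personal data under the GDPR (Recital 26), because re-identification remains possible for whoever holds the key; the technique reduces risk rather than anonymising the data.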

In addition to technological measures, organisational measures will also need to be considered. An organisation must look beyond the specific process/product that they are designing and consider where, within their own structure, risks might occur, such as through employee devices, audit trails and other systems or processes.

The EDPB sets out some key design elements in relation to each of the principles in its ‘Guidelines 4/2019 on Article 25 Data Protection by Design and by Default’.

‘By Default’

This is the second aspect of DPbDD, but technology designers will need to consider it at the outset, during the design process. The default position should be that only personal data that is necessary for its specific purpose is processed. Data protection by default applies to: the amount of personal data collected, the extent of the processing of personal data, the amount of time the data is stored for, and the accessibility of the personal data.
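In software terms, ‘by Default’ means that the most data-protective option must be the one a user gets without taking any action. The sketch below shows, under assumed and invented setting names, what such defaults might look like for a hypothetical user account: sharing and analytics are opt-in, retention is set to the shortest period, and only the minimum data field is collected.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    """Illustrative account settings where every default is the most
    protective option available; users must actively opt in to more
    extensive processing. All field names are hypothetical examples."""
    profile_public: bool = False       # accessibility: not public by default
    analytics_opt_in: bool = False     # extent of processing: opt-in only
    marketing_opt_in: bool = False     # no marketing without intervention
    retention_days: int = 30           # storage limitation: shortest period
    fields_collected: list = field(
        default_factory=lambda: ["email"])  # data minimisation

# A new user who never touches the settings gets the protective defaults.
settings = PrivacySettings()
```

The key design point is that widening any of these settings requires an affirmative act by the Data Subject, which maps onto the ‘individual’s intervention’ language discussed below.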

In the U.K., section 57(5) of the Data Protection Act 2018 specifically states that Data Controllers: “must ensure that, by default, personal data is not made accessible to an indefinite number of people without an individual’s intervention”. By default, there must be a limit on who can access the data (including within the organisation itself) that is compatible with Data Subjects’ rights. A Data Subject must not be presented with options in a way that makes it difficult to abstain from sharing their data, or that makes it difficult for them to limit the amount of their personal data that is processed.

On a practical level, DPbDD is not always an easy concept to apply, and it requires close collaboration between designers and engineers on the one hand and data protection compliance specialists on the other. Smaller organisations may have little involvement in the design process itself if, for example, they have bought an off-the-shelf software application, but they should still make sure that they understand how its data protection measures work and apply to their own business. The EDPB has outlined that if such software has functions that do not provide the necessary privacy protection, these functions must be turned off before engaging with customers/users/clients.

Organisations should also be aware of issues that arise from data aggregation or data scraping by other organisations and will need to look at the wider impact on Data Subjects. The EDPB makes clear that it is the responsibility of the initial Data Controller to prevent the personal data from becoming ‘unduly accessible in the first place’ even though subsequent recipient data controllers are still accountable under data protection laws. It gives the example of a robots exclusion file (‘robots.txt’) as a method that can be used to prevent search engines from crawling a webpage and indexing its data.
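For illustration, a robots exclusion file of the kind the EDPB refers to is a short text file placed at the root of a website. The path below is a hypothetical example:

```
# Ask all crawlers not to index the user-profile area of the site
User-agent: *
Disallow: /profiles/
```

It is worth noting that robots.txt is advisory only: well-behaved search engines honour it, but it does not technically prevent access, so it should be treated as one measure among several rather than a complete safeguard against scraping.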

There is a continuing obligation to maintain DPbDD, which will require regular reviews and assessments of not just the technical and organisational measures implemented, but of their effectiveness as well. Data Controllers need to be transparent about how they assess and demonstrate effective DPbDD in the same way that they demonstrate GDPR compliance.

If you are interested in any further information or advice, please contact my clerks on 020 3179 2023 or privacylawbarrister@proton.me

[1] See Article 25 of the GDPR and Part 3, Chapter 4, Section 57 of the Data Protection Act 2018

[2] https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf

[3] Individuals’ rights are outlined in Articles 12 to 22; individuals’ freedoms are referred to in Recital 4.

[4] See GDPR Article 5.

[5] The European Union Agency for Cybersecurity (ENISA) has a number of publications and resources on PETs: https://www.enisa.europa.eu/topics/data-protection/privacy-enhancing-technologies