In the U.K., and elsewhere in the world, the burning question at the moment is how we move forward in this pandemic, not just from a health perspective, but from an economic perspective as well. We all want to know what the so-called ‘exit strategy’ will be. One of the possible facets of the strategy is the use of mobile phone apps to ‘track’ and to ‘trace’ Covid-19 infected persons and their contacts, which proponents argue will give the public confidence in the lockdown-easing measures and prevent further ‘waves’ of the outbreak.
The apps being proposed use Bluetooth Low Energy technology as opposed to GPS; this is the first time such technology has been used for contact tracing. When Covid-19 first appeared, many countries adopted contact tracing, using a manual process combined with GPS location data, to identify infected persons, track their interactions and find other people they could have infected. However, at this point in the pandemic we are beyond prevention. The use of Covid-19 tracing apps is part of an overall strategy to keep the R transmission number, the effective reproduction rate of the virus, below 1.
The basic idea of Covid-19 tracing apps is that proximity rather than location is observed. The app emits a code, or message, and logs each time it comes into contact with another phone using this app, which also emits a code or message. The phones essentially ‘speak’ to each other in what would look like gibberish to a person. There is no directly identifying information such as a name or location being transmitted between the phones. The app records which devices have been in contact with each other, the strength of the Bluetooth signals (i.e. how close the phones were to each other) and the duration of the signals (i.e. how long the phones were close to each other).
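The logging described above can be sketched in a few lines. This is an illustrative model only, not any real app’s implementation: each phone broadcasts a short-lived random identifier and records the identifiers it hears, together with the signal strength and the duration of the contact.

```python
import hashlib
import os
from dataclasses import dataclass

def new_ephemeral_id() -> str:
    """A random, rotating identifier that reveals nothing about its owner."""
    return hashlib.sha256(os.urandom(16)).hexdigest()[:16]

@dataclass
class ContactRecord:
    heard_id: str        # the other phone's ephemeral identifier
    rssi_dbm: int        # Bluetooth signal strength (stronger means closer)
    duration_min: float  # how long the phones stayed in range

class Phone:
    def __init__(self) -> None:
        self.my_id = new_ephemeral_id()
        self.contact_log: list[ContactRecord] = []

    def hears(self, other: "Phone", rssi_dbm: int, duration_min: float) -> None:
        # Only the pseudonymous identifier is logged: no name, no location.
        self.contact_log.append(ContactRecord(other.my_id, rssi_dbm, duration_min))

alice, bob = Phone(), Phone()
alice.hears(bob, rssi_dbm=-60, duration_min=20.0)
```

Note that the log contains only the ‘gibberish’ identifiers; linking an identifier back to a person requires information held elsewhere in the system.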
What happens next has led to much debate about the privacy risks and effectiveness of the concept. The main distinction is where the information is stored: either in a decentralised or centralised system. In a decentralised system, an app user (person A) tells the app they are unwell and other users of the app who came into contact with person A are informed. Person A will receive notifications from the public health provider about what to do next, but the public health provider does not know anything else about person A, or about the other users that came into contact with person A. The information is all on the app users’ phones.
In a centralised system, person A reports their symptoms, and the contacts logged by their phone, through the app to the public health provider. The public health provider then makes a risk assessment to decide which of those contacts should be notified. In this system the information is held both on the app users’ phones and in a database created by the public health authorities. There are clear benefits to a centralised system, as the public health authorities will be able to analyse the data to identify trends. But the privacy risks associated with a centralised system are greater, in particular because the authorities may later use the data for different purposes (so-called ‘mission creep’).
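The decentralised matching step can be illustrated as follows. In this sketch (the function and identifier names are hypothetical), the infected user publishes the identifiers their phone broadcast, and every phone checks its own local log; no central party learns who matched.

```python
def check_exposure(local_contact_log: list[str], published_ids: set[str]) -> list[str]:
    """Runs entirely on the user's phone: compares the local log of heard
    identifiers against the list published by infected users."""
    return [heard for heard in local_contact_log if heard in published_ids]

# Person A's phone broadcast these identifiers, then A reported sick;
# in practice the list would be downloaded from a public server.
published = {"id-a1", "id-a2", "id-a3"}

# Person B's phone logged these identifiers locally:
b_log = ["id-x9", "id-a2", "id-q4"]

matches = check_exposure(b_log, published)
# One identifier matches, so B's phone shows an exposure notification locally.
```

In a centralised variant, the contact list would instead be uploaded to the authority’s database, where the risk assessment is performed.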
The U.K. has developed its own centralised tracing app, NHS COVID-19, which it began testing on the Isle of Wight from 4 May 2020; the Data Protection Impact Assessment for the app is available here. The U.K. is one of the few countries globally to have opted for a centralised system. Privacy International has performed a technical static analysis of NHS COVID-19 and summarised its findings here.
There are a number of problems with the use of Covid-19 tracing apps, which I will summarise in turn.
- Privacy

It is important to understand that every system utilised by Covid-19 tracing apps, centralised or decentralised, carries a privacy risk. Covid-19 tracing apps all involve the processing of pseudonymised data. Whilst many use the term ‘anonymised’, this is not accurate in legal terms, in Europe at least: under the GDPR, for personal data to be anonymised, it must not be possible to identify the data subject from the data.
Where personal data is stripped of identification characteristics, it is not anonymised, but pseudonymised, because there is always the possibility that the individual can be re-identified. There are, broadly speaking, two likely possibilities for identification: by a malicious actor who may for example want to either steal health data or disrupt the system, or by the authorities themselves. For an analysis of the types of re-identification that could occur see here.
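The distinction between pseudonymisation and anonymisation can be made concrete with a toy example. Here a four-digit patient number (hypothetical, chosen for brevity) is replaced by a hash; because the space of possible numbers is small, anyone holding the record can recover the original by brute force, so the data subject remains identifiable.

```python
import hashlib

def pseudonymise(patient_number: str) -> str:
    """Replace the identifier with a hash: pseudonymisation, not anonymisation."""
    return hashlib.sha256(patient_number.encode()).hexdigest()

record = {"pseudonym": pseudonymise("0412"), "test_result": "positive"}

# Re-identification by exhaustive search over the small identifier space:
recovered = next(
    n for n in (f"{i:04d}" for i in range(10_000))
    if pseudonymise(n) == record["pseudonym"]
)
```

Real identifiers are longer, but the principle stands: so long as a mapping back to the individual remains feasible, the data is pseudonymous rather than anonymous.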
In any event, at some point in the use of a Covid-19 tracing app, a person is likely to be identified, unless the unwell person decides not to self-report or seek assistance. More likely, the app user will want a test for Covid-19, which they will need to obtain from a public health provider. Even if a code is used in the process of arranging the test and communicating the result, the code will still be linked to the person’s actual identity somewhere in the system. This is necessary to ensure that reports to the app are not fraudulent, and to collect data for public health decisions and research.
Beyond these specific privacy concerns there is a broader issue: the introduction of a system of mass proximity surveillance without any parliamentary debate over its possible long-term societal impact, or the consequences of future uses of the data. The public concern is justified given that the U.K. does not have a good track record when it comes to health data in centralised systems and data sharing with the private sector. In 2017, the Information Commissioner’s Office (‘ICO’) investigated the Royal Free NHS Foundation Trust for sharing personal data of 1.6 million patients with Google DeepMind. The ICO concluded that there was no proper legal basis for doing so, and that patients had not been adequately informed.
- Security

As with the privacy risks, there are also security risks in the use of Covid-19 tracing apps. Malicious actors could try to create false alerts, harass specific individuals, or carry out so-called ‘sybil’ attacks, where a large number of fake identities are created to gain disproportionate influence in the system. They could also attempt to disrupt the connection between phones using radio-wave interference (known as ‘radio jamming’). There could also be attempts to access users’ phones through the open Bluetooth channel, but this risk exists with any use of Bluetooth, for example when using wireless earphones.
More concerning is the possibility of a security breach that leads to the loss of personal data at the ‘back end’ of the system: in essence, the database where the identifying information is kept after it is reported to a centralised system. Security breaches occur in many different circumstances, not just where a malicious actor hacks into a system. One of the Covid-19 tracing apps being developed has already experienced a data breach. Last month, the Belgian app ‘Covid-19 Alert!’ leaked 200 names, email addresses and encrypted passwords after a mistake was made in publishing the source code online.
- Technical issues
Apple and Google’s operating systems do not allow Bluetooth technology to run constantly in the background while the phone is locked. The two companies have jointly developed application programming interfaces (‘APIs’) and operating system-level technology to allow Covid-19 tracing apps to function on their phones despite this restriction, and to allow different smartphones to communicate with each other. The combined initiative does not support centralised Covid-19 tracing apps and restricts such apps from accessing location services.
Privacy International’s technical static analysis of the NHS Covid-19 tracing app found that it only works on particular types of modern smartphone, so anyone in the U.K. with an older or less expensive model will not be able to use it. Furthermore, it is not possible to run any other app on the phone at the same time as the NHS Covid-19 tracing app. For people who need to use another app for work purposes, such as taxi drivers, food delivery couriers and postal workers, it will not be possible to keep the tracing app open on their phones unless they are provided with a separate device for work.
- Effectiveness

It is not clear whether Covid-19 tracing apps will be effective in the context in which they are being used. Estimates of the percentage of the population that would need to download and use the app for it to have any impact range from 60% to 80%. Bluetooth can also work through glass and through some walls, meaning that if you live in an apartment block, or work in an office building, your phone may connect with an infected user with whom you never came into close enough proximity to contract the virus.
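The through-wall problem follows from how proximity is estimated. Apps infer distance from received signal strength (RSSI), typically using a log-distance path-loss model; the sketch below uses illustrative constants (real handsets need per-model calibration) to show that a neighbour half a metre away behind a wall can produce the same signal strength as a person two metres away in the same room.

```python
import math

def rssi_dbm(distance_m: float, wall_loss_db: float = 0.0,
             rssi_at_1m: float = -59.0, path_loss_exponent: float = 2.0) -> float:
    """Log-distance path-loss model: RSSI falls off with log10(distance),
    and any wall in the path subtracts a further fixed attenuation."""
    return rssi_at_1m - 10 * path_loss_exponent * math.log10(distance_m) - wall_loss_db

same_room = rssi_dbm(2.0)                      # real exposure in the same room
neighbour = rssi_dbm(0.5, wall_loss_db=12.0)   # adjacent flat, through a wall
# Both come out at roughly -65 dBm: the app cannot tell the two apart.
```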
The assessment parameters for the likelihood of transmission of the virus will need to be examined, as there may be scenarios where the likelihood of transmission is higher than the Covid-19 tracing app will detect. For example, the app may trigger a notification for a person who has spent 30 minutes sitting two metres away from someone who is infected but wearing a mask in a café, but not for an infected person who kisses another person for a minute and then leaves. Likewise, it may not trigger a notification where two people are walking down the street, one behind the other and 5 metres apart, but where the person ahead sneezes and crosses the road, after unfortunately infecting the person behind through the ‘aerosol’ effect.
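A simple notification rule of the kind described above might look like the following (the thresholds are hypothetical, not those of any real app). It captures the café scenario but misses the brief high-risk contact, because the app sees only distance and duration:

```python
# Hypothetical thresholds: notify on contacts closer than 2 m lasting 15+ minutes.
DISTANCE_THRESHOLD_M = 2.0
DURATION_THRESHOLD_MIN = 15.0

def should_notify(distance_m: float, duration_min: float) -> bool:
    """The app knows only estimated distance and duration; it knows nothing
    about masks, ventilation, or the nature of the contact."""
    return (distance_m <= DISTANCE_THRESHOLD_M
            and duration_min >= DURATION_THRESHOLD_MIN)

cafe = should_notify(distance_m=2.0, duration_min=30.0)  # notified, despite the mask
kiss = should_notify(distance_m=0.1, duration_min=1.0)   # missed, despite the risk
```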
This type of contact tracing has never been done before and, to some extent, we will not know how effective it can be until it has been in use for a period of time and the impact can be observed. An initial modelling study that simulated the impact of a range of government measures (testing, isolation, tracing and physical distancing) found that contact tracing had a significantly greater impact on reducing the R number than testing alone.
- Public Policy
Beyond the identification of individuals, other aspects of the use of these apps are a cause for concern. The government policies that will accompany the use of the app have not been made clear. For example, will it be used to prevent outbreaks? Or to make sure that individuals are abiding by quarantine rules? Will it contribute to some sort of “immunity certificate”? The U.K. government has said that use of the app will not be made mandatory; making it compulsory would be at odds with the recommendations of the ICO and the European Data Protection Board (‘EDPB’).
Another important question is how companies will respond to Covid-19 tracing apps. Employers might be tempted to insist that employees download the apps as part of their health and safety measures, and to protect themselves from liability, although it will be difficult to justify making this mandatory if the government has made use of the tracing app voluntary. Businesses in sectors where services are provided in close proximity, such as gyms, restaurants and transportation, might insist that customers use a Covid-19 tracing app as a condition of service. The EDPB has stated in its general legal analysis of these tracing apps that people who do not want to use them, or are unable to use them, should not be disadvantaged.
The deployment of Covid-19 tracing apps will introduce widespread monitoring of contact between people on a scale never seen before. How this pandemic will unfold and what role Covid-19 tracing apps will play in the outcome is as yet uncertain.
If you would like any further information or advice, I can be contacted at: email@example.com
 An R of 1 means that, on average, each infected person infects one other person, so the rate of new infections remains constant. An R of less than 1 means the spread of the infection is decreasing.
 Although some of the apps, as an additional feature, allow location to be recorded.
 France, Japan and Australia are also developing apps that are based on a centralised system.
 See GDPR Recital 26.
 For example, a malicious attempt to disrupt the system, or to obtain a Covid-19 test to sell.
 The UK Joint Committee on Human Rights discusses the issues in its report ‘Human Rights and the Government’s Response to Covid-19: Digital Contact Tracing’ 6 May 2020: https://publications.parliament.uk/pa/jt5801/jtselect/jtrights/343/343.pdf
 Guidelines 04/2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak, adopted on 21 April 2020: https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_20200420_contact_tracing_covid_with_annex_en.pdf