Contact Tracing is a Privacy Nightmare
The risks of gathering huge sets of data on our population cannot be overstated. The benefits, meanwhile, are shaky at best.
In June, the Centers for Disease Control told Congress that the United States needs as many as 100,000 contact tracers to fight the COVID-19 pandemic. Numerous U.S. states and national governments have been turning to digital solutions, including smartphone-based applications for digital contact tracing. While this has been widely promoted as an important digital tool to curb the spread of the virus, evidence of its effectiveness is hard to find. Meanwhile, human rights groups warn of huge privacy risks.
In the rush for solutions, the implications of digital contact tracing have not been adequately considered. In soon-to-be published research for the Mercatus Center at George Mason University, we investigate the consequences of collecting, collating and storing people’s contact-tracing data, and find that it poses significant risks to democratic stability.
During the pandemic, digital contact tracing has most commonly relied on Global Positioning System (GPS) location data or Bluetooth beacon signals to determine whom a person has been in proximity to. There have been significant debates across jurisdictions regarding the best technical design and accompanying implementation.
These debates include whether participation is mandatory or voluntary; whether the data is uploaded to a centralized server, creating a “single point of failure,” or remains decentralized on people’s phones to be deleted as soon as possible; whether the software code is open source, so members of the public can audit and verify it, or closed to private viewing; and whether phone applications and infrastructure, such as data servers, should be issued by the government or by the private sector.
Simply put, the benefits of digital contact tracing are not worth the costs to privacy. Large data sets reveal incredible amounts of information. When this data is created on a population, the potential consequences are severe, including hacking, the de-anonymization of individuals, and mass exploitation by other countries for strategic gain. “Information warfare” through social media or other online practices, including manipulation, misinformation, disinformation, and propaganda, is viewed by governments as a national threat in the digital age.
According to the International Federation of the Red Cross, the unsuitable design or usage of such apps could lead to a number of harmful outcomes for certain populations, including stigmatization, increased vulnerability and fragility, discrimination, persecution, and attacks on their physical and psychological integrity.
When influential individuals or specific social groups are de-anonymized, planned attacks can be coordinated to polarize and fragment whole societies. In this way, the threat of information warfare becomes a threat to democracy itself, through election disruption and the destabilization of democratic processes and institutions in a crisis.
The early contact-tracing applications have not been adequately designed against these threats. Going forward, policy responses to the crisis must do a better job of measuring the risks.
In Norway, for example, contact-tracing data has been used to identify members of the government and military, and human rights groups have condemned the country’s reckless approach to digital data collection. Germany, meanwhile, planned to create a central data server of digital contact-tracing information, and formal documentation states that insider attacks by administrators and state-level adversaries are “outside of scope” for security in the design of the system. Yet a privacy and security analysis found that the data protection and security architecture violated European Union “data minimization” principles by enabling access to more information than required. That means administrators or hackers could persistently trace individuals. Public outcry led Germany to abandon this initial, centralized approach.
Some might be willing to bear the privacy costs if digital contact-tracing applications were making an effective contribution to slowing the spread of COVID-19—but there is no clear evidence that this is the case. For example, Australia’s “COVIDSafe” app has been labeled a $2 million failure: it has not helped to identify additional contacts, despite being downloaded by 25 percent of a population that has recorded around 27,000 cases and that (in certain states) lives under some of the most severe restrictions in the world.
If data must be collected, legal guidelines should include clear and transparent procedures on how it is acquired, processed, stored, accessed and deleted. Digital privacy, however, depends on more than legal accountability measures—it requires technical considerations for “privacy-by-design” that protects the interests of civil society, shares the burden of responsibility between government and populations in crisis, and builds public trust.
Privacy guarantees of this type are crucial to the design and implementation of digital contact tracing. Privacy-by-design offers a framework to consider throughout the entire engineering process and guide system design toward more transparent and accountable outcomes for users. Practically, this could mean choosing a decentralized database architecture so that data is locally stored and erased as soon as possible, along with other technical privacy measures such as various encryption methods and openly publishing software code so it can be read, verified and improved by the community.
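To make the decentralized approach concrete, the sketch below illustrates one common privacy-by-design technique used by decentralized protocols such as DP-3T and the Apple/Google exposure notification framework: a phone keeps a secret key locally and broadcasts only short-lived, unlinkable identifiers derived from it. All names and parameters here (such as `ROTATION_SECONDS` and `derive_ephemeral_id`) are illustrative assumptions, not code from any real contact-tracing app.

```python
import hashlib
import hmac
import os
import time

# Illustrative sketch only: rotating ephemeral IDs in the spirit of
# decentralized contact-tracing protocols. The key never leaves the
# device, and broadcast IDs change frequently so that passive
# observers cannot link them back to one person over time.

ROTATION_SECONDS = 15 * 60  # rotate the broadcast ID every 15 minutes


def new_daily_key() -> bytes:
    """Generate a fresh random key, stored only on the phone."""
    return os.urandom(32)


def current_interval(now: float) -> int:
    """Map a timestamp to a rotation interval number."""
    return int(now // ROTATION_SECONDS)


def derive_ephemeral_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast ID from the secret key and the
    current time interval using HMAC-SHA256, truncated to 16 bytes."""
    msg = interval.to_bytes(8, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]


# The phone broadcasts only the ephemeral ID over Bluetooth; any IDs
# it observes from nearby phones are stored locally and deleted after
# a retention window, rather than uploaded to a central server.
key = new_daily_key()
ephemeral_id = derive_ephemeral_id(key, current_interval(time.time()))
```

Because the derivation is one-way, observed IDs reveal nothing about the key, and because storage stays on the device, there is no central database for administrators or attackers to exploit—precisely the “single point of failure” the centralized designs discussed above create.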
The responses to COVID-19 are a stark reminder of the risks of rapidly deploying nationwide digital initiatives. The rushed deployment of digital tools undermines privacy and public trust in the government’s ability to competently respond to crises. The consequences of overt, population-wide data collection can be unpredictable, uncontainable, far-reaching and permanent.
Kelsie Nabben is a researcher with the Blockchain Innovation Hub and Digital Ethnography Research Centre at Australia’s RMIT University. Jan Carlo Barca is a senior lecturer in software engineering and Internet of Things with Australia’s Deakin University. Marta Poblet is an associate law professor at RMIT.