The paradox of virus contact tracing apps

7 Dec 2021 13:30h - 15:00h

Event report

In setting the scene of the discussion, Ms Jenna Fung (NetMission.Asia) explained the main paradox of virus contact tracing apps: ‘We must either sacrifice our privacy for public health or put our public health at risk to preserve human rights; without such a technology our economic activity and movement continue to be restricted’. The speakers of the workshop exchanged ideas and practices from their respective regions.

Mr Prateek Waghre (Takshashila Institution) presented a viability rating framework for technological interventions to public policy problems like the pandemic. It evaluates a technology’s capacity to complement pandemic management in terms of population penetration, privacy, and effectiveness. In other words, the framework assesses basic qualitative and quantitative parameters as well as the implications tech use has on personal data protection. Waghre also used the term ‘technology theatre’ for instances when public policy and legislation are replaced by procurement processes and nuanced conversations about technical instruments, which is exactly what happens in many states with contact tracing apps. While studying dozens of Indian contact tracing apps, Waghre identified several policy considerations the government should keep in mind: whether the app is expedient, whether it is voluntary or mandatory, and whether it is backed by a strong legal data protection framework and by the policies of platform stores (mainly Google and Apple, with their exposure notification frameworks). He concluded that monitoring public health has high stakes and that tech measures need to be justified.

Mr Elliott Mann (NetThing Australia) shared the Australian experience. The contact tracing apps were developed in two stages. First, the federal government took the Singaporean proximity tracing app based on Bluetooth connections as a model, made special amendments to the privacy laws and, importantly, made the app voluntary to use. The next stage came when COVID-19 cases increased in Australia and led to the creation of QR code tracing apps by each state and territory. QR codes were scanned to record who was at a venue and to generate app alerts when new infections were reported. However, the local QR-based apps were not protected from excessive data use, such as for police investigations. Mann concluded: ‘We are in a situation where the app that has the best privacy protections (proximity tracking) is the one nobody uses, and it has not been proven to work; whereas the apps that do work have no privacy protections at all and are being accessed by other entities’.
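The QR-code check-in mechanism described above can be sketched as a simple in-memory model: venues record who scanned their code, and when an infection is reported, everyone who shared a venue with the infected person is alerted. This is an illustrative sketch only; the class, names, and data shapes are hypothetical, and real state apps store far more metadata (timestamps, contact details) and run on centralised servers.

```python
from collections import defaultdict

class CheckInRegistry:
    """Toy model of QR-code venue check-in tracing (illustrative only)."""

    def __init__(self):
        # venue_id -> set of user_ids who scanned the venue's QR code
        self.visits = defaultdict(set)

    def check_in(self, user_id: str, venue_id: str) -> None:
        self.visits[venue_id].add(user_id)

    def alert_contacts(self, infected_user: str) -> set:
        # Everyone who shared a venue with the infected user gets an alert.
        exposed = set()
        for attendees in self.visits.values():
            if infected_user in attendees:
                exposed |= attendees
        exposed.discard(infected_user)  # don't alert the case themselves
        return exposed

registry = CheckInRegistry()
registry.check_in("alice", "cafe-1")
registry.check_in("bob", "cafe-1")
registry.check_in("carol", "gym-2")
print(registry.alert_contacts("alice"))  # {'bob'}
```

The sketch also makes Mann’s privacy point concrete: unlike decentralised Bluetooth proximity tracing, the venue-level check-in log is a central record of everyone’s movements, which is exactly what other entities (such as police) were able to access.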

Finally, Ms Janaina Costa (ITS Rio) talked about contact tracing apps in Latin America. She evaluated apps in Mexico, Peru, and Colombia – based on the rule of law test, the rights-based test, and the risk-based test – a framework previously used to understand the governance of digital identity systems. The study showed that none of the three apps performed well in all tests. Though there is some broad general data protection legislation, there were no provisions for the anonymisation of data or its deletion in the future. In Brazil, heat maps are used to indicate places where people concentrate, based on telephone operators’ data on the location of mobile devices. However, there are still doubts about how this data is treated and whether it can be traced back to each individual connected. Costa stressed that under the Brazilian data protection law, anonymised data and personal data are different terms, and that anonymised data can be reversed to become personal again. She posed the question: ‘How to create an anonymised database that can strike a balance between usefulness for those who use it and not revealing everyone’s identity?’. Costa concluded that ‘the data processed for the generation of public policies or to fight the pandemic should only be used for this specific purpose. And if it is used for other purposes such as sending advertising or electronic messages afterwards, this use is completely illegal’.
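One common answer to Costa’s question is aggregation with a suppression threshold: device locations are binned into coarse grid cells, and any cell with fewer than K devices is dropped so the heat map cannot single out individuals. The sketch below is a hypothetical illustration of that idea, not the method Brazilian operators actually use; the grid size, threshold, and function names are assumptions.

```python
from collections import Counter

K = 3  # assumed suppression threshold; real deployments choose their own

def heat_map(device_locations, k=K):
    """Aggregate (lat, lon) points into coarse grid cells and suppress
    any cell with fewer than k devices (illustrative sketch only)."""
    cells = Counter(
        (round(lat, 1), round(lon, 1)) for lat, lon in device_locations
    )
    return {cell: n for cell, n in cells.items() if n >= k}

points = [(12.34, 56.78), (12.31, 56.81), (12.29, 56.76), (99.99, 0.01)]
print(heat_map(points))  # {(12.3, 56.8): 3} -- the lone device is suppressed
```

As Costa noted, even such aggregated data is not automatically safe: if cells are too fine or thresholds too low, counts can still be reversed to identify individuals, which is why the legal distinction between anonymised and personal data matters.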

The speakers also discussed whether contact tracing apps should be mandatory, focusing on their penetration and efficacy, and agreed that such apps should not become a way to deny people access or rights in any shape or form.

The workshop continued with discussions in breakout rooms where participants brainstormed issues further: What are the responsibilities of governments, businesses, the technical community, civil society, and the academic and research sector with regard to digital inclusion and respect for human rights? Another question was: how to promote equitable and peaceful societies with digital technologies and prevent their use for harmful purposes?

Some remedies voiced were:

  • Increasing public awareness of digital inclusion and human rights
  • Increasing education and learning between stakeholders
  • Increased transparency in data policies
  • Minimisation of data collection – business needs to think beyond profits

By Ilona Stadnik


Automated summary

Diplo’s AI Lab experiments with automated summaries generated from the IGF sessions. They will complement our traditional reporting. Please let us know if you would like to learn more about this experiment at ai@diplomacy.edu. The automated summary of this session can be found at this link.