Contact tracing is a key mechanism for monitoring the evolution of communicable diseases. For instance, it is routinely used for sexually transmitted diseases: tracing people who may have been infected, and urging them to get tested and take precautions to avoid infecting others. Other applications include tuberculosis, measles, chickenpox, HIV and Ebola.
Contact tracing was also used by many countries as part of the effort to monitor and control the spread of Covid-19. In the UK, as in many other countries, this was initially done manually. Manual contact tracing works this way: a trained health professional interviews the infected person and prompts them to remember where they were in the days or weeks before the diagnosis, and who they interacted with during that period. This process has various weaknesses. The biggest, as far as Covid-19 is concerned, is that it is very slow and that it fails to capture many (if not most) of the contacts potentially exposed to this airborne virus. Hence, the UK government, like many others around the world, introduced a smartphone app to monitor contacts and alert users to potential exposure to the virus.
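As an aside on how such an app works: the UK app, like many others, was eventually built on the decentralised Apple/Google Exposure Notification framework, in which phones broadcast short-lived random identifiers over Bluetooth, record the identifiers they hear nearby, and later check them locally against keys published by users who test positive. The minimal Python sketch below illustrates the idea only; all names and functions in it are hypothetical simplifications, not the real API.

```python
import os
import hashlib

# Illustrative sketch of decentralised exposure notification.
# All names here are hypothetical, not the real Apple/Google API.

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Short-lived identifier broadcast over Bluetooth; it rotates
    every few minutes, so a device cannot be tracked over time."""
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

class Phone:
    def __init__(self):
        self.day_key = os.urandom(16)    # fresh random key each day
        self.heard = set()               # identifiers received nearby

    def broadcast(self, interval: int) -> bytes:
        return rolling_id(self.day_key, interval)

    def listen(self, identifier: bytes) -> None:
        self.heard.add(identifier)

    def check_exposure(self, published_keys, intervals) -> bool:
        """Matching happens on-device: re-derive the identifiers of
        people who tested positive and compare them locally."""
        return any(
            rolling_id(key, i) in self.heard
            for key in published_keys
            for i in intervals
        )

# Two phones pass each other during one broadcast interval.
alice, bob = Phone(), Phone()
bob.listen(alice.broadcast(interval=42))

# Alice tests positive and publishes her day key; Bob's phone
# finds the match locally and would alert him.
print(bob.check_exposure([alice.day_key], range(144)))  # True
```

The design choice worth noting, given what follows, is that no central party, government or tech company, ever receives a contact graph: only the anonymous daily keys of confirmed cases are published, and matching happens on each phone.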
Although up to 56% of the eligible population downloaded the app, this is below the minimum uptake needed for effective control of the disease. Many other countries faced a similar problem.
I have previously mentioned a study which showed that one of the main concerns with contact tracing apps was that the personal data collected in this way could be misused by governments.

This seems to be a generalised concern, according to YouGov data on attitudes towards contact tracing apps across 26 countries. The survey asked: “What is your main reason for NOT wanting to provide contact information?”.
The response options were:
1 I would not want the government to track me
2 I would not want technology companies to track me
3 I cannot be infected with Coronavirus (COVID-19)
4 It does not help the fight against Covid-19
96 Other
99 Don’t know
As can be seen in the graphic below, in most high-income countries people are more willing to trust tech companies with their sensitive data than they are to trust the government. In the graph, brown dots show resistance to downloading the app because of concerns about sharing sensitive data with the government, whereas green dots show the equivalent concern about tech companies.

I found this a very interesting example of the privacy paradox in action. What we are seeing here cannot be explained by concerns about the specific data themselves; otherwise, we would be equally upset by private companies having access to such data. Likewise, these responses cannot be explained by a trade-off between the risk of sharing our sensitive data on the one hand and access to benefits on the other, because there is nothing we gain from the tech companies by giving them access to these data (e.g., about where we have been, whom we met, and which other apps are on our phones). Or is it the perception that our personal data will be safer with the tech companies than with the government?