Pandemic, Privacy and Economic Resilience
September 3, 2020 | By Cindy Moehring
Virus-tracing software has become a controversial tool in the fight against the spread of the coronavirus, and its success or failure is largely in the hands of the tech industry.
The challenge has multiple layers, but two obstacles stand out, and each connects to ethics. First, the industry has to build an ethically responsible product for virus tracing. Second, it has to earn enough trust with the public to gain widespread adoption of that product.
In theory, the technology will help more people avoid the virus and speed up testing of those who may have contracted it.
The software for tracing the virus comes primarily in the form of an app that’s downloaded to your smartphone. The app tracks those with whom you’ve come into close contact, provided they also have the app. It then can alert you or the other person if one of you is confirmed to test positive for COVID-19. Some apps also warn you if an infected person is nearby or if that person is following social-distancing protocols.
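Many of these apps follow a decentralized token-exchange design: phones broadcast short-lived random identifiers, log the identifiers they hear nearby, and check those logs against tokens published after a confirmed case. The sketch below is purely illustrative (the class, method names, and token format are invented for this example, not taken from any real app), but it captures the basic privacy idea: contact logs stay on the device, and only random tokens are ever shared.

```python
import secrets

class Device:
    """Toy model of one phone in a decentralized tracing scheme.
    Illustrative only; not a real protocol implementation."""

    def __init__(self):
        self.my_tokens = []   # random IDs this phone has broadcast
        self.heard = set()    # IDs overheard from nearby phones

    def broadcast_token(self):
        # Phones periodically broadcast a fresh random identifier;
        # real apps rotate these so the phone can't be tracked.
        token = secrets.token_hex(8)
        self.my_tokens.append(token)
        return token

    def record_contact(self, token):
        # Overheard identifiers are stored locally, not uploaded.
        self.heard.add(token)

def exposed(device, published_positive_tokens):
    # After a confirmed case, that user's tokens are published;
    # each phone checks its own local log for a match.
    return any(t in device.heard for t in published_positive_tokens)

# Two phones meet; later, phone A's user tests positive.
a, b = Device(), Device()
b.record_contact(a.broadcast_token())
print(exposed(b, a.my_tokens))  # True: B was near A
print(exposed(a, b.my_tokens))  # False: B broadcast no token here
```

Note that no names, locations, or health data change hands in this design; the exposure check happens entirely on each user's own device.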
Tracing the Challenges of Tracing Apps
The first challenge tech companies face with these apps, however, isn't whether they work, but whether they can work in an ethically responsible manner.
As this New York Times article points out, concerns about the security of the apps have created a “barrage of criticism from privacy advocates.” Many wonder if the apps’ ability to benefit public health outweighs the software’s threat to individuals’ privacy.
The apps can collect a massive amount of data, including precise locations, users’ health status and social interactions. One fear is that hackers can too easily steal this information and use it for scams or identity theft, or that oppressive governments will use it in ways that have nothing to do with public health.
The second challenge for tech companies is to convince the public that the first challenge was solved and that everyone should download and use the apps.
Like any technology platform, virus-tracing apps need to reach a critical mass of users to work effectively. Imagine you were among only 5,000 people on Facebook, with the other 4,999 scattered around the country and only 10 of them known to you. Virus-tracing software faces the same issue: if you have the app and the infected person next to you in the grocery store doesn’t, it won’t help either of you at all. One research project estimated that at least 60% of the population needs to use the app for it to provide reliable results.
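The 60% figure is more demanding than it first appears, because both people in an encounter must be running the app for the contact to be logged. As a rough back-of-envelope calculation (assuming random mixing, which real populations only approximate), the share of encounters the app can see is approximately the adoption rate squared:

```python
# Both parties in an encounter must run the app for it to be logged,
# so under random mixing, encounter coverage is roughly adoption squared.
for adoption in (0.20, 0.40, 0.60, 0.80):
    coverage = adoption ** 2
    print(f"{adoption:.0%} adoption -> ~{coverage:.0%} of encounters covered")
```

Even at 60% adoption, only about 36% of encounters between random pairs of people would be captured, which is why researchers set the adoption bar so high.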
Adoption is far less likely if the public doesn’t trust the technology. The problem, however, is an overall lack of trust in the major institutions that would be responsible for convincing the public that the tracing apps are safe, secure and won’t be used against them.
The 2020 Edelman Trust Barometer found that business and NGOs are trusted by 58% of people, which it classifies in the range of “neutral” trust. Governments and the media, both at 49%, fall in the distrusted category. Most people – 61% – fear governments don’t know enough about emerging technologies to effectively regulate them.
How can the tech industry help?
One way to earn trust is to prove there is sufficient accountability connected to the technology. A report from the Edmond J. Safra Center for Ethics at Harvard University indicates that tracing apps could maintain trust with the public if an oversight framework were developed and implemented. But a government-sponsored oversight committee alone won’t be enough.
The Edelman study found that when institutions work together, they are more likely to be trusted. And the Harvard report also notes that businesses, in general, need to lend trust and expertise to the public sphere to help move the globe through the pandemic as quickly as possible. That is a necessary precondition to economic resilience.
In the case of virus-tracing apps, the tech industry should be the group that establishes the oversight framework to help ensure privacy and security. The framework should apply to the industry overall, and it should include norms of governance and enforcement around the design and use of the apps.
As mentioned in the Harvard report, the framework should include maximum privacy protection, open-source code for auditing and a prohibition of commercial data usage. Once established, the tech industry should broadly publicize the agreed-upon common framework to ensure trust and encourage adoption.
With so many of the big tech companies under the microscope right now for the way they honor consumers’ privacy rights in other respects, gaining the public’s trust when it comes to tracing apps won’t be easy. But it also presents an opportunity: by operating with transparency and demonstrating that they can create technology that serves the public good in one area without putting it at risk in others, these companies can take significant steps toward rebuilding trust.