Most Americans are currently under a stay-at-home order to mitigate the spread of the novel coronavirus, or COVID-19. But in a matter of days and weeks, some U.S. governors will decide if residents can return to their workplaces, churches, beaches, commercial shopping centers, and other areas deemed non-essential over the last few months.
Re-opening states will require widespread and immediate coronavirus testing, which may not be possible right away because some standard supplies, like testing swabs, remain in short supply. A comprehensive plan for COVID-19 contact tracing, which involves actively tracking and monitoring people potentially exposed to the virus, is also required before resuming some level of normalcy. For contact tracing to be effective, the Centers for Disease Control and Prevention (CDC) recommends that potentially exposed individuals be quarantined, that large cadres of people be deployed as formal contact tracers, and that digital health tools be used to expand the reach and effectiveness of these workers.
This week, Apple and Google paired up to respond to the call for digital contact tracing, which would involve subscribers voluntarily downloading an app. Both companies issued press releases about the partnership, announcing that APIs enabling interoperability with apps from public health authorities would be released first, in May. The next phase of the joint project will involve a Bluetooth-based contact tracing platform allowing more interactions between individuals who opt in and public health authorities. Both companies have asserted that the design does not collect location data, or personal or health data from anyone without a positive COVID-19 diagnosis.
Privacy and bias risks
While it is seemingly clear that widespread contact tracing and surveillance can help identify coronavirus cases and possible hot spots for new and recurring infections, several questions remain. The first is related to the security and anonymity of one’s personal data. Both Apple and Google have proposed that the use of Bluetooth-enabled technology will obscure the personal identities of the infected person and the people in close proximity. However, more discussion is needed on just how anonymous the data is and whether it can be easily de-anonymized, which may discourage individuals from downloading the app. The platform also needs to ensure that the collected location data won’t engender inferences about the infected person and his or her environment, e.g., the use of one’s location as an indicator of neighborhood quality.
Second, who has access to the data also matters. While both Apple and Google have made assurances around their respective companies’ handling of collected data and their intent to stop tracking once the pandemic has ended, what expectations have federal and local public health authorities shared around their data collection and use? How long will the data be retained, and the longer it is kept, what is the risk of its being used for other purposes? In the absence of federal privacy legislation, these are all important considerations.
In a worst-case scenario, communities that exhibit more cases of coronavirus infection could be subjected to geofencing by public health officials, which location tracking can enable to place limitations on the mobility of residents. Some countries are already deploying digital tools for mass surveillance and restrictions on the mobility of their citizens. Because U.S. democracy must walk a finer line between public health and civil liberties, the proposed solution from Google and Apple relies upon advanced cryptography, where randomly generated IDs from devices are distributed through Bluetooth signals to others with the app. Despite these advances in the technology, it is still important to raise transparency concerns with the government agencies and potential third parties that will ultimately be responsible for the results.
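The rotating-random-ID design described above can be illustrated in a few lines of code. This is a simplified sketch, not the actual Apple/Google protocol: the key size, the 15-minute rotation interval, and the function names are assumptions made for illustration. The core idea is that a device's private random key never leaves the phone, while the short-lived IDs it broadcasts cannot be linked to the device, or to each other, without that key.

```python
import hmac
import hashlib
import secrets

def generate_device_key() -> bytes:
    """A private random key that never leaves the device (illustrative)."""
    return secrets.token_bytes(32)

def rolling_id(device_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast ID for one time slot
    (e.g., rotated every ~15 minutes)."""
    msg = b"rolling-id" + interval.to_bytes(8, "big")
    return hmac.new(device_key, msg, hashlib.sha256).digest()[:16]

# A phone broadcasts a different ID each interval; one day of 15-minute
# slots yields 96 unlinkable identifiers.
key = generate_device_key()
day_of_ids = {rolling_id(key, i) for i in range(96)}

# If the key's owner later tests positive and publishes the key, other
# phones can re-derive these IDs and check them against IDs they overheard,
# without anyone ever sharing names or locations.
overheard = rolling_id(key, 42)
assert overheard in day_of_ids
```

Because the IDs are derived with a one-way keyed hash, an observer who records broadcasts cannot recover the device key or predict future IDs; matching only becomes possible if the key is voluntarily published after a positive diagnosis.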
For these reasons, full encryption of collected health information, for those who are infected and for the people with potential exposure, must be the standard. Without the possibility of an enticing “back door” into the app, individuals who opt in to use the service will be better served and protected from potential misuse by government and other companies.
In such uncertain times, the prospect of digital health surveillance will become a plausible supplement for the thousands of physical contact tracers that may be overrun by the demand for their services. Other technology companies are also in discussions with the federal government about leveraging similar resources, like cell phone data or mobile advertising data, to fight the coronavirus. And, certain government agencies are also using cell phone data to monitor the movement of individuals.
Any such use of digital tools should continue to raise legal and ethical questions around privacy to avoid unintended consequences for the people being helped. The conversation about the privacy risks should lend itself to a broader conversation on inequality, especially racial and ethnic profiling and the digital divide.
COVID-19 infections and fatalities have hit majority-Black and brown cities the hardest, where poverty, lack of access to quality health care, dense living situations, and higher rates of pre-existing medical conditions converge. Contact tracing among individuals who live or spend a large portion of their time in these communities will indicate a higher likelihood that the people around them have been exposed to the virus, which could stigmatize rather than support them. Moreover, the recent revelation that the proposed smartphone apps may not work on older devices could inadvertently exclude these same communities, where the cost of technology and access to devices are quite prohibitive.
Engaging in public health surveillance will be critical to reduce current and future outbreaks of COVID-19. But any supplement to traditional practices must be done in ways that ensure security, transparency, and more importantly, equity, especially at a time when the U.S. is expeditiously working to keep the curve of infections flat.
Dr. Nicol Turner Lee is a fellow in the Governance Program’s Center for Technology Innovation at the Brookings Institution. Jordan Roberts is the health care policy analyst at the John Locke Foundation.