news
Global Health Centre
29 April 2020

Contact Tracing Apps: Extra Risks for Women and Marginalized Groups

The COVID-19 lockdown has proven economically devastating, and to enable people to move freely and restart national economies, many governments are exploring digital contact tracing. Mobile phone apps that track individual movements can enable real-time health surveillance and case management. However, once it exists, that data on health and individual movements can pose real threats for everyone—particularly for women and girls, and for marginalized and disfavored groups. Racing to embrace digital contact tracing without putting laws and policies in place to address the stigma surrounding the epidemic, and to protect the rights of those most marginalized, risks undermining the goal of epidemic control.

Groups that have historically experienced discrimination, stigma, and abuse may be exposed to greater risk of persecution as data on coronavirus transmission circulates and is discussed in the public sphere. Much of the public debate over digital contact tracing apps, rapidly unfolding in Europe, has so far focused on individual privacy, and on whether it can be protected with technical solutions. [1] These are important questions, but this focus sidesteps the harder problems linked to widespread stigma and blame. In some countries, states now impose criminal sanctions on transmission of the coronavirus, though UNAIDS warns, based on experience in the HIV epidemic, that criminal sanctions will undermine, not help, the response. [2] Even anonymized data raises the risk of persecution once it circulates in a climate of fear and stigma.

Contact tracing is a classic public health intervention often used successfully in epidemic control; for instance, in the case of Ebola. [3] It involves trained health care personnel interviewing an individual (an “index case” known to have the virus) to identify that person’s movements during the time when that person was infectious, and other people who may have been exposed. The health care worker then directly, and privately, contacts any individuals at risk to encourage them to test and to take steps to prevent onward transmission of the virus. The data is, or should be, protected by strict rules on medical confidentiality; yet despite these, there are well-documented problems of health sector discrimination against people living with HIV. [4] The stubborn persistence of these forms of medical discrimination, more than 30 years into the history of the HIV epidemic, should cause some concern about how such data may fare in the wider public.

Digital apps offer the enticing possibility of doing this contact-tracing work in real time and on a much larger scale. As part of their responses to the COVID-19 outbreak, China, South Korea, and Singapore are using apps that employ various combinations of GPS, Bluetooth (which exchanges signals between two mobile phones within a certain distance range), and wifi signals to locate individuals and alert those who may have been in contact with confirmed COVID-19 cases. [5] In China, authorities have combined artificial intelligence with infrared thermal imaging to detect elevated temperatures, and with facial recognition software to identify individuals at risk; facial recognition cameras are widespread in public places there. [6]

Other countries—including the UK, Switzerland, and members of the European Union—are now moving quickly to develop their own digital contact tracing applications. [7] European countries are favoring Bluetooth-based “handshakes” between mobile devices, with some preferring a centralized approach (in which data is made accessible to health care personnel) and others opting for a decentralized one. Google and Apple have joined forces to build Bluetooth-enabled contact tracing into billions of phones globally; while they have committed to maintaining “strong protections around user privacy”, observers argue it may still be possible to infer the likely source of a given exposure. [8]

The specifics of the technology can significantly shape the risks, but setting these specifics aside for a moment, there are three concerns that arise as we consider how the speed and scale of mobile apps might exacerbate the risks that already inhere in contact tracing.

First, as the epidemic has triggered fear and stigma, as well as new laws criminalizing transmission, if even anonymized data gets into the public domain, members of the public will draw inferences—whether correct or mistaken—in an effort to pinpoint blame. South Korea’s Coronamap website shows the travel histories of anonymous confirmed patients, identifying them only by gender and age; text messages are sent to individuals to warn them of specific contacts and their movements. [9] But as even this limited information has circulated in public, individuals in South Korea have been accused of infidelity, fraud, and sex work, and there have been online witch hunts to track down individuals blamed for spreading the virus. [10]

These kinds of accusations could create serious risks for women and girls in the context of gender inequality and violence against women in many countries. UN human rights experts and the World Health Organization (WHO) have warned that the COVID-19 crisis may exacerbate intimate partner violence, because distancing measures, school closures, and the economic downturn increase family stress while isolating victims from family, friends, and social networks, and hiding abuses from view. [11] General Comment No. 14 on the Right to the Highest Attainable Standard of Health, the authoritative interpretation of the human right to health, calls on states to integrate a gender perspective in health-related policies, planning, programmes, and research. [12] Health and movement information that triggers speculation about women’s activities by their families, partners, or neighbors could expose those women to discrimination and violence.

Similarly, publishing location data about COVID-19 outbreaks, which will now be available on a significantly larger scale thanks to digital contact tracing, may intensify stigmatization and blame towards specific minority groups. The COVID-19 crisis has already heightened xenophobia, racism, and religious tensions. For example, dozens of Africans living in Guangzhou, China have reported being evicted from their homes and subjected to other discriminatory treatment because of a mistaken association with COVID-19 transmission. [13] People of Chinese and other Asian descent, Roma people, and Hispanics have reported violence and verbal abuse linked to fear and anger about the virus. [14] Muslims have been attacked in India after a mass gathering of Tablighi Jamaat was linked to nearly one-third of India’s coronavirus cases. [15]

In South Korea, individual businesses were associated with COVID-19 transmission after they were identified through contact tracing, and some were targeted for extortion. If digital apps begin to accelerate the pinpointing of outbreaks in specific neighborhoods, communities, or religious groups, it could lead to further stigma or even violence against those groups. It is concerns like these that have led WHO to call on states to put in place “special measures…to ensure protection from discrimination and to ensure access to information, social services, health care, social inclusion, and education for vulnerable groups in national COVID-19 responses”. [16]

These problems have already arisen in the global HIV response. Groups whose behaviors are criminalized and stigmatized face increased risks in health surveillance and research: studies that mapped sex workers to target HIV services to them have exposed that population to arrest. [17] As Allan Maleche and I described in this journal, Kenyan key populations also vehemently opposed biometric data-gathering (such as iris scans or fingerprints) out of fears of exposure and arrest. [18] While this has been a longstanding concern in traditional contact tracing, digital apps will take data normally managed by a small number of health care workers, and make it far more widely available for public discussion.

There are groups that will avoid use of contact tracing apps as a result of these risks—the uncounted, who tend to hide from state scrutiny. [19] In many countries, undocumented migrants and others whose identities or behavior place them in legal limbo will likely go to great lengths to avoid having their data captured, for fear of identification, arrest, and expulsion; these same populations are among the most vulnerable to COVID-19, due to crowded housing and lack of access to water and health information. In Singapore, for example, the widely-praised and digitized COVID-19 response was undone in part by a failure to capture and meet the needs of hundreds of migrant workers living in cramped dormitories. [20] Another group that may hide from digital scrutiny is the elderly, who may be reluctant to come forward out of fear of not surviving discriminatory triage in hospitals, or of being placed in nursing homes where there have now been numerous hidden deaths. [21] As I discuss in my forthcoming book, The Uncounted, populations that are criminalized and stigmatized (such as sex workers, people who use drugs, or LGBT people) have often similarly avoided being counted or identified in HIV studies out of fear of arrest. This can create a data paradox in which lack of data results in lack of resource allocation to health services, reinforcing the lack of data. [22]

Overcoming these challenges requires states to build trust through upholding fundamental human rights principles of transparency and accountability. States preparing to embrace digital contact tracing must clearly articulate who owns the data, where that data is stored, for how long, and for what purpose. Who has the right to use this data to make decisions, and who is accountable for algorithmic decision-making? What measures will be taken to protect women and girls, as well as stigmatized groups? Will there be a right of appeal in cases where an individual is wrongly tagged as a COVID-19 case, as happened in China and elsewhere? [23] States may restrict certain rights during an emergency, but under the Siracusa Principles, these must be “provided for by law, strictly necessary, proportionate, of limited duration, and subject to review”, and currently many countries lack even basic data protection laws. [24]

Health data is not neutral: it is embedded in political contexts, can be shaped by politics, and its collection and use in decision-making has life-or-death effects. While we are all desperate for solutions to the COVID-19 crisis, history, including decades of the HIV response, has shown that addressing epidemics requires thoughtful consideration of how neglecting the rights of those most marginalized undermines their participation in the programmes and strategies devised for the health of all.

Written by Sara Meg Davis, and originally published by the Health and Human Rights Journal