Widespread tracking of health data from gig workers creates issues in ethics, equality, and data privacy
New research from the University of Michigan School of Information shows collecting health data from Indian food delivery workers is highly intrusive, has questionable privacy protections, and may not provide anything more than the appearance of protective health measures.
The researchers found that one-sided data collection from delivery workers creates a false perception of how COVID-19 spreads among different class groups, strengthening societal inequities.
The coronavirus pandemic has triggered many changes, especially regarding how society views and tracks health information. COVID has also altered the way small businesses, especially those in the service industry, interact with customers. The biggest change was a switch from in-person service to contactless interactions.
In March 2020, India enacted strict nationwide lockdowns to try to curb the spread of the coronavirus. During the first few weeks of lockdown, the food delivery platforms Zomato and Swiggy reported losses of up to 75% of sales.
To combat fears of transmitting coronavirus, the platforms launched health monitoring and tracking technologies for their delivery workers. “The food delivery platforms announced that they were going to share the live body temperatures of food delivery workers on their platforms so that the customers would feel safe while they're ordering,” says Anubha Singh, UMSI doctoral candidate and lead author of the study.
Each morning, workers had to self-report their temperatures and declare any symptoms of illness. Delivery workers were also asked to take selfies upon request, showing their masks and the company logo on their shirts. When workers started to get vaccinated, that information was on full display as well: platforms began showing digital vaccination records, including workers' unique health IDs and other personal details.
Workers were also subjected to more invasive data collection, including a requirement to use a controversial contact-tracing app called Aarogya Setu, says Singh. The app ran in the background at all times, collecting delivery workers' location data.
“A lot of civil society advocates have raised concerns about privacy with this app,” says Singh, adding that details about how data is managed and protected remain unclear.
Singh and her coauthor Tina Park, head of inclusive research and design at the Partnership on AI, wanted to understand the impact of these health measures and the societal cost to the workers themselves. Singh collected information about new health tracking protocols from Indian newspapers and interviewed delivery workers over several months.
“Since the pandemic restricted travel and in-person interaction, I was able to get a sense of all the claims that these platforms are making from the newspaper articles,” says Singh. This allowed Singh to ask more pointed questions about the platforms’ COVID-related policies.
The researchers found that while health monitoring technologies were deployed in response to public health guidelines, the measures were more a “performance of care” than actual health protections. For example, temperatures don’t always indicate infection, nor does a mid-shift selfie of a delivery worker.
“While the food delivery workers were required to abide by all of these rules that were put in place by these companies, customers on the other hand were not required to do anything,” Singh says. “It was a very one-way narrative of disease spread endorsed by technology.”
She adds that while news outlets and the tech community celebrated health measures for contactless deliveries, their investigation revealed a bigger picture. “Our paper asks us to dig deeper into some of these ‘good feelings,’” Singh says. “Oftentimes, we don't recognize that the production of these good feelings is based on some kind of violence to a particular group of people or community.”
“I think a very important contribution of our work is to critically interrogate things that look very equitable — or in this case, for the good of public health,” says Singh. She says once they scratched the surface of these health monitoring efforts, the real power imbalance emerged.
“The reality of India’s gig work is marked by a caste and class divide,” she says. Singh adds that this type of unequal health monitoring requirement, endorsed by supposedly politically neutral technological systems, “puts hegemonic relationships between people back into place.”
The researchers note their study is an example of how important it is to critically examine the promises of technological systems. They suggest when designing human-computer interaction technologies, understanding the underlying human costs is necessary to promote fairness, accountability and transparency in the system.
— Sarah Derouin, UMSI public relations specialist