The privacy implications of contact tracing

Caroline McCaffery
October 23, 2020

During the bubonic plague, a red cross or ‘X’ was painted on the doors of houses where the plague was present. Whether a warning to the healthy public or a marker for the doctors and morticians, that type of notice is similar to contact tracing today. Now that we are more than two months into the COVID-19 crisis in the US, the shock of an abrupt quarantine has worn off and has become downright tiresome. People want to start moving around again; at least, those in less populated areas feel that way. But the big question is, how do we do it without increasing pressure on our health system?

Regardless of how the state governments are handling it, people will re-enter society differently because each person has a different level of anxiety. I am a person with high anxiety. In fact, I spent the first two weeks of the US crisis reading everything I could get my hands on. My thirst for knowledge consumed me, especially about how to keep my family safe. Now that we are talking about easing restrictions, I again research constantly and have begun arguing for the formation of “trust circles.” A trust circle is a group of friends, all quarantining separately, who then decide to forego physical distancing from each other. For example, a family of four who has not left their house except to go to the grocery store may decide to hang out with another family of four who has behaved similarly. Once you know who has done what and their likelihood of getting sick, and thus getting you sick, you can decide whether or not to let them into your trust circle. And once that person or family is in the circle, you will interact with them on a non-social-distancing basis, i.e., cook them dinner and eat it with them. I believe trust circles will be like concentric circles, slowly growing wider over time.

Much like the aforementioned person-to-person trust circle, businesses will also have to contend with how employees can start trusting one another again. It may start with building that inner trust circle. For example, a small manufacturer can ask the employees on the ground to share their risky behavior to see if they can form a trust circle and work together again. It is more difficult for public-facing businesses, like a restaurant or a hair salon, but a hair salon has regular, trustworthy, known clients and can ask the all-important question: “have you been quarantined for more than a couple of weeks?” In the restaurant industry, any patron may walk in, so a trust circle is just not realistic. State and local governments grapple with larger issues, such as mass transportation and homelessness. In those situations, trusting who you expose yourself to is nearly impossible.

You see, trust circles are really a non-technical, small-scale method of avoiding the disease through contact tracing. Technology is currently being developed to assist in building larger trust circles with contact tracing, but, as usual, technology raises privacy and security questions. Google and Apple have launched a partnership to develop an API that provides decentralized contact tracing at large scale. Application developers can build contact tracing apps on top of the API. By promoting a decentralized system, the initiative is meant to protect privacy and security. An unlikely pair given their competitive positions, Apple and Google are facilitating a privacy-first approach, but that doesn’t mean a developer building on top of the API will continue to build in privacy by design, nor does it mean that an app developed outside of this collaboration ignores privacy and security.
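To make the “decentralized” idea a little more concrete, here is a minimal sketch of how this style of contact tracing can work. This is not the actual Google/Apple protocol; the key sizes, rotation schedule, and function names below are simplifying assumptions on my part. The point is only that the phone can broadcast short-lived random identifiers and later check for exposure entirely on the device, without ever uploading a contact list or location history.

```python
# Illustrative sketch of decentralized exposure matching.
# NOT the Google/Apple Exposure Notification protocol itself; key sizes,
# rotation schedule, and names are simplified assumptions for readability.

import os
import hashlib


def new_daily_key() -> bytes:
    """A random key the phone generates locally each day and never uploads
    unless its owner chooses to report a positive diagnosis."""
    return os.urandom(16)


def rolling_identifiers(daily_key: bytes, intervals: int = 144) -> list:
    """Derive short-lived identifiers from the daily key (say, one per
    10-minute interval). Only these derived values are ever broadcast over
    Bluetooth, so an observer cannot link them back to the key or the person."""
    return [
        hashlib.sha256(daily_key + interval.to_bytes(2, "big")).digest()[:16]
        for interval in range(intervals)
    ]


def exposure_check(heard: set, published_keys: list) -> bool:
    """Runs on the phone: re-derive identifiers from the daily keys that
    diagnosed users voluntarily published, and see whether any were actually
    heard nearby. No location and no contact list leaves the device."""
    for key in published_keys:
        if any(rid in heard for rid in rolling_identifiers(key)):
            return True
    return False


# --- toy walk-through ---
alice_key = new_daily_key()
alice_broadcasts = rolling_identifiers(alice_key)

# Bob's phone overhears a few of Alice's identifiers at the grocery store.
bobs_log = set(alice_broadcasts[10:13])

# Later, Alice tests positive and chooses to publish her daily key.
print(exposure_check(bobs_log, [alice_key]))        # True: Bob gets a local alert
print(exposure_check(bobs_log, [new_daily_key()]))  # False: no match, no alert
```

In this decentralized arrangement, the only thing that ever leaves a phone is the daily key of someone who voluntarily reports a diagnosis; everyone else’s matching happens locally.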

Last week, I attended a conference on privacy and security, and contact tracing apps were a hot topic. In order to determine whether contact tracing apps violate privacy, we first need to determine what information they collect. The purpose of a contact tracing app is to give an individual user early warning of possible COVID-19 exposure. With this purpose in mind, the app needs to be able to record who you came in contact with, for how long, where, and when. It also needs to know whether a user of the app is or becomes sick with COVID-19, and it needs a way to notify you if someone you came into contact with did become sick. It is basically a trust circle, except in this case you are trusting the technology not to use your data for any purpose other than to notify you that you were exposed.
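As a rough illustration of the kind of record such an app has to keep, here is a small sketch. The field names and the 15-minute “significant exposure” threshold are my own assumptions, not any particular app’s schema; some apps deliberately skip location altogether.

```python
# Illustrative encounter record for a contact tracing app.
# Field names and the 15-minute threshold are assumptions, not a real schema.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class Encounter:
    """One logged contact event: who (a pseudonymous ID), when, how long, where."""
    contact_id: str            # pseudonymous identifier of the other device
    start: datetime            # when the contact began
    duration: timedelta        # how long the devices stayed near each other
    location: Optional[str]    # some designs omit location entirely


def significant(e: Encounter, threshold: timedelta = timedelta(minutes=15)) -> bool:
    """A common heuristic: only prolonged proximity counts as a notifiable exposure."""
    return e.duration >= threshold


log = [
    Encounter("device-7f3a", datetime(2020, 5, 4, 9, 30), timedelta(minutes=25), "grocery store"),
    Encounter("device-c2d1", datetime(2020, 5, 4, 12, 5), timedelta(minutes=3), None),
]

# If device-7f3a's owner later reports a diagnosis, only the first
# encounter would trigger a notification under this heuristic.
print([e.contact_id for e in log if significant(e)])  # ['device-7f3a']
```

Even this stripped-down record shows why the privacy question matters: the log is, in effect, a diary of who you were near and when.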

It will come as no surprise to you, the reader, that I am not that trusting. It seems almost certain that most app developers will also send my data to local and federal centers for disease control. Maybe they “anonymize” or “aggregate” the data so that my individual name is not sent and they are just tracking the number of cases, but clearly they will want to know where exactly cases are in order to determine whether there is a local outbreak and prepare hospitals. But I don’t think the data sharing will stop there. Data will be retained for historical purposes, to look back on this pandemic in a few years and analyze it for future pandemics. I think there may be even more data that an app developer can obtain and share, like who wasn’t diagnosed, who was female vs. male, race, income bracket, and time from first symptoms to diagnosis and recovery or death. What about the hospital that treated me if I have to go? If they are tracking my location, they must be able to pull that data too. Going another step further, perhaps the local and federal centers for disease control further share my data with scientists and researchers who are trying to understand the patterns of COVID-19. Privacy laws generally favor that type of sharing, so the app developer certainly would not be wrong for permitting it. But do I care that the original, limited purpose of using the app, so I can know whether I was exposed, now provides all that other information to all those other people?

The reality of the situation is that most of this data is already “out there.” Whether you are checking Worldometer or Johns Hopkins, those sites have to be getting their data from hospitals and testing centers, which likely also have the patient’s name because it was coupled with the credit card receipt or insurance card the patient used. At least HIPAA controls how that data is used.

So, should we just give in to these contact tracing apps? Well, I don’t know. At least when I go to the hospital, I don’t hand over a list of all of my contacts too. The testing center also doesn’t know what subway I take in the morning or whether I eat at Chipotle instead of getting a salad. I am quite sure insurance companies and advertisers would like to get their hands on that kind of individualized data, even if it is only traceable back to the device’s “anonymized ID” rather than the name of the device’s owner.

What I do like about what Google and Apple are promoting is the ability for the user to control the data collection. If an app relies on Bluetooth, a user only needs to toggle Bluetooth off for the app to stop detecting and communicating with nearby devices. Personally, I feel more comfortable when I have more control over the secondary and tertiary uses of my data. If you are anxious, like me, you would toggle it on at the grocery store to make sure no one else who is there at the same time is infected, and then toggle that app’s Bluetooth connection off at home. But that isn’t enough control for me. Suppose I could delete from the app, two weeks later, the fact that I went to the grocery store; add and delete, one by one, the contacts who were in my presence; control which of my data is sent to third parties, or know that only “critical health data” is shared and exactly who those third parties are. Suppose I could feel confident that there are no backup tapes containing my deleted information, that hackers cannot get into the app because it is so tightly controlled, that my data could not be sold without my prior consent, and that I wasn’t signing myself up for a lifelong surveillance tool used by governments and businesses for the rest of eternity. Then I might be more confident about installing a contact tracing app.

I still read a lot about COVID-19. I also have trusted sites that I regularly visit, most of them showing the number of cases and the infection rate. I think that if we had had a contact tracing app at the beginning of the crisis, we may have seen adoption, but with the number of cases slowing, I suspect that most people will not download a contact tracing app now. Without widespread adoption, its efficacy is non-existent. Frankly, I have always been worried that we would fulfill the predictions of Minority Report, where we put our trust in very few people, unwittingly giving them great power, and then that power corrupts them. I can see the potential for that with contact tracing technology. But I also appreciate that COVID-19 has caused most of us to distrust everyone and everything around us, and we need to find a way to trust again, for the sake of humanity. I don’t know. What are you going to do? Download the app? Which one? Or are we destined for a world where we are willing to take more risk with our health than with our privacy?

I do plan to build a trust circle. We have been talking about it with some family members, and our assessment is that the risk is quite low. We know our family members’ patterns, such as wearing masks and gloves, standing 6 feet away from people, and wiping down everything. But I am not sharing my family’s actions with other people. The benefit of the person-to-person trust circle is that I can control who I let in and what data I share. Trust can only be established if there is respect both ways. Similarly, I think technology can help promote societal health and privacy, but I am going to be super careful about which apps I give my trust to, because I need to know they will respect my privacy, too.