Susan Landau, a Tufts University professor of cybersecurity and computer science, is the author of People Count, a book on how and why contact tracing apps were built. She also published an essay in Science last week arguing that new technology to support public health should be thoroughly vetted for ways that it might add to unfairness and inequities already embedded in society.

“The pandemic will not be the last humans face,” Landau writes, calling for societies to “use and build tools and supporting health care policy” that will protect people’s rights, health, and safety and enable greater health-care equity.

This interview has been condensed and edited for clarity.

What have we learned since the rollout of covid apps, especially about how they could have worked differently or better? 

The technologists who worked on the apps were really careful about making sure to talk to epidemiologists. What they probably didn’t think about enough was: These apps are going to change who gets notified about being potentially exposed to covid. They are going to change the delivery of [public health] services. That’s the conversation that didn’t happen.

For example, if I received an exposure notification last year, I would call my doctor, who’d say, “I want you to get tested for covid.” Maybe I would isolate myself in my bedroom, and my husband would bring me food. Maybe I wouldn’t go to the supermarket. But other than that, not much would change for me. I don’t drive a bus. I’m not a food service worker. For those people, getting an exposure notification is really different. You need to have social services to help support them, which is something public health knows about. 

In Switzerland, if you get an exposure notification, and if the state says “Yeah, you need to quarantine,” they will ask, “What’s your job? Can you work from home?” And if you say no, the state will come in with some financial support to stay home. That’s putting in social infrastructure to support the exposure notification. Most places did not—the US, for example.

Epidemiologists study how disease spreads. Public health [experts] look at how we take care of people, and they have a different role. 

Are there other ways that the apps could have been designed differently? What would have made them more useful?

I think there’s certainly an argument for having 10% of the apps actually collect location, to be used only for medical purposes to understand the spread of the disease. When I talked to epidemiologists back in May and June 2020, they would say, “But if I can’t tell where it’s spreading, I’m losing what I need to know.” That was a governance decision made by Google and Apple.

There’s also the issue of how efficacious this is. That ties back in with the equity issue. I live in a somewhat rural area, and the closest house to me is several hundred feet away. I’m not going to get a Bluetooth signal from somebody else’s phone that results in an exposure notification. If my bedroom happens to be right against the bedroom of the apartment next door, I could get a whole bunch of exposure notifications if the person next door is ill—the signal can go through wood walls. 
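A rough way to see why these errors happen: Bluetooth-based exposure notification systems generally infer how close two phones were from signal attenuation (the other phone's transmit power minus the received signal strength) and count an exposure when the attenuation stays below a cutoff for long enough. The sketch below is only a simplified illustration of that idea, not the actual Apple/Google implementation; the threshold values, the `Sighting` structure, and the example readings are hypothetical.

```python
# Simplified, hypothetical illustration of attenuation-based exposure scoring.
# Real frameworks use per-device calibration and more elaborate risk weighting;
# the numbers here are made up for illustration only.

from dataclasses import dataclass

@dataclass
class Sighting:
    tx_power_dbm: int      # advertised transmit power of the other phone
    rssi_dbm: int          # received signal strength measured on this phone
    duration_min: float    # how long the signal was observed

ATTENUATION_THRESHOLD_DB = 60   # "close enough" cutoff (hypothetical)
MIN_EXPOSURE_MINUTES = 15       # minimum cumulative time below the cutoff

def attenuation(s: Sighting) -> int:
    """Attenuation = transmit power minus received strength.
    Higher means farther away -- or more material, like a wall, in between."""
    return s.tx_power_dbm - s.rssi_dbm

def would_notify(sightings: list[Sighting]) -> bool:
    """Return True if cumulative 'close' time crosses the exposure bar."""
    close_minutes = sum(
        s.duration_min for s in sightings
        if attenuation(s) <= ATTENUATION_THRESHOLD_DB
    )
    return close_minutes >= MIN_EXPOSURE_MINUTES

# A neighbor's phone on the other side of a wood wall can still show low
# attenuation, so hours of "contact" through the wall trigger a notification...
neighbor_through_wall = [Sighting(tx_power_dbm=-20, rssi_dbm=-75, duration_min=480)]
print(would_notify(neighbor_through_wall))   # True: a likely false positive

# ...while a genuine encounter hundreds of feet away in a rural area never
# registers, because the signal arrives too weak to count as close.
distant_rural_contact = [Sighting(tx_power_dbm=-20, rssi_dbm=-95, duration_min=30)]
print(would_notify(distant_rural_contact))   # False: no notification
```

The underlying limitation is that attenuation alone cannot distinguish open air from an intervening wall, which is why proximity inferred from signal strength can diverge from actual exposure risk.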

Why did privacy become so important to the designers of contact tracing apps? 

Where you’ve been is really revelatory because it shows things like who you’ve been sleeping with, or whether you stop at the bar after work. It shows whether you go to the church on Thursdays at seven but you don’t ever go to the church any other time, and it turns out Alcoholics Anonymous meets at the church then. For human rights workers and journalists, it’s obvious that tracking who they’ve been with is very dangerous, because it exposes their sources. But even for the rest of us, who you spend time with—the proximity of people—is a very private thing.

Other countries use a protocol that includes more location tracking—Singapore, for example.

Singapore said, “We’re not going to use your data for other things.” Then they changed it, and they’re using it for law enforcement purposes. And the app, which started out as voluntary, is now needed to get into office buildings, schools, and so on. There is no choice but for the government to know who you’re spending time with. 

I’m curious about your thoughts on some bigger lessons for building public technology in a crisis.

I work in cybersecurity, and in that field it took us a really long time to understand that there’s a user at the other end, and the user is not an engineer sitting at Sun Microsystems or Google in the security group. It’s your uncle. It’s your kid sister. And you want to have people who understand how people use things. But it’s not something that engineers are trained to do—it’s something that the public health people or the social scientists do, and those people have to be an integral part of the solution. 
