Jeffrey Warren of Public Lab explains how a new geolocation system could change the playing field for environmental and human rights activists.
Digital Impact 4Q4: Jeffrey Warren on Geolocation Privacy
00:00 CHRIS DELATORRE: This is Digital Impact 4Q4, I’m Chris Delatorre. Today’s four questions are for Jeffrey Warren, Co-founder of Public Lab, a nonprofit network of citizen scientists dedicated to creating awareness and accountability around environmental concerns. With funding from a 2018 Digital Impact grant, the Public Lab team is developing a set of management tools for privacy-first, user-friendly data input, archiving, and sharing among social sector organizations. The Community Data Privacy Toolkit, or CDPT, would mitigate the abuse and misuse of location data, a growing challenge that puts activists and the communities they serve at risk.
00:51 CHRIS DELATORRE: Jeff, this is a topic most of us might be too intimidated to explore on our own. But location tracking affects everyone with a mobile phone—through Google Maps and apps for social networking, banking, and weather, to name a few. If an invisible “someone” knows where we are at every moment, how can we better protect ourselves? What weaknesses should we be aware of?
01:16 JEFF WARREN: Well, I think location privacy is something that has been increasingly in people’s awareness. The New York Times has done a couple of great pieces where they’ve used, I think, either leaked or court-ordered data to show exactly what data points tech companies have on our location on a day-to-day or even minute-by-minute basis. And it is scary to see how they can pinpoint what building you’re in, how long you’re there, the route you drove, and where you stopped. There certainly are promises of responsible use, but those are maybe more comforting for some people than for others.
I think when it comes down to it, there are two parts to it. One is that we need to demand better transparency. We’ve gotten a few more tools now, like recent versions of Android, where you can set location access with more granularity per app. But beyond putting your phone in a metal wallet or something—and I’m getting a little paranoid in that respect—you don’t have a lot of control. So I think the real responsibility is on the people who are designing systems, and that’s why with this project we really sought to come up with a framework. We’re developing specific tools, but what we’re really trying to do is develop a set of norms around location privacy, and get people thinking about the privacy of their location as opposed to just their Social Security number.
The CDPT includes tools to produce and manage semi-anonymous personal data and geodata and to view and display such data, many of which are key to protecting environmental and human rights advocates.
I think people tend to think of location privacy as all or nothing, on or off. And that’s literally how apps are built, for the most part. I think the only granularity you get now on Android is whether an app can access your location only while it’s open, or anytime it wants. Whereas what we’ve come up with is a zoom-level-based granularity, so you can share a more specific or a less specific location. And that dimension, I think, is a really powerful way to think about location privacy, because it shouldn’t be an all-or-nothing debate.
03:41 CHRIS DELATORRE: Josh Levinger’s work in Gaza while at the MIT Center for Civic Media is one example of how blurring location data can keep individuals safe. But traditionally vulnerable populations aren’t the only ones at risk. Let’s say you’re organizing around an environmental catastrophe—like an oil spill—or responding to a human rights crisis in a dangerous part of the world. What is it that makes location blurring so effective for regional activist networks—for human rights defenders and environmental activists who put themselves in harm’s way?
04:17 JEFF WARREN: You’re totally right. Public Lab is an environmental organization. We work with environmental justice groups on pollution, on issues that put people’s health and wellbeing at risk. So the privacy element is really important, but environmental catastrophes are fundamentally spatial. You could be upstream or downstream from something. You could be upwind or downwind of a plume of smoke. You might stand in some relationship to something else in a given watershed, such that groundwater flows across a site you’re worried about. Those sorts of spatial distinctions are really important to be able to communicate and organize around.
And so what we really wanted to do was create a vocabulary around the sharing of partial location. Which is to say: I’m not going to tell you exactly where I am, but I’ll give you a general sense that I’m in a neighborhood downstream from this other place, or something like that. People need to make spatial arguments in their advocacy work in order to hold polluters accountable and to record that kind of harm. The variable-location or blurred-location approach should feel kind of familiar. If you blur someone’s face, you can tell it’s a person, and you can see something general about them without seeing exactly who they are. Likewise, if you obfuscate data, you might round it so you’re not giving such a precise location. And that rounding is exactly what we’re doing with blurred location: we’re rounding the coordinates.
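To make that concrete, here is a minimal sketch of blurring-by-truncation in TypeScript. The function names are illustrative rather than Public Lab’s actual API, and truncation toward zero stands in for whichever rounding rule an implementation chooses.

```typescript
// Truncate a coordinate to `precision` decimal places (toward zero), so the
// blurred value never implies more accuracy than the user chose to share.
function blurCoordinate(value: number, precision: number): number {
  const factor = 10 ** precision;
  return Math.trunc(value * factor) / factor;
}

// Blur a latitude/longitude pair to the same precision level.
function blurLocation(lat: number, lon: number, precision: number): [number, number] {
  return [blurCoordinate(lat, precision), blurCoordinate(lon, precision)];
}

// A point in New Orleans, shared at decreasing precision:
console.log(blurLocation(29.951065, -90.071533, 3)); // [29.951, -90.071]  ~ block level
console.log(blurLocation(29.951065, -90.071533, 1)); // [29.9, -90]        ~ city level
```

Dropping a decimal place widens the blur by a factor of ten, which is what gives the zoom-level feel Warren describes: the shared coordinates themselves announce how precise they are.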
06:10 CHRIS DELATORRE: Public Lab’s approach to project development emphasizes community engagement, specifically through the use of open source code. But it’s creating a seamless user experience that’s been particularly challenging for you. You’ve explored a number of existing options, from postal codes to Airbnb’s “in this area” model. What benefits have you found from using a community-based model to propel this technology forward?
06:39 JEFF WARREN: Well, what we’ve found is that there’s enormous energy from what is typically a younger and more diverse community of people around the world who are learning how to code, and more than that, they’re setting new norms for how to treat one another. They’re really flying in the face of the status quo, in fact. And by creating spaces that are really welcoming and mutually supportive, we’ve been really lucky to work with hundreds of new programmers to build systems like this one.
But to your question, I think it goes deeper than just being more diverse and inclusive. It really has ramifications for how code is written and how systems are built. By contrast, you can look at, say, facial recognition or other systems that are envisioned and developed primarily by the technology community, which is largely white, largely male, and concentrated in places like the Bay Area. There’s a failure there to understand really critical issues around facial recognition, issues of privacy, of vulnerability, of rights, that just aren’t front-of-mind for that group of people in the way they are for other demographics.
The New York Times: “Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret.”
Tech companies use smartphone locations to help third parties, including advertisers. They say it’s anonymous but the data shows how personal it is.
So, by inviting people in through an open source model, where anyone can learn the skills they need to contribute and is supported and welcomed, we get a much more diverse group of contributors working on this code, and that informs the actual work. They’re not just executing work that we’ve written down line by line. They’re bringing new ideas and new perspectives to the work, and I think that really does play into how the systems we build work, how they’re designed, how they look. I would note that there are real challenges to extending that into the space of design, because the open source model is primarily one of writing code, whereas a lot of what we’re dealing with here is design—interface design or user experience design—that has to be very smooth, effective, and intuitive.
So, I think we’re still learning how to do a community-based model for design work. But I think it’s great that we’re trying, and we’ve made a lot of progress on the code side to start with and to build from.
08:54 CHRIS DELATORRE: Humanizing geospatial technology seems to be the biggest challenge here: going beyond the science community and sharing real-world scenarios where it’s useful. Here’s how you put it on your website:
We’re interested in tools that can offer people in online spaces the ability to organize, coordinate, and communicate in regional scopes, while placing the decision of how precisely to share location in the hands of those whose privacy is at stake.
How can we communicate the importance of local control of geospatial data, both to technologists and the data science community and to people who want to protect their privacy?
09:37 JEFF WARREN: Well, I think it’s a challenge. A lot of the prioritization of these issues comes from demands from the people who are using the technologies. If you think about how a lot of these issues are addressed in academia, it comes from an awareness that people are vulnerable or might be. Take the institutional review board process, which I think is ethically important, but it’s not really a process run by the communities who have the biggest stake. It’s run by the institutions. So I think some kind of inversion of that model has to happen for this to be effective. And that’s something Public Lab is very invested in: the community science model, where communities are really at the center of the process and are calling the shots, so to speak.
So there’s that inversion. I also think there are a couple of basic things, like: don’t make a mysterious algorithm when it’s not needed. People think, well, we’ll have this model, it will be really complex for managing the privacy, and the user won’t have to think about it. No, I think the technological systems we create should follow the human understanding. It shouldn’t be some crude metaphor or something happening behind a curtain; you should be able to see how it works, and if it’s well designed, it’s simple enough that you’re not juggling a lot. In this case, with location privacy, you should be able to understand and act on the amount of privacy you get without having to do a calculation, without having to run code or anything like that. We use the simple model of truncating the latitude and longitude coordinates, so you can just glance at the coordinates and get a sense of it; it’s not hard to translate that into an amount of privacy. And what’s happening in the backend, where we’re storing these latitudes and longitudes, isn’t done differently from what’s being shown to you. There’s a commonality. And with that comes an honesty about how much information has actually been stored.
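As a rough worked example of that glance-based translation: one degree of latitude is about 111 kilometers, so each decimal place kept in a truncated coordinate narrows the blur by roughly a factor of ten. A short illustrative sketch (the scale labels are approximations, not Public Lab’s terminology):

```typescript
// Rough reckoner: ~111 km per degree of latitude, so each decimal place kept
// shrinks the blur tenfold. (Longitude degrees shrink toward the poles, so
// treat these figures as upper bounds.)
function precisionToMeters(decimalPlaces: number): number {
  return 111_000 / 10 ** decimalPlaces;
}

for (let d = 0; d <= 4; d++) {
  console.log(`${d} decimal place(s) kept ≈ ${precisionToMeters(d)} m of blur`);
}
// 0 ≈ 111,000 m (region)    1 ≈ 11,100 m (city)    2 ≈ 1,110 m (neighborhood)
// 3 ≈ 111 m (block)         4 ≈ 11.1 m (building)
```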
Often when we think about privacy, or when we design systems, we think of what is shown to people as being different from what is actually stored. And that shouldn’t be the case. If you’re showing people that you’re only storing this much precision, it should not be the case that your database actually holds a lot more, with you as the keeper of that extra secret information. It should be transparent all the way through, from what’s shown to the user down to what’s in the backend. So I think that’s really important.
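Here is a minimal sketch of that store-what-you-show principle, assuming a hypothetical saveToDatabase function and record shape: the raw GPS reading is blurred before anything is persisted, so the backend never holds more precision than the user sees.

```typescript
// "Store what you show": blur the raw reading *before* persisting it, so the
// database never holds more precision than the user agreed to share.
// BlurredReading and saveToDatabase are hypothetical stand-ins, not Public
// Lab's actual schema.
interface BlurredReading {
  lat: number;       // already truncated
  lon: number;       // already truncated
  precision: number; // decimal places kept, shown to the user as-is
}

const trunc = (v: number, p: number): number => Math.trunc(v * 10 ** p) / 10 ** p;

async function recordLocation(
  rawLat: number,
  rawLon: number,
  precision: number,
  saveToDatabase: (r: BlurredReading) => Promise<void>,
): Promise<void> {
  await saveToDatabase({
    lat: trunc(rawLat, precision),
    lon: trunc(rawLon, precision),
    precision,
  });
  // rawLat and rawLon are never written anywhere; no precise copy survives.
}
```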
I guess in that sense, that mix of transparency and accountability and local community control is all part of the Public Lab model writ large. So it’s important to note that this idea of location privacy is just one facet of a broader set of work that Public Lab does, and it speaks to those values. If you’d like to learn more, you can follow Public Lab on Twitter @publiclab. And as I step away from Public Lab, you can continue to follow my work at @jywarren on Twitter. You can also learn more about Public Lab in general at publiclab.org.
13:05 CHRIS DELATORRE: Jeffrey Warren, Co-founder of Public Lab, thank you.
Digital Impact is a program of the Digital Civil Society Lab at the Stanford Center on Philanthropy and Civil Society. Follow this and other episodes at digitalimpact.io and on Twitter @dgtlimpact with #4Q4Data.