

When Choosing Digital Tools, Consider Context

The Engine Room's Laura Guzman says, "It's most important to weigh costs, to consider risks, and to look at other contextual factors."

DI 4Q4 Laura Guzman: When Choosing Digital Tools, Consider Context

Listen using the audio player, visit us on iTunes, and tweet @dgtlimpact with #4Q4Data.

[00:00] Chris Delatorre: This is Digital Impact 4Q4, I’m Chris Delatorre. Laura Guzman manages communications for The Engine Room, an organization that helps the social sector make the most of data technology to push for social change. Now, The Engine Room is looking at how organizations and activists across the Responsible Data community are using data protection legislation to advance digital rights.

One such piece of legislation—the GDPR—is putting pressure on advertising technology or ad tech, what Guzman describes as a “quietly booming” industry based on a system of profiling and real-time bidding.

Guzman writes, “In some cases, users are labelled with sensitive categories based on their religious beliefs, ethnicity, sexual orientation, or private health conditions, putting their rights and freedoms—such as the right to freedom from discrimination—at risk.” Of particular concern is how data are quietly pulled from profiles and browser histories to create or exploit online identities for undisclosed purposes.

Yes, there is a presumed short-term benefit to using these practices that could result in long-term harm. Yet, as we’re about to hear, the invasive nature of ad tech is but one of a number of reasons to focus on legislation as a way to help reaffirm values, inform internal strategies, and inspire new collaborative frameworks.

[01:39] Chris Delatorre: Laura, when we spoke in May, you described Google Docs as a “frequent hurdle,” in terms of collaborators requesting to be deleted. This is a pretty big topic to cover but briefly, why is it that so many nonprofits find themselves using data or digital marketing tools in ways that may not respect their users, let alone be in line with their values?

[02:00] Laura Guzman: So, when we’re thinking about tools that organizations with social missions use to carry out their work, a key tension is the one you’re pointing to: do the means they’re following to pursue their goals actually match their mission? That’s to say, if it’s an organization that’s working on defending human rights in their external work, are they using platforms and tools that actually also uplift human rights?

I think it can be a pretty tricky question, and one that folks get stuck in, because it can very often feel like there aren’t alternatives, as if it’s this or nothing. But I would say in response to that general tension and that feeling of maybe hopelessness that it’s most important to consider context: to weigh costs, to consider risks, and to look at other contextual factors.

So, as an example: from a privacy perspective, an organization may feel like it’s most beneficial for them to store certain data or emails or contacts on infrastructure that they own, that they have full access to and no one else does. But that might not actually be the right solution for that nonprofit, perhaps because of budget or ongoing technical know-how. So at that point it becomes much more a question of what makes sense in that context and for that organization than of saying, you know, let’s never use Google products and only use in-house ones. It can quickly unravel at a certain point if you’re trying to make a wholesale change that isn’t sustainable.

There might be smaller questions to ask. You know, are there certain types of documents we can store internally, and certain types that we store online? Getting into that minutiae can often relieve some of that tension between saying, you know, either this digital marketing tool or this digital platform or nothing else. But there’s no point in attempting to make a huge, context-blind switch if the organization and the context aren’t quite right for it.

[04:09] Chris Delatorre: Earlier this year, you described the GDPR as “an opportunity for civil society to get more intentional about their use of data. At their core, these are rights-respecting and privacy-protecting regulations that are powerful tools for not just individuals, but society as a whole.”

Now, whether it’s the GDPR, the California Consumer Privacy Act, or the Children’s Online Privacy Protection Rule, it’s clear we’ll be seeing more of this type of legislation in the future. You’ve talked about organizations using legislation as a framework for advocacy and research. How can this be a starting point for refining internal practices and driving external change? Building coalitions, for instance.

[04:57] Laura Guzman: So, what we’ve found in our research looking at how civil society organizations can use the GDPR and similar data protection legislation is that though these laws are important tools for litigation—it’s a bit in the name and what they’re meant to do—they’re also really powerful in other ways. So one quick and perhaps replicable example: when we took a look at work done by Privacy International, where they used the GDPR as a legal framework for their research, they examined Android apps and the technical practices they had that didn’t respect the rights of their users. The apps were transmitting data to Facebook without consent and without notification.

So, Privacy International did this research, again building upon what the GDPR set out as a framework, but instead of pursuing litigation they brought the discrepancies, to put it lightly, to the attention of the app developers and then to the broader public. Through this combination of media outreach and direct outreach to the developers, many of the apps—not all of them—actually changed their technical setup.

That’s just one example. We’ve seen other cases like what you mentioned around coalition building. There’s work being done around advertising technologies and a campaign to “fix ad tech,” and that’s brought together quite a number of civil society organizations across Europe who are using the GDPR to file complaints against different actors in the advertising technology ecosystem. And that’s important both from a litigation perspective and from the perspective of building a united front that’s active in a number of different countries, bringing together one big look at a really pervasive technology. I could keep going for a while, because there are a lot of ways legislation can be used beyond litigation.

One other that you touched upon is internal practices, and this is something that has gotten some attention because it can be a challenge, particularly for nonprofits that are smaller and perhaps less familiar with things like data protection. But bringing these conversations out into the open is generally positive and good for building awareness and resources, so that folks can adopt better practices internally.

At The Engine Room, though practicing data responsibility and being ethical and thoughtful about our tech is part of our work, we still took the implementation of the GDPR as an opportunity to pause, to reflect on how we collect data, what we store, how we archive it, how we delete it, and to make sure that we really were being responsible with what we did.

[07:55] Chris Delatorre: Is it possible for civil society organizations to use mainstream services and insulate themselves from the potential negatives at the same time? And if they do find that these platforms aren’t the best option, then what is the alternative? Is it better to work with the “devil you know” than “the devil you don’t”?

[08:14] Laura Guzman: One thing that I think is important to remember on this point is related to what I touched on earlier: in thinking about all of this and navigating picking a tool or a platform, there’s no single right answer. I think there are critical things to think about, whether that’s consent, privacy, data minimization, or other principles. But what those turn into in practice can look super different for different organizations.

I’d say that nonprofits or folks listening to this can take heart in knowing a few things about that tension between the devil you know and the devil you don’t. The first is that there are probably more alternative tools, services, and platforms out there than one might expect, and many have really broad and diverse existing communities of support and existing users. That’s to say there’s possibility and opportunity out there. The second and perhaps more important thing folks can take heart in knowing is that there are a lot of internal changes that can be made, regardless of what platform someone is using. After all, someone could build the most ideal rights-respecting platform in the world, but if it’s being misused, if it’s being used for malicious ends, then what’s the point of having such a well-built platform?

So on the flip side, there are a lot of steps that can be taken internally. It isn’t necessarily just a question of, let’s say, should we use Salesforce or should we use CiviCRM. There can be a lot more nuanced questions in there: whose information are we storing in this database? Why are we storing it? Do we actually have an intended use for it? Did that person consent to us using their data that way? If we needed to, or they wanted us to, could we delete their data completely? Perhaps we’re putting them in harm’s way by collecting all of this data in the same place. Those kinds of thought processes and tweaks to practices can in some ways be more empowering than just sticking to the question of which platform to use.

So, by thinking through all of this a bit more holistically, I think embarking upon answering this question can be a way to strengthen all of the work in an organization. It’s about identifying new ways that the means by which you’re pursuing your ends don’t just match your mission but actually strengthen it.

[10:59] Chris Delatorre: How can our listeners get involved? Is there a help line or listserv they can join if they have questions or want to share practices that have worked for them?

[11:07] Laura Guzman: Yeah, so I would say folks could take a look at responsibledata.io. That’s the home of the Responsible Data community. There’s a mailing list that people can join, where over a thousand practitioners, researchers, and experts flag resources, challenges, and potential solutions. There are also resources on the site itself that might be useful to people navigating this question of how to use data or technology responsibly in their work, and in social impact work specifically.

And I’ll also say that at The Engine Room we offer pro bono support to organizations that are tackling data and tech challenges and want to make sure they’re not just using data or technology effectively but also responsibly. So, if anyone listening is interested in that light-touch support, they can email hello@theengineroom.org. You can also follow us on Twitter at @engnroom.

[12:10] Chris Delatorre: Laura Guzman, communications manager at The Engine Room, thank you.

Digital Impact is a program of the Digital Civil Society Lab at the Stanford Center on Philanthropy and Civil Society. Follow this and other episodes at digitalimpact.io and on Twitter @dgtlimpact with #4Q4Data.
