The Problem with Palantir

The Engine Room's Zara Rahman sheds light on a partnership between WFP and Palantir Technologies, a US company with ties to the intelligence community.

Digital Impact 4Q4 Podcast: Zara Rahman on Palantir

00:00 CHRIS DELATORRE: I’m Chris Delatorre, Editor of Digital Impact, and welcome to 4Q4, a rapid-fire podcast that covers the who, what, why, and what next for emerging trends in social sector data.

I recently spoke with Zara Rahman — a researcher, writer, and linguist working at the intersection of power, technology, and social justice. Zara is Director of Programs at The Engine Room, an organization that helps the social sector make the most of data technology to push for social change. The Engine Room are community stewards of the Responsible Data community, which we’ll talk more about shortly.

I called Zara to follow up on a flurry of activity around a new deal between WFP (the World Food Programme) and Palantir Technologies, a US software company that specializes in big data analytics by way of government contracts — a deal that would give it access to the confidential data of nearly a billion people in the hungriest parts of the world.

Complicating all of this is Palantir’s link to Cambridge Analytica, the political consulting firm that worked for Donald Trump’s presidential campaign in 2016 and was later found to have improperly obtained, for political purposes, the private data of nearly 90 million Facebook users without their explicit consent. I’m sure you’ve heard that part of the story.

What makes the latest chapter particularly interesting is the latticework of power and privilege that seems to be materializing around an otherwise mundane scenario of information sharing for social good.

“Would, for example, food aid recipients be comfortable knowing that their data is being managed by a company that has such strong CIA ties? It’s very difficult to know.”

News of a five-year cross-sector data partnership between the largest humanitarian agency fighting world hunger and Silicon Valley's go-to contractor for US spy agencies could have profound implications for the 90 million people WFP serves.

Partnerships like this raise concerns about the rights of the people being served. The UN and its agencies occupy a unique role in the world and are privileged with an important “social license” to work with vulnerable people.

How are these rights protected as the agencies partner with companies whose very existence depends on mining insights from data? What does Palantir hope to gain from this venture and how can this story, as it unfolds, inform civil society organizations that are working toward data partnerships of their own? In this first episode of Digital Impact 4Q4, Zara Rahman of The Engine Room sheds light on the problem with Palantir. Zara, thanks for joining us today.

03:04 ZARA RAHMAN: Thank you for having me.

CHRIS DELATORRE: Enrica Porcari, the chief information officer and director of the technology division at WFP, issued a statement earlier this month outlining the partnership, which included a few points on how the agency says it intends to keep a lock on its data. The Responsible Data community promptly issued an open letter to David Beasley, executive director of the World Food Programme, who assumed office in 2017. WFP replied, and now some members are calling for a community call, facilitated by The Engine Room, in response to that reply — which, upon reading, doesn’t really seem to address much of what you pushed for.

It seems to me that what’s been disclosed — or in this case not disclosed — does raise a few flags. First, the partnership is shrouded in secrecy, and recent calls for transparency have been largely disregarded, I think we can all agree on that. To start, let’s give our listeners the 411 on Palantir’s role in all of this, if you could. How they would support WFP’s data and supply chains, for instance.

“We’ve yet to see any details of the process, and we’ve yet to see anything that’s concrete about the agreement that was made between them. And that is deeply worrying.”

ZARA RAHMAN: Yeah, I guess basically it comes down to a new partnership between Palantir and the United Nations World Food Programme, WFP. Palantir are, as you mentioned, a private software company. They have ties to the CIA and to the intelligence and defense communities, and particularly to the United States government. That all means they have a very different reputation, business model, and overall approach from what we’d hope for from an organization engaging in the humanitarian space.

As you said, this week Enrica Porcari, the CIO of WFP, suggested that the Palantir deal would focus on helping them increase efficiency in understanding food prices and shipping logistics.

This seems to be a slightly different approach from what was mentioned before. Last week or so — I think during the launch — it was mentioned that Palantir would be working in areas like fraud detection, which is much more sensitive. All of that, I guess, in some way shows how little we actually know about this partnership so far. The other huge worry that we have at the Responsible Data community is that Palantir’s work with the intelligence and defense industry — being partly CIA-funded — has the potential to seriously damage the credibility not just of WFP themselves but also of smaller humanitarian actors whose work is implicitly associated with WFP. In many countries, people are — sometimes rightly, sometimes wrongly — skeptical of the work that development actors do, and this does not help with those reputational issues.

As we outlined in our open letter, the Responsible Data community is seriously concerned about the due diligence that’s gone into this deal. WFP have mentioned that there was a due diligence process, but we’ve yet to see any part of it, or any details of the process — there’s a real lack of transparency around exactly what’s happening. There’s also the issue of the actual technical systems that Palantir are using. Palantir are offering pro bono, “in kind” work, so the people who will work on those systems come at no cost to WFP.

But also, it was announced yesterday that they’ll be offering WFP a perpetual license to use Palantir’s Foundry software. Very little is known about this software. It’s reasonable to assume that it has various biases encoded within it, but it’s impossible to know what these are — that depends on what it’s been used for before, what it’s been trained on, and what kinds of data models it’s been using. In short, we’ve yet to see any details of the process, and we’ve yet to see anything concrete about the agreement that was made between them. And that is deeply worrying.

“We’d be hypocrites if we said we were working to protect human rights without considering protecting human rights in the data that we use.”

07:10 CHRIS DELATORRE: When you’re talking about the models — when a software company like Palantir is confronted with the question of data, “Are you holding onto data?” and so on — they’re really not, right?

It seems more like an insidious kind of situation where they go in [and] create this proprietary software — create a layer over the internal system of the organization — and then create a model based on that.

So technically they’re not even taking the data. They’re creating a model and taking that and — so they’re exploiting the data.

ZARA RAHMAN: They’re using the data, but they’re not holding onto any of it, as far as we know. They’re building software that kind of wraps around the data inside the organization, and then they build models based on that data. So yeah, the models that Palantir create will stay within the software and will stay within Palantir. It seems very important to be clear that this isn’t about Palantir owning the data or keeping the data for its own business; it’s keeping the models.

08:18 CHRIS DELATORRE: Why should the social sector or anyone interested in ethical data or human rights care?

ZARA RAHMAN: For those of us working in the social sector, doing social good is kind of in the name; it’s our main aim. For that work to be taken seriously, the methods that we use really need to respect human rights through and through. In my mind, we’d be hypocrites if we said we were working to protect human rights without considering protecting human rights in the data that we use — for example, who we share it with, how we do it, what kinds of work practices we follow, and how exploitative or how inclusive those practices are.

That might include thinking about how the people we’re working to serve would — or wouldn’t — want their data to be managed. Would, for example, food aid recipients be comfortable knowing that their data is being managed by a company that has such strong CIA ties? It’s very difficult to know.

There are multiple blurry issues with this deal, and I guess more broadly I also don’t want to single out WFP. Other organizations within the sector have made deals with Palantir, and other deals have been made with private sector tech companies. So more broadly, that’s the systemic issue that we’re really interested in understanding, and we’re advocating for better responsible data practices.

The private sector is accustomed to operating in a totally different context from that of humanitarian organizations. That much is clear, for example, in how Palantir referred to food aid recipients as “customers” when talking about the new deal. That framing of food aid recipients as customers really demonstrates to me how clueless Palantir are about the deep power asymmetry at hand, and about the related concerns that have to go into designing rights-respecting technical systems.

10:09 CHRIS DELATORRE: How does this relate to the Responsible Data community in practice? You mentioned responsible data — what exactly is it and how does this relate?

ZARA RAHMAN: That’s a great question. The Responsible Data community started five years ago now as a place for those working in the social sector to come together and discuss and collaborate on best practices for using data in a responsible way. We and a bunch of other people had noticed that discussions around security, privacy, and ethics were happening in very siloed areas, so we used the umbrella term “responsible data” to bring those concerns together so they could be addressed in a holistic way. Over the past five years we’ve held in-person events, developed resources, and partnered with a bunch of smart organizations, including Digital Impact, to do this.

And now it’s amazing to see that there’s a whole community of people pushing for better responsible data practices across the whole social sector. [What’s happening with Palantir] is an example of a challenge the community has really risen up against and been very vocal about. And it’s been a fantastic example of the power of the community.

So, in just a few days since the announcement last week, there have been, I think, about 75 emails from all around the world. It’s been amazing to see the amount of expertise shared. The mailing list now has about 950 people, and to have that many people talking to each other — many of whom don’t know each other, people who are very busy — giving really detailed responses, sharing work, and coming together to write and edit this open letter collaboratively was really amazing to see.

11:53 CHRIS DELATORRE: There was this one reply from someone working in South Sudan who is really invested in this because it directly affects what they’re doing there. So I see this sense of movement and urgency, people working together — it’s really nice.

ZARA RAHMAN: I think particularly of people who find themselves part of larger institutions and see their institution moving in a way that pushes them to use data in a new or different way, or for something that hasn’t really been done before. The community has been a place for those people who have this feeling that maybe this isn’t the best thing to do, that maybe there could be negative unintended consequences. Maybe we just need to take our time and go a bit slower.

We’ve really found that those people have found a home in the Responsible Data community, a place where they can express uncertainty without worrying that someone will think they’re not smart. I think we really value people expressing their challenges and helping each other to be better, because it’s easy to forget sometimes that so much of this is new, and that as a sector we sometimes prioritize speed over being thoughtful about things.

So making space for slow, intentional decision making around technology and data is really one of the goals of the Responsible Data community.

13:20 CHRIS DELATORRE: How can listeners join advocacy efforts? That’s the most important thing, right?

ZARA RAHMAN: Yeah, I mean, if you’re listening, I presume you’re in some way interested in the work of the social sector, so the Responsible Data mailing list is open to anyone to join; you’ll find it linked from the site, responsibledata.io. On that site there are also resources you might find helpful, such as the Responsible Data Handbook. And if you use Twitter or other social media, you can share resources, thoughts, articles, and interesting things with #ResponsibleData.

CHRIS DELATORRE: Zara Rahman, Director of Programs at The Engine Room, thank you.

Digital Impact is a program of the Digital Civil Society Lab at the Stanford Center on Philanthropy and Civil Society. Follow this and other episodes at digitalimpact.io and on Twitter @dgtlimpact with #4Q4Data.