

WFP-Palantir and the Ethics of Humanitarian Data Sharing


"Organizations need to assess wider implications, risks, and unintended negative consequences."

This piece was published as “A discussion on WFP-Palantir and the ethics of humanitarian data sharing” by Linda Raftree in March 2019.

The recently announced World Food Programme (WFP) partnership with Palantir, IRIN’s article about it, reactions from the Responsible Data Forum, and WFP’s resulting statement inspired us to pull together a Technology Salon in New York City to discuss the ethics of humanitarian data sharing.

(See this crowdsourced document for more background on the WFP-Palantir partnership and resources for thinking about the ethics of data sharing. Also here is an overview of WFP’s SCOPE system for beneficiary identification, management and tracking.)

Our lead discussants were: Laura Walker McDonald, Global Alliance for Humanitarian Innovation; Mark Latonero, Research Lead for Data & Human Rights, Data & Society; Nathaniel Raymond, Jackson Institute for Global Affairs, Yale University; and Kareem Elbayar, Partnerships Manager, Centre for Humanitarian Data at the United Nations Office for the Coordination of Humanitarian Affairs. We were graciously hosted by The GovLab.

What Are the Concerns About Humanitarian Data Sharing and With Palantir?

Some of the initial concerns expressed by Salon participants about humanitarian data sharing included: data privacy and the permanence of data; biases in data leading to unwarranted conclusions and assumptions; loss of stakeholder engagement when humanitarians move to big data and techno-centric approaches; low awareness and poor practices across humanitarian organizations on data privacy and security; tensions between security of data and utility of data; validity and reliability of data; lack of clarity about the true purposes of data sharing; the practice of ‘ethics outsourcing’ (testing things in places where there is a perceived ‘lower ethical standard’ and less accountability); use of humanitarian data to target and harm aid recipients; disempowerment and extractive approaches to data; lack of checks and balances for safe and productive data sharing; difficulty of securing meaningful consent; and the links between data and surveillance by malicious actors, governments, the private sector, and military or intelligence agencies.


Palantir’s relationships and work with police, the CIA, ICE, the NSA, the US military and the wider intelligence community are among the main concerns about this partnership. Some ask whether a company can legitimately serve the philanthropy, development, social, human rights and humanitarian sectors while also serving the military and intelligence communities, and whether it is ethical for those in the former to engage in partnerships with companies that serve the latter. Others ask whether WFP and others who partner with Palantir are fully aware of the company’s background, and if so, why these partnerships have been able to pass through due diligence processes. Yet others wonder whether a company like Palantir can be trusted, given its background.

Below is a summary of the key points of the discussion, which happened on February 28, 2019. (Technology Salons are Chatham House affairs, so I have not attributed quotes in this post.)

Why Were We Surprised by This Partnership/Type of Partnership?

Our first discussant asked why this partnership was a surprise to many. He emphasized the importance of stakeholder conversations, transparency, and wider engagement in the lead-up to these kinds of partnerships. “And I don’t mean in order to warm critics up to the idea, but rather to create a safe and trusted ecosystem. Feedback and accountability are really key to this.” He also highlighted that humanitarian organizations are not experts in advanced technologies and that it’s normal for them to bring in experts in areas that are not their forte.

However, we need to remember that tech companies are not experts in humanitarian work, and the proper checks and balances need to be put in place. Bringing in a range of multidisciplinary expertise and distributed intelligence is necessary in a complex information environment. One possible approach is creating technology advisory boards. Another way to ensure more transparency and accountability is to conduct a human rights impact assessment. The next year will be a major test for these kinds of partnerships, given the growing concerns, he said.

One Salon participant said that the fact that the humanitarian sector engages in partnerships with the private sector is not a surprise at all, as the sector has worked through Public-Private Partnerships (PPPs) for several years now and they can bring huge value. The surprise is that WFP chose Palantir as the partner. “They are not the only option, so why pick them?” Another person shared that the WFP partnership went through a full legal review, and so it was not a surprise to everyone. However, communication around the partnership was not well planned or thought out and the process was not transparent and open.

Others pointed out that although a legal review covers some bases, it does not assess the potential negative social impact or risk to ‘beneficiaries.’ For some, the biggest surprise was WFP’s own surprise at the pushback on this particular partnership and its unsatisfactory reaction to the concerns raised about it. The response from responsible data advocates and the press attention to the WFP-Palantir partnership might be a turning point for the sector, encouraging more awareness of the risks in working with certain types of companies. As many noted, this is not only a problem for WFP; it is something that plagues the wider sector and needs to be addressed urgently.

Organizations Need to Think Beyond Reputational Harm and Consider Harm to Beneficiaries

“We spend too much time focusing on avoiding risk to institutions and too little time thinking about how to mitigate risk to beneficiaries,” said one person. WFP, for example, has some of the best policies and procedures out there, yet this partnership still passed their internal test. That is a scary thought, because it implies that other agencies with weaker policies might be agreeing to even riskier partnerships. Are these policies and risk assessments, then, covering all the different types of risk that need consideration?

Many at the Salon felt that due diligence and partnership policies focus almost exclusively on organizational and reputational risk with very little attention to the risk that vulnerable populations might face. It’s not just a question of having policies, however, said one person. “Look at the Oxfam Safeguarding situation. Oxfam had some of the best safeguarding policies, yet there were egregious violations that were not addressed by having a policy. It’s a question of power and how decisions get made, and where decision-making power lies and who is involved and listened to.”


(Note: one person contacted me pre-Salon to say that there was pushback by WFP country-level representatives about the Palantir partnership, but that it still went ahead. This raises the same issue of decision-making power: who has the power to decide on these partnerships, and why are voices from the frontlines not being heard? Additionally, are those whose data is captured and put into these large data systems ever consulted about what they think?)

Organizations Need to Assess Wider Implications, Risks, and Unintended Negative Consequences

It’s not only WFP that is putting information into SCOPE, said one person. “Food insecure people have no choice about whether to provide their data if they wish to receive food.” Thus, the question of truly ‘informed consent’ arises. Implementing partners don’t have a lot of choice either, he said. “Implementing agencies are forced to input beneficiary data into SCOPE if they want to work in particular zones or countries.”


This means that WFP’s systems and partnerships have an impact on the entire humanitarian community, and therefore these partnerships and systems need to be consulted on more broadly with the wider sector. The optical and reputational impact on organizations other than WFP is significant, as they may disagree with the Palantir partnership but are now associated with it by default. This type of harm goes beyond the fear of exploitation of the data in WFP’s “data lake.” It becomes a risk to personnel on the ground, who are then seen as collaborating with a CIA contractor by putting beneficiary biometric data into SCOPE. It can also deter food-insecure people from accessing benefits.

Additionally, association with the CIA or the US military has led to humanitarian agencies and workers being targeted, attacked and killed. That is all in addition to the question of whether these kinds of partnerships violate humanitarian principles, such as that of impartiality.

“It’s critical to understand the role of rumor in humanitarian contexts,” said one discussant. “Affected populations are trying to figure out what is happening, and there is often a lot of rumor going around.” So, if Palantir has a reputation for giving data to the CIA, people may hear about that and then be afraid to access services for fear of having their data given to the CIA. This can lead to retaliation against humanitarians and humanitarian organizations and raise the risks of operating. Risk assessments need to go beyond the typical areas of reputational or financial risk. We also need to think about how these partnerships can affect humanitarian access and community trust, and how rumors can have wide ripple effects.

The whole sector needs to put better due diligence systems in place. As it is now, noted one person, often it’s someone who doesn’t know much about data who writes up a short summary of the partnership, and there is limited review. “We’ve been struggling for 10 years to get our offices to use data. Now we’re in a situation where they’re just picking up a bunch of data and handing it over to private companies.”

UN Immunities and Privileges Lead to a Lack of Accountability

The fact that UN agencies have immunities and privileges means that laws such as the EU’s General Data Protection Regulation (GDPR) do not apply to them, and they are left to self-regulate. Additionally, there is no common agreement among UN agencies on how GDPR applies, and each UN agency interprets it on its own. As one person noted, “There is a troubling sense of exceptionalism and lack of accountability in some of these agencies because ‘a beneficiary cannot take me to court.’” An interesting point, however, is that while UN agencies are immune, those contracted as their data processors are not, so data processors beware!

Demographically Identifiable Information (DII) Can Lead to Serious Group Harm

The WFP has stated that personally identifiable information (PII) is not technically accessible to Palantir via this partnership. However, some at the Salon felt that WFP’s statement about the partnership failed by using the absence of PII as a defense. Demographically Identifiable Information (DII) and the activity patterns visible even in commodity data can be extrapolated and used as training data for future data modeling.

“This is prospective modeling of action-based intelligence patterns as part of multiple screeners of intel,” said one discussant. He went on to explain that privacy discussions have moved from centering on property rights in the 19th Century, to individual rights in the 20th Century, to group rights in the 21st Century. We can use existing laws to emphasize protection of groups and to highlight the risks of DII leading to group harm, he said, as there are well-known cases that exemplify the notion of group harms (Plessy v Ferguson, Brown v Board of Education).


Even logistics data (the kind of data that WFP says Palantir will access), which contains no PII, makes it very simple to identify groups. “I can look at supply chain information and tell you where there are lactating mothers. If you don’t want refugees to give birth in the country they have arrived in, this information can be used for targeting.”

Many in the Sector Do Not Trust a Company Like Palantir

Though it is not clear who was in the room when WFP made the decision to partner with Palantir, the overall sector has concerns that the people making these decisions are not assessing partnerships from all angles: legal, privacy, programmatic, ethical, data use and management, social, protection, etc. Technologists and humanitarian practitioners are often not included in making these decisions, said one participant. “It’s the people with MBAs. They trust a tech company to say ‘this is secure’ but they don’t have the expertise to actually know that. Not to mention that yes, something might be secure, but maybe it’s not ethical. Senior people are signing off without having a full view. We need a range of skill sets reviewing these kinds of partnerships and investments.”

Another question arises: What happens when there is scope creep? Is Palantir in essence “grooming” the sector to then abuse data it accesses once it’s trusted and “allowed in”? Others pointed out that the grooming has already happened and Palantir is already on the inside. They first began partnering with the sector via the Clinton Global Initiative meetings back in 2013 and they are very active at World Economic Forum meetings. “This is not something coming out of the Trump administration, it was happening long before that,” said one person, and the company is already “in.” Another person said “Palantir lobbied their way into this, and they’ve gotten past the point of reputational challenge.” Palantir has approached many humanitarian agencies, including all the UN agencies, added a third person. Now that they have secured this contract with the WFP, the door to future work with a lot of other agencies is open and this is very concerning.

We’re in a New Political Economy: Data Brokerage

“Humanitarians have lost their Geneva values and embraced Silicon Valley values,” said one discussant. They are becoming data brokers within a colonial data paradigm. “We are making decisions in hierarchies of power, often extralegally,” he said. “We make decisions about other people’s data without their involvement, and we need to be asking: is it humanitarian to commodify beneficiaries’ data for monetary or other reasons of value? When is it ethical to trade beneficiary data for something of value?” Another raised the issue of incentives. “Where are the incentives stacked? There is no incentive to treat beneficiaries better. All the incentives are on efficiency and scale and attracting donors.”

Can This Example Push the Wider Sector to Do Better?

One participant hoped there could be a net gain out of the WFP-Palantir case. “It’s a bad situation. But it’s a reckoning for the whole space. Most agencies don’t have these checks and balances in place. But people are waking up to it in a serious way. There’s an opportunity to step into. It’s hard inside of bureaucratic organizations, but it’s definitely an opportunity to start doing better.”

Another said that we need more transparency across the sector on these partnerships. “What is our process for evaluating something like this? Let’s just be transparent. We need to get these data partnership policies into the open. WFP could have simply said ‘here is our process’. But they didn’t. We should be working with an open and transparent model.” Overall, there is a serious lack of clarity on what data sharing agreements look like across the sector. One person attending the Salon said that their organization has been trying to understand current practice with regard to data sharing, and it’s been very difficult to get any examples, even redacted ones.

What Needs to Happen?

In closing, we discussed what needs to happen next. One person noted that in her research on Responsible Data, she found a total lack of technological capacity at non-profit organizations. “It’s the Economist Syndrome. Someone’s boss reads something on the bus and decides they need a blockchain,” someone quipped. In terms of responsible data approaches, research shows that organizations are completely overwhelmed. “They are keeping silent about their low capacity out of fear they will face consequences,” said one person, “and with GDPR, even more so.” At the wider level, we are still focusing on PII as the issue without considering DII and group rights, and this is a mistake, said another.


Organizations have very low capacity, and we are siloed. “Program officers do not have tech capacity. Tech people are kept in offices or ‘labs’ on their own and there is not a lot of porosity. We need protection advisors, lawyers, digital safety advisors, data protection officers, information management specialists, and IT all around the table for this,” noted one discussant. Also, she said, though we do need principles and standards, it’s important that organizations adapt these so that they become their own principles and standards. “We need to adapt these boilerplate standards to our organizations. This has to happen based on our own organizational values. Not everyone is rights-based, not everyone is humanitarian.” So organizations need to take the time to review and adapt standards, policies and procedures to their own vision and mission, to their own situations, contexts and operations, and to generate awareness and buy-in.

In conclusion, she said, “if you are not being responsible with data, you are already violating your existing values and codes. Responsible Data is already in your values, it’s a question of living it.”