Upholding Transparency in the Age of Misinformation

GlobalGiving has launched an inquiry into the Neutrality Paradox—a problem faced by platforms struggling to remain efficient while maintaining the public trust.

Digital Impact 4Q4: Rachel Smith on the Neutrality Paradox

00:00 CHRIS DELATORRE: This is Digital Impact 4Q4, I’m Chris Delatorre. Today’s four questions are for Rachel Smith, Co-founder and Executive Director of GlobalGiving UK. As misinformation continues to spread and controversial issues become more frequent, many platforms for social good are struggling to remain efficient while maintaining the public trust.

To address ethical challenges and increasingly complex dilemmas, GlobalGiving launched a research inquiry into what it calls the Neutrality Paradox—a problem faced by tech platforms that are forced to make difficult decisions that may compromise a position of neutrality, and to do so while upholding transparency.

The aim of the inquiry is to explore how these issues are affecting charitable giving platforms, in particular, and to find practical solutions that encourage responsible, ethical giving practices. Rachel, who leads GlobalGiving’s Evidence and Learning initiative, joins us today to share more.

01:06 CHRIS DELATORRE: Rachel, the Neutrality Paradox sounds like science fiction, like something out of Star Trek. But in fact, as you explained in a blog post last year, it’s a problem we’re all experiencing right now. What distinguishes the paradox in digital contexts and what are the implications for civil society organizations?

01:28 RACHEL SMITH: Hi, Chris. Well, this is an interesting and complex topic, as we’ll discover as we go. GlobalGiving is coming at this from the context of philanthropy platforms—we’re a nonprofit organization, and we exist to support humanitarian purposes.

So the paradox really comes from these contrasting ideals, where technology on the one hand gives us spaces that are open, that are democratic, we hope, and that allow different, diverse ideas to emerge. At the same time, as a philanthropy platform we also hold a huge amount of responsibility to ensure that the organizations we put forward on our platform as valid, reliable, and doing good, impactful work will be supported by people—the public or maybe institutions—who are putting a lot of trust in us. So that also means we really need to think about what happens when controversies come about—about an organization, or about a particular theme or topic that might indeed do harm to other people if we don’t take a stance or make a decision around it.

“We need to say that being open is possible but we also need to be dealing with problematic content and making decisions about what to do with that content.”

So, I think in our case, and for civil society organizations in general, we need to go beyond just saying, well, technology gives us space and tools to allow connections to happen regardless, and to allow content to flow regardless of what it is—we need to keep our standards very high. And so we’ve got some serious challenges on our hands. On the one hand we’ve got partners, for example, working in the Middle East. They’re working in complex contexts and they’re actually fighting for openness and neutrality, because if they are removed from platforms like ours it creates an even more difficult environment for them to share the realities of the contexts they’re working in and to continue to maintain funding—and that has real-life consequences.

On the other hand, we were discussing with some partners in Germany who talked about how their country’s history teaches them to pay attention to warning signs around content that may be unhelpful or even harmful. In those cases, some of our peers in Germany would much more quickly want to close down particular content because they’re worried about how those kinds of topics might escalate. There really is a lot of responsibility that we hold. We need to think about how to protect people, particularly those who are marginalized or especially vulnerable, and so it isn’t okay for us to simply say that being open is possible with technology—we need to say that being open is possible but we also need to be dealing with problematic content and making decisions about what to do with that content. So that’s the dilemma and that’s the paradox we’re facing.

04:54 CHRIS DELATORRE: Something I find challenging—and some listeners might relate to this—is understanding the difference between content moderation and censorship. Machine learning, for instance, could do so much to identify hate groups or extremist networks as they emerge. But as we’ve seen, algorithmic solutions are prone to human error. How can social good platforms design and adopt new tools that maintain integrity and an awareness of inherent bias?

05:22 RACHEL SMITH: I was struck when I was reading Lucy Bernholz’s Blueprint for 2020, and in particular by her thoughts on the digital landscape becoming increasingly intersectional, spanning digital technology, human rights, and ethics. I think that’s the starting point and the foundation for thinking about how tools and platforms need to be developed and designed. So for social good platforms, we need to be building spaces for content to flow, for connections to be made, and for funds to flow that are built on principles of good civil society, of social justice, of responsibility, of ethics. And it’s important for us to consider that when we’re thinking about content moderation as well.

So we shouldn’t take it lightly that we moderate content and that we consider how a particular organization might be positioning itself when it shares information about an impact program it’s running, for example. We have to think about the kind of power and bias that exists within those decisions. As a platform we are of course moderating what is posted and what is said, and in allowing—or even in disallowing—content, we’re actually taking a stance about what we, as an organization that runs a platform, believe is fair and true.

“As we deal with each of these dilemmas or complex issues we actually find that we need to adapt, to change our responses.”

It’s a really complicated situation, and if we want to think about how to responsibly build digital tools and the algorithms that sit behind these platforms, we have to go beyond the regulatory and legislative environment (and definitely build for that) and also think critically about our values and the implicit biases we’re building into our tools and decision-making processes, whether those decisions are machine-based or human.


Platforms, funding intermediaries, and philanthropists really need to take a step back and know their values before they get started on this. They need to make sure they’re explicit about those values, and then begin to build digital solutions that actually embed those values. And I think, critically, we also need to be prepared to adapt and change, because environments change, our organizations evolve, and we learn. As we deal with each of these dilemmas or complex issues we actually find that we need to adapt, to change our responses.

08:02 CHRIS DELATORRE: Vetting is a big part of being a philanthropy intermediary. But it’s impossible to track every connection and relationship, or to know what position an organization will take on a particular issue. What do you mean when you say philanthropy stewards must be both proactive and reactive in their vetting processes?

08:22 RACHEL SMITH: So, like many other philanthropy platforms, GlobalGiving has a robust, up-front vetting process—a proactive vetting process, if you like—that holistically screens organizations we might fund through our platform. Mostly we’re doing that based on legal determinants: we’re looking at whether the organization is registered, and we’re looking at its history of managing funds and of delivering social or environmental impact programs.

“That really has drawn us to the heart of the issue—how do we make decisions that affect people and do so in a responsible way?”

Where we see challenges is that political, social, and regulatory environments are changing ever more rapidly, and new information emerges. That’s where we need to take a more reactive approach. And when I say reactive, it’s not that we are unprepared. In fact, that’s partly what the Neutrality Paradox work is trying to address—the fact that oftentimes we have felt unprepared, and so when we need to be reactive, what we need is a set of tools and principles to follow that help us react smartly, essentially.

Some of the initial research from our Neutrality Paradox inquiry has helped us to identify categories of dilemmas—anything and everything from how we handle affiliations with controversial people or organizations—that might be an organization we have vetted as perfectly good and solid, but it emerges that it has affiliations with a particular group or person, or perhaps some political links—to things like conflicting ethical and legal standards. What we need to be able to do is navigate through these smartly and make decisions in a transparent way.

So I wanted to illustrate this—and it also gives a bit of insight into why GlobalGiving decided to start this inquiry into the Neutrality Paradox. An Indian organization had passed our vetting processes and had demonstrated positive impact in its work, and last year we were approached by an investigative journalist who was examining the practices of this particular organization. And here was the challenge for us. It was a legal one and an ethical one. The organization was operating within the law of India; however, arguably it didn’t meet global human rights standards in the work it was doing. This case became the catalyst for the Neutrality Paradox work because it showed us that we often don’t have a clear set of principles and protocols to help us make these balanced decisions. So, in one sense we had used data and digital tools to create a space for this organization to raise funds and for us to complete vetting. But in another sense we were now faced with a very human dilemma about which of the principles that exist out there in the world we should or might use to decide whether this organization was legitimate. And that really has drawn us to the heart of the issue—how do we make decisions that affect people and do so in a responsible way?

11:40 CHRIS DELATORRE: GlobalGiving is leading this effort but you aren’t going it alone. Here’s how you frame it in your post:

“It is our goal to collaboratively develop a standard, transparent set of tools and resources that strike the right balance between openness and curation, free speech and moderation, independence and trust. They should balance corporate values and business requirements, external frameworks and internal standards.”

How are you calling on others to assist in designing solutions for the social sector at large, and what might a new set of standards look like?

12:15 RACHEL SMITH: This is an issue that we feel really deeply at GlobalGiving, and one that we would have explored anyway as an individual organization. But we wanted to explore very early on who else might also be facing these challenging, paradoxical issues. So rather than design behind closed doors for ourselves, we wanted to go out and really examine what the rest of the philanthropy sector might be thinking about in this regard.

So, over the past few months we have interviewed, surveyed, and co-created different solutions with predominantly other social good platforms, but also big foundations, other kinds of philanthropists, and indeed content-curating platforms at large. Through that we’ve been able to speak to many organizations and have collected more than 50 examples of actual dilemmas these organizations faced and what they did to deal with them. In some cases they were successful in navigating through these issues, and in other cases they really struggled to make decisions and perhaps even received some public scrutiny around their decision making.

This has really been helpful for us, because what we’ve been able to do is create an understanding, a foundation, that’s based on the reality of many organizations in this field of digital philanthropy. And so what we’ve been able to do, I think, is start to develop a set of standards—or what we’re calling a manifesto—which we think will be something philanthropy platforms and intermediaries can commit to and use to build and guard their work. Our goals now are really to continue to develop that set of standards, to socialize it, to see what resonates and what doesn’t, and to make it available to the wider sector.

Ultimately, the Neutrality Paradox work is really not about GlobalGiving designing for itself, but about starting a conversation and carrying it to a place where we have both practical protocols and the ability within our organizations to be responsible in the decisions that we make—but also to push an agenda forward and to demonstrate the trust the wider public can place in our sector because of our transparency in these processes.

So, over the next few months you’ll see a couple of things coming from GlobalGiving and our collaborative partners. The first is the manifesto I mentioned, which will set out some shared principles of responsibility and the need for a more proactive and dynamic approach to navigating these issues. You’ll also see a set of tools and resources that we are currently prototyping and testing with some of our philanthropy platform peers to help organizations design solutions that mitigate some of these complex dilemmas. What you’ll see when that launches is that the approach is not just to wait until these dilemmas and issues come about, but actually to examine your organization deeply before they even emerge. What we found—and I think it’s true to say, to conclude—is that all organizations come at this Neutrality Paradox concept in different ways. Every organization has a different history and a different set of values, but there are commonalities, and there is a shared commitment to ensuring that philanthropy is ethical.

If you’re interested in learning more about the Neutrality Paradox work that we’re doing, we are obviously very open to discussion and collaboration with others. We would welcome anyone getting in touch with us, either through our Twitter handle @GlobalGiving or through our website, globalgiving.org. And for any listeners interested in following me and my thoughts, you can find me at @rachelgguk.

16:34 CHRIS DELATORRE: Rachel Smith, Co-founder and Executive Director of GlobalGiving UK, thank you.

Digital Impact is a program of the Digital Civil Society Lab at the Stanford Center on Philanthropy and Civil Society. Follow this and other episodes at digitalimpact.io and on Twitter @dgtlimpact with #4Q4Data.