I’ve spent two years exploring the concept of platform neutrality. Initially, my aim was to help GlobalGiving find a better way to address our own high-stakes platform dilemmas. For instance, how should we respond to the foreign journalist who reached out after he published an exposé about a nonprofit supported on the GlobalGiving website?
In this case, we investigated and found that the group was acting according to local laws, despite the journalist’s concerns that they might be violating international human rights standards. But we lacked a framework for deciding whether this nonprofit should be allowed to continue to fundraise on our global platform.
I quickly found out how important it is that all players in the growing digital platform economy interrogate their assumptions about neutrality. Especially those tech intermediaries or digital “matchmakers” working in the social sector, aiming to connect people and ideas to do more good. I began informal conversations with peers in 2019, which soon turned into formal interviews. I conducted literature and popular media scans, and I began to bring together stakeholders of all kinds, including the nonprofits we serve.
Today, nearly two years into my research, we’ve established a collaboration of more than 100 peers invested in this conversation. And the verdict is in: neutrality doesn’t work. Despite their attempts to be open, inclusive, and neutral, platforms and intermediaries must take opinionated stands when they face a dilemma. We first referred to this problem as the Neutrality Paradox.
Before describing the alternatives to platform neutrality, it’s important to recognize that some of the strongest proponents of platform neutrality have been the nonprofit partners we serve. Donna Baranski-Walker from Rebuilding Alliance, an organization serving families in Palestine, has seen Palestinian organizations cut off from funding because of threats from opposing political groups. “I would rather go head-to-head with anyone who has opposing views at any time as long as we’re invited to the party,” she explains. Platform neutrality has allowed the organization to raise the money it needs for its important work. She says other platform leaders have written the organization off because it’s “messy” or “complicated” given geopolitical issues outside her control.
So, in developing a solution for platforms, we explored what it is about neutrality our partners find so important. What we need to retain, we found, is openness and inclusion. A willingness to do the work to support potentially “complicated” partners. To address all dilemmas with empathy and treat all partners with dignity and respect.
As a result, we developed a community-led response to the growing problem of platform governance.
In an October 2020 conversation hosted by the Digital Civil Society Lab, Ethan Zuckerman described the “Facebook Logic” that governs most platforms today. In this system, he explained, users have essentially no control over what appears in their news feeds; they can flag problematic content, but they have no idea what happens from there. Algorithms surface content, but we have no information about how they work.
We wanted to develop an alternative to this system, starting with a new approach for addressing dilemmas. We designed and tested prototypes. We took our first prototype to groups of practitioners, including an event we co-hosted in London in February 2020, called “Curation with A Conscience.” I invited participants to engage in a values-sorting exercise—part of our first prototype. And I watched it fail.
People couldn’t agree on the priority of the values with which they’d make decisions, because language got in the way. Even if everyone prioritized “transparency,” for example, they couldn’t agree on what that meant, and how it would be applied in a real-life dilemma. So, we went back to the drawing board with our peers.
The key to a Neutrality Paradox “solution,” we came to discover, was acknowledging that company values alone don’t provide enough direction for people to navigate dilemmas. Even if employees can agree on how the language is interpreted, the rest of the stakeholders won’t likely share the same values. Instead, platforms need to establish guiding principles and shared behaviors—a working “ethos”—to navigate decisions.
Informed by our community of more than 100 stakeholders, developed with Human-Centered Design, and influenced by the practice of Restorative Justice, we created an approach to curation and moderation that centers dignity and empathy and promotes creative resolutions to tensions. It’s called Ethos.
Ethos is a set of principles and a process designed to help decision-makers incorporate all stakeholder needs into their decision processes and find creative “third-way” resolutions: outcomes that go beyond the binary of “keep them on the platform” or “take them off the platform.” So far in testing, the Ethos prototype has been highly effective in eliciting creative resolutions that instill decision-maker confidence and reinforce dignity, relationships, and integrity.
We’re testing it right now at GlobalGiving with two high-stakes dilemmas. We’re building out a high-fidelity prototype as we go, and we’re beginning user testing soon to understand the needs of potential users, and how we might best create tools that are easy to find and implement.
Our goal is to help digital civil society platforms, philanthropy intermediaries, and even commercial platforms better govern their communities, advance their missions, and uphold their integrity as we enter the next phase of the growing platform economy.