When Nobody Knows You’re a Dog: Tech, Civil Society, and the Fight for Authenticity

Opinion

Civil society organizations should guard against "authenticity theft," a troubling trend that may yet reveal a silver lining.

Authenticity is vital for civil society organizations. Ensuring that they are seen as credible and reflect the voice of the people and communities they serve is crucial when it comes to maintaining their most valuable asset: trust. But could this authenticity be under threat in our increasingly digital world?

Proving Identity

Since the birth of the internet, many have noted that the huge gains it brings — in terms of our ability to find and share information — come at the cost of certainty over provenance, accuracy, and identity. This was encapsulated in the cartoon published by The New Yorker in 1993, where two dogs sit at a computer and one tells the other “On the internet, nobody knows you’re a dog.”

Once merely a witty insight about a nascent technology, the cartoon is less of a laughing matter in the current online context, where children are taught the dangers of being groomed by sexual predators posing as young people, and our social media feeds are so plagued by bots that it is hard to tell whether someone is human, let alone who they claim to be.

The challenges of determining identity online apply not only to individuals but also to organizations. It is not hard, for instance, for someone to impersonate a charity, or to create an online identity claiming to represent one. This may be an outright criminal act, as in the case of fundraising fraud, or something more insidious, like piggybacking on the organization’s legitimacy in order to influence public policy or debate.

Network Weaknesses

Technology also adds another dimension to the problem. The ability to organize effectively at scale has driven a proliferation of online protest movements and networks. Membership of these groups is typically self-declared, and many do not have recognized “leaders.” This presents a major challenge if the aim is to confront corporate interests or governments: there is nothing to stop the agents of those opponents from proclaiming themselves members (or even leaders) of the group in order to subvert or undermine it. (I wrote more about this in a recent article for HistPhil.)

Alternatively, those in power who want to defuse networked protest may engage in “astroturfing,” where they create online groups and movements in order to give the impression of grassroots support for alternative viewpoints on an issue.

This kind of “authenticity theft” is something that civil society organizations need to guard against as their hard-earned trust with the public becomes increasingly valuable. But it’s not just the authenticity of individuals and organizations we need to worry about. Perhaps an even bigger challenge is the authenticity of information.

The Authenticity Deficit

The online “attention economy,” in which content has become abundant and the limiting factor is our ability to pay attention to it, produces unhealthy economic incentives. Those who want to grab and hold our attention know that the best way to do so is to present the most extreme versions of everything. For the unscrupulous, truth becomes irrelevant in the quest for clicks. Perhaps the most relevant example is the rise of “fake news” and conspiracy theories.

On top of this, there are those who deliberately create falsehoods and distortion for more sinister purposes than merely generating clickbait. The scandal surrounding Facebook’s relationship with Cambridge Analytica and the impact it may have had on the 2016 US election highlighted the fact that an entire industry has arisen in microtargeting misinformation and propaganda in order to destabilize and undermine democracy around the world.

Deepfakes and Generative AI

The weapons in this misinformation war are getting more sophisticated all the time. There are now growing concerns about the potential use of artificial intelligence (AI) to generate video or audio “deepfakes” — copies that are indistinguishable from the real thing. Likewise, OpenAI attracted a lot of attention when it announced that it would not publish details of its new text-generating AI system, GPT-2, for fear that it was “too good” and would be misused. While cynics saw an element of PR in this, many who saw the results of the system agreed that its output was both impressive and worrying.

The big challenge for civil society is that if the proliferation of misinformation continues, it will further undermine the notion of objective truth and fact, which is the fundamental basis on which many organizations advocate or campaign on issues. If everyone is simply free to cry “fake news,” or claim their own set of “alternative facts,” it will become much harder for civil society organizations (CSOs) to speak out effectively in order to influence public debate and policymaking.

The Civil Society Response

What should civil society do? Well, part of it is simply being aware of the challenges so that we can mitigate them in our own organizations. But is that too passive? Should we instead look for ways to combat misinformation and secure online identity in order to ensure that authenticity is preserved?

There are many already working on these challenges. Organizations like Yoti, the Sovrin Foundation, and internet pioneer Tim Berners-Lee’s Inrupt are trying to create models for online identity that empower individuals to own and control their data. Others are fighting misinformation: some are using technology for verification (like experiments using blockchain to timestamp content, or AI to fact-check in real time), while others are focusing on human-centered approaches, like promoting ethical journalism or teaching news literacy to children.
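To make the content-timestamping idea concrete, here is a minimal sketch of the principle behind it, not any particular project’s implementation: a cryptographic digest of a piece of content is recorded along with a timestamp (in a real system, anchored in a public ledger so it cannot be backdated), and anyone can later recompute the digest to check that the content has not been altered since. The function names and record format below are illustrative assumptions.

```python
import hashlib
import time

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 digest that uniquely identifies the content."""
    return hashlib.sha256(content).hexdigest()

def make_timestamp_record(content: bytes) -> dict:
    # A real system would anchor this record in a public, append-only
    # ledger; here we only build the record itself for illustration.
    return {"digest": fingerprint(content), "timestamped_at": int(time.time())}

def verify(content: bytes, record: dict) -> bool:
    """Check that the content still matches the digest recorded earlier."""
    return fingerprint(content) == record["digest"]

# Hypothetical usage: publish a statement, record its fingerprint,
# and later detect whether the circulating copy has been tampered with.
article = b"Civil society statement: ..."
record = make_timestamp_record(article)

assert verify(article, record)             # unaltered content checks out
assert not verify(article + b"!", record)  # any edit breaks the match
```

The key property is that the digest, not the content, is what gets published to the ledger: the organization can prove later that a given text existed at a given time without having to trust anyone to store the text itself faithfully.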

Civil society organizations and funders must engage more with these sorts of initiatives. But CSOs also have an important role to play themselves. By leveraging the trust and respect they have earned offline, charities and others could act as bastions of authenticity in the online world. They could even become “oracles” — trusted entities that perform the function of verifying the legitimacy of information and other organizations.

By making this shift, civil society could move from a situation where technology threatens its authenticity to one where it turns that authenticity to its advantage, adding a powerful new dimension to its role in the world.