


Ripping off the digital band-aid

This worksheet is intended to help you center equity and data responsibility in your evaluation designs, showing the implications for equity and the need to consider data safety across the entire data life cycle.

Policy Brief: What Civil Society & Nonprofits Can Do

Harvard’s Technology and Social Change Project (TaSC) developed eight strategies for countering misinformation and hate speech online. They are:

  1. Connected Communities: Simply organizing an information campaign to correct or combat disinformation is not an effective strategy as it “undercuts the possibility for communities to collectively learn, counter, and adapt to disinformation.” Instead, any response should be grounded in facts about the impact and harms of different disinformation trends.
  2. Fact/Fallacy/Fact or “The Truth Sandwich”: Disinformation is typically spread in short, memorable, and pithy slogans (think “vaccines cause autism”). To rebut these statements, you need to interrupt the impulse to remember something so ‘sticky’ AND replace it with something true. The truth sandwich model does this by replacing the disinformation with a fact or by highlighting the political agenda of the group pushing the lies.
    1. Example: Replace with fact: “Vaccines don’t cause autism. While the myth persists that vaccines cause this condition, doctors across the globe have proven that vaccines do not cause autism and are a benefit to the whole of society.”
    2. Example: Highlight the political agenda: “Vaccines don’t cause autism. This myth is perpetuated by anti-vaccine activists and does not line up with scientific facts about public health. Doctors across the globe have proven that vaccines do not cause autism and are a benefit to the whole of society.”
  3. Prebunking: Mis/disinformation is often predictable. Prebunking is an offensive strategy where you anticipate what false information is likely to be repeated by politicians, pundits, or provocateurs at key events, and prepare a response based on fact-checks.
  4. Distributed Debunking: Battling it out with disinformation spreaders online often helps mis/disinformation gain more traction within search and trending algorithms because these technologies cannot tell the difference between truth and lies. When misinformation becomes mainstream and triggers responses from key figures (politicians, newsworthy groups, etc.), an organized, strategic response is often necessary. Debunks should include a link to a reputable source and follow the models of the “truth sandwich” or “humor over rumor.”
  5. Localize the Context: Disinformation is always local, and civil society organizations are best placed to provide context. When debunking misinformation or disinformation, it is important to bear the local community in mind and to share insights with journalists. Knowing how mis/disinformation impacts a community, online and offline, is critical information for journalists covering a particular beat.
  6. Humor over Rumor: Misinformation tends to trigger emotional reactions and confirmation bias. It thrives in environments filled with outrage, fear, and anger, especially when the people in those environments share the same political and/or cultural views (a.k.a. echo chambers). Humor over rumor is a community strategy: humorous rebuttals of misinformation attach themselves to the misinformation so that they are found everywhere the rumor is spreading. In short, funny fact-checks are more likely to go viral.
  7. Do Not Retweet: Social media platforms are specifically built to incentivize arguing and the polarization of users. The more you retweet or share a “hot take” or hate post, the more it is amplified (or goes “viral”), regardless of whether you are sharing to promote it or to argue against it. To avoid increasing the virality of disinformation, do not share or retweet; take a screenshot of the incorrect post, article, or tweet, then caption it with a “truth sandwich” and post that instead.
  8. Listen to the Canaries: Research shows that women, particularly Black women, are more likely than any other demographic to raise red flags warning of hateful trends, censorship, and mis/disinformation online. Indeed, censored, marginalized, and/or impacted communities are often already responding to a threat or problem long before external parties turn their attention to the issue. A prime example is the movement, spearheaded by Black women in the United States, to “out” digital blackface via the hashtag #YourSlipIsShowing. Helping to promote grassroots efforts to protect an impacted community is often the best way to raise awareness and to neutralize the impact of mis/disinformation from the ground up.

Misinformation Scenarios

COVID-19

Family Care, a small nonprofit founded by local community organizer Khalid, specializes in providing free, basic medical care and family planning services to the local community.

Organizational Structure

Equal Futures (EF) is a large, well-known nonprofit whose mission is to advise and support small and/or new nonprofits on establishing themselves and meeting their yearly targets. In its weekly blog, EF publishes articles advising community groups and smaller nonprofits on how best to manage their resources, particularly their funding, in their philanthropic projects.

Close Reading Exercise

We estimate that these exercises will take about 30 minutes total to complete.


Additional Resources

The Stanford Digital Civil Society Lab curated these resources for those interested in taking action on AI, digital systems, and data collection. They are intended to provide a range of opportunities for engagement. You can suggest additional resources by contacting us.

Take Action

  • Disinformation Toolkit
    This toolkit, designed by and for international NGOs and civil society, helps organizations identify their risk, develop a strategy, and build resilience.
  • Essential Guide to Verifying Online Information
    Part of the First Draft CrossCheck Initiative, this guide has tools, tips, and techniques for figuring out if online materials are what they purport to be.
  • Shorenstein Center on Media, Politics and Public Policy
    Under the direction of Dr. Joan Donovan, the Shorenstein Center leads the field in Internet and technology studies, examining online extremism, media manipulation, and disinformation campaigns.
  • COVID-19 Misinformation and Black Communities
    Brandi Collins-Dexter, a visiting fellow at the Shorenstein Center, writes about disinformation and coordinated attacks on Black technoculture. As a senior campaign director at Color Of Change (COC), her work involves interrogating the role of media, technology, and information integrity in improving or deteriorating community health and economic opportunities.
  • Disinformation Action Lab at Data & Society (DAL)
    This research lab forges new approaches to address the complex dynamics underpinning the spread of propaganda and disinformation.
  • The 101 of Disinformation Detection
    This starter kit provides the basic steps organizations should follow to begin tracking online disinformation, and includes helpful graphics and thoughtful explorations of the pros and cons of data collection.
  • The COMPROP Navigator
    The Project on Computational Propaganda launched an interactive resource for civil society groups as they respond to a rise in disinformation.
  • Inspecting Algorithms in Social Media Platforms
    Ada Lovelace Institute’s joint briefing with Reset recommends a practical way forward for regulatory inspection of algorithms. Primarily aimed at policymakers, the report could also be helpful for organizations thinking about methods, skills, and capacity for inspecting algorithmic systems.
  • MediaJustice
    MediaJustice fights for racial, economic, and gender justice in a digital age. Their report on digital cultures is an invaluable resource to address misinformation and information access through a lens of racial justice.
  • Berkeley Protocol
    A practical guide on the effective use of digital open source information in investigating violations of international criminal, human rights, and humanitarian law. The Human Rights Center at UC Berkeley works with technologists and students to improve the quality of information found on social media, including developing protocols for better identifying misinformation. Included are an online investigation plan template, a digital threat and risk assessment template, a digital landscape assessment template, a data collection form, and considerations for validating new tools.

Trainings and Resources from Other Organizations and Alliances

  • First Draft
    First Draft provides trainings for journalists with the aim of protecting communities from harmful misinformation and empowering society with the knowledge and tools needed to outsmart mis- and disinformation.
  • School of Information Center for Social Media Responsibility
    This University of Michigan program offers online explainers and more.
  • Tow Center for Digital Journalism
    Part of the Columbia Journalism School, this program examines digital journalism’s cultural shifts and its relationship with the broader, constantly changing world of technology.
  • Shorenstein Center on Media, Politics and Public Policy
    This Harvard Kennedy School research center is dedicated to exploring and illuminating the intersection of press, politics, and public policy.
  • Alternative Regulatory Responses to Misinformation
    A panel discussion hosted by the Yale Law School/Wikimedia Initiative on Intermediaries and Information, focusing on novel regulatory responses to misinformation. Moderator Michael Karanicolas is joined by panelists Akriti Gaur, Ivar Hartmann, Barbora Bukovská, Lisa H. Macpherson, Jonathan Obar, Sandra Cortesi, Stephen LePorte, Jan Rydzak, Elizabeth Renieris, Dunstan Allison-Hope, and Jack Balkin.
