
Facing the Challenge of an Evolving Digital Civil Space

Field Notes

Access Now’s Digital Security Helpline is helping to build people-first digital infrastructures, one content moderation request at a time.

Access Now’s Digital Security Helpline provides lessons on how to build comprehensive and sustainable digital infrastructures while protecting the digital rights of the people it serves. As activist Esra’a Al Shafei suggests, funders have a responsibility to adapt to the digital needs of the civil society organizations (CSOs) they support, especially those operating in hostile environments. At the same time, CSOs themselves have a responsibility to protect the physical and digital safety of their constituents.

Access Now’s Digital Security Helpline is focused on protecting the digital wellbeing of CSOs, activists, and human rights defenders. It offers 24-hour rapid-response assistance, working with individuals and CSOs around the world to respond to emergencies and to help them improve their digital security practices and stay safe online.

The Helpline accounts for a wide range of circumstances and contexts. One CSO may already have robust technical resources and capabilities but lack digital security expertise, while another may have established partners in the region but lack the resources and funding to meet its digital security needs.

Now more than ever, CSOs and activists around the world are coming to the Helpline with content moderation cases and asking Access Now for support. For example, when an activist is wrongfully banned or censored for posting documentation of human rights abuses to a platform, the Helpline can, in limited circumstances, assist in getting their content restored.

This also includes cases where activists are publicly disparaged online by oppressive governments or other adversaries using revenge porn, verbal harassment, or other content intended to harm them and their work. In these cases, the CSO or an individual activist asks the Helpline to help remove content that moderators have overlooked and that poses a risk to the individual’s safety, such as doxxing, death threats, or calls for violence.

FURTHER READING


Access Now, on its approach to content-related cases: “We take our commitment to keeping at-risk users safe and defending their human rights online very seriously.”

In 2016, at least 119 cases involving content moderation were reported to the Helpline. In 2018, there were 237, and by the end of 2019, 307. Today, content moderation accounts for nearly 20% of all the Helpline’s cases.

Access Now has responded to this increase in content moderation cases by revisiting the Helpline’s mandate. Cases of online harassment, for example, require the Helpline to distinguish between content that is merely offensive and content that presents a more immediate threat to the client’s digital and/or physical safety. Often, even with full knowledge of the cultural and political context, these posts lie in a gray area. It isn’t always clear whether a request fits within Access Now’s mandate of protecting digital security; content moderation as a digital security problem isn’t as cut and dried as phishing, for example.

As the Helpline grapples with the changing technical needs of CSOs in hostile environments, questions of capacity, credibility, and ethics hang in the balance. Questions like “should we be the decision makers?” mirror the tough decisions that platforms themselves reckon with in their moderation efforts. Some platforms shirk their responsibility, leaving Access Now as a first responder. Unsurprisingly, as these cases multiply, the Helpline’s capacity is increasingly strained. As the first member of the Forum on Incident Response and Security Teams (FIRST) focused on the digital security of civil society, Access Now bears a massive global responsibility for protecting civil society’s digital rights.

Luckily, other organizations are stepping in to help. NGOs like Syrian Archive are making space in their mandates to contribute. While Access Now focuses on security, Syrian Archive works specifically to preserve sensitive content proactively, creating backup copies of material that often provides visual evidence of human rights violations. This matters in cases where moderation algorithms remove content essential to the work of civil society and human rights advocates operating in hostile environments.

As growing activity around content moderation brings new challenges and insights, it’s clear that a “technical helpline” alone won’t cut it. But expanding the responsibility of Access Now’s Helpline to encompass more than digital security isn’t the answer either. The Helpline is successful in part because its mandate provides clear, narrow guidance to both Access Now and its clients. Practical grassroots solutions with focused missions, like the Helpline and Syrian Archive, are essential to confronting the dangerous and diverse challenges CSOs face online. We just need more of them.