
Introducing a New Cyber Sleuthing Manual for Students

Four years ago, UC Berkeley did something that no other university had done: It launched an on-campus investigations unit for students to scour the Internet for public evidence of genocide and other human rights abuses.

Alexa Koenig, the executive director of Berkeley’s Human Rights Center and one of the masterminds behind the initiative, was chasing a unique opportunity.

“A lot of nonprofits engaged in human rights work don’t have the cash or human resources to comb through, for example, the 6,000 Tweets posted every minute and 500 YouTube videos uploaded an hour,” says Koenig. Academia, she says, teems with students who have the technical know-how, language skills, and passion for social justice to excel at the work. Interest was so high when Koenig co-founded the Human Rights Center’s Investigations Lab in 2016 that it started with 42 student volunteers—four times the projected number.

Digital Impact, she says, would play a key role in what would unfold.

After intensive training, the Berkeley students began to verify videos from the Syrian conflict—using open source investigation strategies to find and corroborate videos of attacks on civilians.


Before long, they uncovered a video of what appeared to be a chemical weapons attack in Syria. To verify its authenticity, they geolocated the apartment building from which the attack was recorded. This created an ethical dilemma. To be effective, online investigative work needs to be transparent; in this case, disclosing the spot where the footage was recorded could put the filmmaker in harm’s way.

In a separate case, students discovered a spreadsheet containing names and other identifying information about survivors of a conflict. It was clear the list’s creator did not intend, or know, that the document could be found online. Could they use it? Should they alert the spreadsheet’s owner?

Awareness rose, too, of the risk to students—even from the perceived safety of their Berkeley desks. If, for instance, an international student took on projects involving wrongdoing committed in their home country, would they or their families face retaliation? And given the graphic nature of the evidence students uncovered, what steps should the lab take to support their mental health?

Finding answers to these and other questions wasn’t easy, says Koenig. A lot of organizations and lone gumshoes were also digging through Facebook posts and YouTube videos for information to support their various causes. However, there wasn’t a manual for how to do it well, let alone consensus on how to address the ethical challenges or safety concerns.

Berkeley’s Investigations Lab had tapped experts from around the world to help their team develop skills for conducting these types of digital human rights investigations. But the Lab still lacked critical support on several thorny questions, including how to maximize physical, digital, and psychosocial security for their teams, their partners, and those affiliated with online content.

“Digital Impact was critical in helping us think through the physical, digital, and psychosocial security of our teams.”

Koenig and her colleagues set out to fill that gap. First, they turned to Digital Impact. With a $54,000 grant in 2018, they hired outside experts who helped them establish security and assessment guidelines for their own projects. That framework was then used to help inform an even more ambitious undertaking: the creation of what they hope will become a global standard for documenting online evidence of human rights violations, called the Berkeley Protocol on Digital Open Source Investigations.

Released this month, the 120-page Berkeley Protocol outlines basic principles for how to do this work and do it well. For example, the protocol offers guidelines for collecting and preserving online videos and photos, including the importance of capturing public comments. It explains the significance of understanding the motives behind a post and accounting for potential biases—including those of the investigator—and lists steps to help counter the psychosocial impacts of viewing graphic violence.

“Digital Impact was critical in helping us think through the thorny questions around this work and the physical, digital, and psychosocial security of our teams and of third parties who are on the ground in conflict zones,” she says. “These insights are helping to inform what is now a global population of open-source investigators.”

The Need for Global Standards

For Koenig, the path to the Berkeley Protocol started in 2012, the 10th anniversary of the formation of the International Criminal Court in The Hague. Looking back on the Court’s track record, Koenig and her team noticed that judges had dismissed high-profile cases at very early stages of prosecution for lack of corroborating evidence.

In response, the Human Rights Center began hosting a series of workshops with prosecutors who had worked on atrocity crimes, technologists, and officials from NGOs like Amnesty International and Human Rights Watch. “At that time, people were just beginning to mobilize across social media,” she says. When Koenig tried in 2014 to get social media companies to provide evidence directly to the Court, they demurred. “We realized then that we could harness on our own this huge quantity of publicly available digital content for justice and accountability purposes.”

The United Nations, The New York Times, Twitter, and TikTok are among the major players investing in internal digital forensics units.

The epiphany appears to have paid off. Since launching in 2016, Berkeley’s Investigations Lab has helped to uncover and authenticate videos of chemical weapons attacks and other atrocities in Syria. It has helped document last year’s Khartoum massacre in Sudan and this summer’s violence against Black Lives Matter protesters. In 2019, Koenig’s team worked with investigative journalist Steve Stecklow to unearth hate-filled Facebook posts targeting the Rohingya and other ethnic minorities in Myanmar. The Reuters story was part of a package that won a Pulitzer Prize for its exposé of the Rohingya genocide.

As the lab grows (in any given semester the lab has 50 to 75 students), so does worldwide interest in open source investigations work. According to Koenig, the United Nations, The New York Times, The Washington Post, Facebook, Twitter, and TikTok are among the major players investing in internal digital forensics units.

To Koenig, this broad recognition of the promise of digital content underscores both the urgency and importance of universal standards like the Berkeley Protocol.

This is especially critical, she says, for criminal cases—in part because, as promising as satellite images, drone footage, or smartphone videos may seem for capturing perpetrators of human rights violations, courts are only beginning to determine the admissibility of digital evidence.

Koenig is hopeful. She says the International Criminal Court has accepted social media content as evidence in a handful of cases. One of those involved issuing an arrest warrant for a Libyan warlord accused of murder based almost entirely on seven Facebook posts that appeared to show him participating in and ordering the summary executions of more than 30 people.

Koenig understands that courts will always prefer evidence to come directly from Facebook or Twitter. “But in the international system of law,” she says, “there is dawning recognition that digital content that is collected by outside investigators, including students like ours, is critical to holding perpetrators of human rights abuses accountable.”

Berkeley was the first to marshal a student team of online detectives. Today, Koenig estimates that nearly a dozen other universities around the world have built similar units. “Thanks in large part to Stanford’s support, there is now a global network of people who have the skills to do this work and are helping others to do it, too. The seeds have been planted.”

From 2016 to 2018, Digital Impact awarded grants to research teams looking to advance the safe, equitable, and effective use of digital resources for social good. With support from the Bill & Melinda Gates Foundation, the Digital Impact Grants program awarded more than half a million dollars over three years.