

Legitimizing True Safety

Virtual Roundtables

The first in a series of discussions on race, tech, and civil society examines police surveillance of Black and Brown communities in Detroit.



Video & Transcript: Legitimizing True Safety

Part of the Race, Tech & Civil Society Event Series

In June, the Digital Civil Society Lab, the Center for Comparative Studies in Race & Ethnicity, and Digital Impact launched a series of conversations about Race, Tech, and Civil Society. What does real safety mean in a time of crisis? How do mass surveillance technologies impact Black and Brown communities? How can we prevent the criminalization of poor communities and instead advance a human-centered vision of safety that ensures thriving neighborhoods for all?

Speakers

Tawana Petty, Non-Resident Fellow at the Digital Civil Society Lab
Eric Williams, Senior Staff Attorney at the Detroit Justice Center
Clare Garvie, Senior Associate at the Center on Privacy & Technology at Georgetown Law
Cierra Robson, Doctoral Candidate at Harvard University

Background

In 2016, the Detroit Police Department rolled out a real-time surveillance program called Project Green Light as a public-private partnership at eight gas stations in Detroit. The program was launched as a “real-time crime fighting” tool with camera connections monitored around the clock by police headquarters. Flashing green lights indicating the presence of these cameras are a constant reminder of this surveillance. Although the project claims its purpose is to improve neighborhood safety, after the addition of nearly 700 cameras and mobile devices and the introduction of facial recognition technology, there is no evidence the program has had that impact. Since COVID-19 hit the city, a program that is supposed to “deter, identify, and solve crime” has been turned against Detroiters accused of violating social distancing orders. The city has increased police patrols in parks, regularly conducts flyovers, and relies on the Project Green Light camera system to monitor compliance. Detroit police have issued over 1,700 tickets, at up to $1,000 per ticket, to residents already struggling with poverty wages in a city that is 80% Black or African American.

Transcript

00:01:40 HEATHER NOELLE ROBINSON: Welcome to everyone. I’m going to start us with a couple of bits of business. This is Legitimizing True Safety, which is the first in a series of conversations on race, technology and civil society. My name is Heather Robinson. I’m the program manager for the Digital Civil Society Lab at Stanford. Before we start, I want to cover a couple of items of business. Just do some framing for our conversation today. And the first thing is to give you a note on our content. I really acknowledge the moment that we’re in. This is an absolutely profound time of grief and trauma for Black and African-American people. It’s a time when people are struggling to pursue liberation, and really pursue a vision of what an equitable society is, and doing this in the middle of an absolutely unprecedented health crisis. The totality of this is exhausting and scary and we want to acknowledge that.

As an academic center, we are completely committed to a rigorous and candid conversation. And our topic today is even more important given what’s happening. But given this, if joining our conversation on police surveillance of Black and Brown communities is just too much right now, we want to let you know that we’ll be posting a video of this conversation so you can watch it at another time. Or you’re simply welcome to join us at another event. We’re not going to be sharing any explicit or violent images related to this topic. Our event today is a conversation.

Next, I want to cover our community standards. Stanford PACS, along with our partners CCSRE and Digital Impact, is really committed to providing a rigorous academic discussion and a welcoming environment for all speakers and participants. So, we will not tolerate any individuals who attempt to disrupt the conversation, use inappropriate language or harass any of the other participants and speakers. And we reserve the right to remove any individuals from the event for that. And as for participation, we really, really welcome you to participate. Take a look at your screen; you should see a Q&A tool at the bottom of your screen where you can submit questions to the panel. And our communications teams are going to be live tweeting from our accounts @StanfordPACS, @StanfordCCSRE, @DigCivSoc, @dgtlimpact and you can tweet your own comments with #RaceTechCS.

Now, as we move forward into our content on our conversation, I want to pause and recognize that Stanford University stands on the Indigenous land of the Muwekma Ohlone people. Our panelists, our attendees are joining us from across the United States and across the world. And this includes the unceded lands of many peoples. So even though we are assembling virtually, we acknowledge that we’ve benefited from inhabiting this land. We also acknowledge the ways that our communities are built on a legacy of slavery. So, let us all take a moment to pay respects to these ancestors and their descendants in the present and the future, and to the land that we all stand on, with a moment of silence.

We have organized this series, “Race, Tech & Civil Society,” as a joint effort between the Center for Comparative Studies in Race & Ethnicity, the Digital Civil Society Lab and Digital Impact. We’re bringing together practitioners, scholars and other experts in today’s conversations because the intersection of these perspectives is really critical to living in our society. I want to give you a little bit of information about the Digital Civil Society Lab and set the framing for our conversation today. Sorry. I just got a message about fixing my screen. One moment, I’m going to stop sharing my screen. So yeah, a little bit about the Digital Civil Society Lab. The Digital Civil Society Lab seeks to understand, inform and protect civil society in a digitally dependent world.

Our goal is to foster a thriving and independent digital civil society that’s rooted in democratic commitments to freedom of association and assembly, freedom of speech and privacy. But, you know, our country has a really complicated and racialized and, right now, completely fragile commitment to these freedoms. Our current moment is really reminding us that these freedoms can be easily threatened and infringed, and it really leaves us asking what makes us safe. So, our speaker today, Tawana Petty, who works with the Detroit Community Technology Project, is an incredible poet and also a fellow with us at the lab. She is going to show us how the Project Green Light program in Detroit is a constant reminder of police presence and potential threat. It’s an example of the criminalization of Black and Brown communities. And we’re seeing that erupt so violently in our world. But we’ve also seen examples in the past week, I think, of where collective action can succeed. And this is in peaceful gatherings of protest, sheriffs and police chiefs laying down arms and actively listening to community members. And the message here is that we’re stronger together. So, for the digital part of the Digital Civil Society Lab, whether you’re looking at the threats to our freedoms or the protests against those threats, our digital dependence is really pervasive in all of these things. It powers a police department’s ability to surveil a community, but it also powers our ability to collectively organize and to watch the watchers.

Today we’ll be hearing from Tawana Petty, who I just mentioned. And joining her in conversation are Clare Garvie from the Center on Privacy & Technology at Georgetown Law, Cierra Robson, who is a PhD candidate in sociology and social policy at Harvard University, and Eric Williams from the Detroit Justice Center. So, right now, I want to introduce my colleague, Jennifer DeVere Brody. She’s going to introduce the Center for Comparative Studies in Race & Ethnicity, which is co-hosting this event.

00:09:34 JENNIFER DEVERE BRODY: Thank you so much, Heather. And welcome to all our esteemed guests, and thank you for your work. I’m here just to say that we’re in partnership with our Stanford colleagues, particularly the Digital Civil Society Lab. We’ve been very fortunate to have a grant with them, where we host practitioner fellows. We have our first cohort this year, all doing really significant work on questions of racism, race, and technology, looking at how this new digital world can often replicate inequities. And, you know, especially with today’s topic around surveillance and trying to legitimize true safety: just because we have these technologies, we have to think about their impacts. It gives me great pleasure to be here with you all. I wanted to say also that our students are beneficiaries of the work we’re doing. They’re working with some of the practitioner fellows, and we also started a faculty research network on these topics because of the ways in which, exactly as Heather said, and we all know so well given the past few weeks especially.

You know, we are under siege. We are not just surveilled but, you know, imperiled, and, you know, it’s hard sometimes to have these kinds of conversations in the midst of the protests when we’re all, you know, mentally exhausted and in other ways, and yet I think this topic is really important, particularly in these kinds of environments. So, I really am pleased to hear from you all, and thanks to everyone in the audience for joining us. And just to finish up, the center at Stanford is the one place, along with maybe African-American Studies, where we try to think about questions of racial justice and equity, and to try to make the world a better place. So, we are an academic program with Native American Studies, Asian American Studies, Chicano-Latino Studies and Jewish Studies, but also we’re a center that does this kind of research and sponsors practitioners like yourselves to work with our students. So, thank you again.

00:12:09 HEATHER NOELLE ROBINSON: Thank you so much, Jennifer. And now, I want to introduce my colleague, Chris, who’s the editor of Digital Impact, to just give us a very brief introduction to Digital Impact.

00:12:20 CHRIS DELATORRE: Good morning, everyone, and welcome. Thanks for joining us. I’m Chris Delatorre, editor at Digital Impact. First, I’m sending support and solidarity to our brothers and sisters in the sector and on the ground during this challenging time. Digital Impact is a program of the Digital Civil Society Lab at Stanford PACS. Digital Impact is focused on the safe, ethical and effective use of digital resources in the social sector. We aim to make complex topics more accessible for data practitioners, policymakers and community leaders by hosting virtual conversations and sharing opinion pieces, news roundups, toolkit exercises and more. All of this in order to help bridge gaps, improve policies, collaborate in ways that reduce traditional points of friction and ensure greater and more sustainable impact long term. For more about us, visit digitalimpact.io. I invite you to contribute to Digital Impact. Tune in to our 4Q4 Podcast and subscribe at digitalimpact.io/subscribe. I’m looking forward to this conversation. We’re very lucky to have these speakers today. Thanks very much. Back to you, Heather.

00:13:14 HEATHER NOELLE ROBINSON: Thanks. So Tawana, please get us started. Now is your time. Tell us about the work that you’re pursuing, Legitimizing True Safety and Project Green Light.

“It’s not just a matter of wanting to push back against cameras… It’s about systemically resisting systems that make certain people have to be tracked…traced…monitored…criminalized.”

00:13:54 TAWANA PETTY: Thank you, Heather. Thanks, everyone. So, you know, I’m joining you all with a heavy, heavy, heavy heart this morning, standing on the shoulders of many ancestors before me who have carried the brunt of racial injustice in this country. I’m standing here as someone who directs a data justice program at Detroit Community Technology Project with a committed legacy of work in digital justice, digital inclusion, data justice, and equity. And I’m standing here with you as a Black mother of a Black son, who had his first racial incident at age five, when he came home from school and asked me if his skin was dirty. Because it was the first time he had been in a school that was not all Black or predominantly Black. And so, I’m coming here with many, many years of internalized pain. And I say internalized because even though I’ve been on the ground doing a lot of work for a few decades now, you can never really get rid of those scars. And whenever they do scab over, they get peeled off again. And so, I just have to enter with honesty about the fragility (for me) of this moment, but also knowing that it’s imperative to articulate this moment, it’s imperative to be in dialogue; that many ancestors before me used their platforms to make sure that the masses understood not only racial injustice, but that they understood alternative visions to those injustices, and that they brought all walks of life together to struggle against white supremacy and systems of surveillance.

And so, Project Green Light, a program in Detroit, comes off a very long legacy of surveilling Black bodies. In the 1800s, there were the Lantern Laws, right, where if you were Black or Brown, and you were found not in the presence of a white person, you had to have a lit lantern in front of your face, so that you could be surveilled, so that folks could know that you weren’t a threat, so that they could see your face. And so, when I look at a city like Detroit that is 80% Black, and I walk out of my door and I see flashing green lights, pervasive scarlet letters, if you will, in front of homes in neighborhoods that are deemed unsafe, and surveillance cameras that are connected to 24-hour police monitoring, it unearths a lot of trauma from centuries of ancestral pain. And so, it’s not just a matter of wanting to push back against cameras, or even slowing down technology. It’s about systemically resisting systems that make certain people have to be tracked, have to be traced, have to be monitored, have to be criminalized. It’s as if the imagination that exists in so many other neighborhoods and areas that don’t have Black and Brown people is gone. It’s as if city government just can’t think of any other way to reduce crime, quality-of-life crime, I might add, other than to fill up jails, and to trace and track and monitor.

And so, with Legitimizing True Safety, it is my hope that through not only my fellowship, but the work on the ground and the collaboration with folks at Stanford, the Detroit Justice Center, Georgetown’s Center on Privacy & Technology, Digital Impact, Harvard with Cierra, and so many other places across this world, we start to not only reimagine what it means to create safety in our communities, but that we also stop thinking about Black bodies as something to be tracked, targeted, surveilled, monitored, and criminalized. And so, I’m honored to be in dialogue with you all today. I think there’s no better time to be in dialogue. And I hope that what comes out of this discussion is a better understanding, not only for our audience but for ourselves, and that we can tease out some solutions at some point, or at least a scaffold towards solutions throughout this discussion. And so, I think I’ll park it there for now. But I’ll say that there are 700 Project Green Light cameras in the city of Detroit right now, as it stands, and that is up from 8 or 9 just two years ago. And it has not created safety. I know that Eric will tell you more about that. But if community members are saying that they want to feel safe, and the very things that are being created have not proven to create safety for them, then at the very least, we have to interrogate that. And so, I want to send love to Tristan Taylor, a community organizer in Detroit, who has been charged with a felony for peacefully — and I’m not even going to say peacefully, I’m going to say for exercising legitimate civil disobedience where no one was harmed except protesters. He is being charged with inciting a riot because he refused to go home at a curfew that should not have been imposed in the first place. And so, I want to send love to him. I will try to go to my Twitter page, and I want you to retweet out to have phone calls flood the Detroit Detention Center, Councilwoman Mary Sheffield and anyone else. What needs to be watched is the system, not individuals who are practicing their legitimate civil liberties and civil disobedience. Thank you.

00:20:53 HEATHER NOELLE ROBINSON: Thanks, Tawana. I want to cover some basics for anyone who is not as familiar with the Project Green Light program. So, this is a live, real-time surveillance program connecting cameras in different locations throughout Detroit to a centralized surveillance unit at the Detroit Police Department. Clare, could you give us some more details about the Project Green Light program and some of the basics of the facial surveillance system that is enabled through the program?

“Face recognition is being implemented into a system where people of color are vastly disproportionately over-surveilled. As a consequence, it will be used disproportionately on communities of color.”

00:21:35 CLARE GARVIE: Of course. First, thank you so much for the invitation to be on this panel. I’m so honored and humbled to be speaking alongside Tawana, Eric, and Cierra today. My expertise is in face recognition technology, and I came across Project Green Light through that. So back in 2017, the Detroit Police Department purchased, through a million-dollar contract, face recognition capability, including off of Project Green Light cameras. So, at a very, very basic level, face recognition is the ability to take a photo of an unknown individual, compare it to a database of known individuals and seek to identify that individual. What the Detroit Police Department purchased was the ability to do that on static images in investigations, to do that on mobile devices, and also to do it on upwards of a hundred concurrent cameras through the Project Green Light program that, as Tawana mentioned, now has 700 cameras. Back when I was researching this, it was less than 500. When it first started, and when it was first conceived, Project Green Light was a public-private partnership that was mostly focused on gas stations and liquor stores, businesses open late at night, but it has very, very rapidly expanded to include churches and schools and public housing and private landlord-owned housing, apartment buildings and clinics and standalone pharmacies.
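[Editor’s note: To make the one-to-many search Clare describes concrete, here is a minimal illustrative sketch in Python. It is not any vendor’s actual system; the gallery, names, and threshold below are all hypothetical. Real systems reduce each face to a numeric “embedding” and rank database entries by similarity.]

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.6):
    """One-to-many search: compare an unknown ("probe") embedding against
    a database ("gallery") of known identities and return candidate
    matches ranked by similarity score."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    # A real system returns ranked *candidates*, not a certain match;
    # anything above the threshold is only an investigative lead.
    return [(name, s) for name, s in scores if s >= threshold]

# Hypothetical usage; embeddings would come from a face-encoder model.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["person_42"] + rng.normal(scale=0.1, size=128)
print(identify(probe, gallery)[:3])  # top-ranked candidates, if any
```

Even the top-ranked candidate here is only a best guess against whoever happens to be in the database, which is why the accuracy and database-composition issues Clare raises next matter so much.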

Face recognition was never conceived of as part of it. So, the private entities, the schools or churches who were part of Project Green Light, were never told that face recognition was going to be a part of it. We actually called and asked a handful of business owners and church leaders about this, and they said had they known that face recognition would be a part of Project Green Light, they would have thought twice about joining the program. So, what are we talking about here? The risks of face recognition. I believe Eric is going to go into some of the risks of Project Green Light more generally. But very, very briefly, face recognition is a form of identification that can be used as a form of surveillance. It can be used on public protests. It can be used to track somebody’s location as they move about, as they go about their day, especially if the cameras are connected to buildings such as churches or women’s health clinics. And face recognition risks exacerbating bias. It is a tool that is being implemented into a system where people of color are vastly disproportionately over-surveilled. And as a consequence, face recognition will be used disproportionately on communities of color. Pair that with the fact that these systems, as in Detroit, operate predominantly on mugshot databases.

We live in a country where we vastly over-arrest and over-incarcerate, particularly young Black men. Meaning that the people who can be identified, and the people who are going to be identified or misidentified using face recognition, are disproportionately going to be people of color. And then add to that a third element, which is the fact that studies continue to show that face recognition technology performs differently, by which we mean it is more or less accurate, depending on the race, sex and age of the person being searched. This means that these systems may perform the least well on, and may risk misidentifying more of, Black people, precisely the people on whom the technology is disproportionately going to be used. So how does this play into the idea of public safety? Well, these tools are being purchased across the country under the guise of public safety. That is the narrative under which they are purchased. What does it mean to have public safety if you’ve not informed the public, and you’ve not asked the public whether they want this technology to be used on them, and what safety means to them? Face recognition runs on over half of all American adults. Over half of us are enrolled in face recognition databases that are used in criminal investigations. Most of us were never asked about that.

We estimate conservatively that over a quarter of all law enforcement agencies in the country (it’s probably a lot more) have access to face recognition systems. The public had no vote, had no say in whether or not to purchase that. And that’s the case with Detroit. Back in 2017, when this purchase originally took place, there was no policy in place. There was no public engagement, despite the fact that the police were quick to say this is not some super secretive technology. And yet they were not as quick to say what the technology was about and how it could be used. In fact, there were no rules around how it could be used or how it couldn’t be used.

The first policy they put in place in early 2019 said DPD may connect the face recognition system to any interface that performs live video, including cameras, drone footage, and body-worn cameras, in essence enabling them to use face recognition on any video input that they had at their disposal. Thanks to the work of Tawana and Eric and many other folks on the ground, they have now had to change that policy. There is a lot more transparency, and the policy is far more restricted. However, I would argue that it is still not a public safety mechanism, because ultimately the people on whom the technology is being used have still not been the ones who made the decision for face recognition to be used on them. And what does that look like? What does it look like when the public gets to decide? Well, we actually have a few examples in this country of that. San Francisco. The San Francisco public had the opportunity to weigh the risks and benefits of face recognition. They decided the risks outweighed the benefits. They banned it. Somerville, Massachusetts; Cambridge, Massachusetts; other places in California. When the public has a chance to determine whether the safety element of face recognition outweighs the risks, they decide to ban the technology. But that is a privilege that should not just be reserved for liberal coastal cities, as it has been thus far. It is a democratic right of everybody in the United States to be part of the conversation about how they as a community are policed. And it is nowhere more important than in Detroit, in Baltimore, and in other cities around this country that are disproportionately surveilled or disproportionately policed because of the racial makeup of the city.

So, the conclusion there is that public safety doesn’t exist if the public doesn’t have a voice in determining what that safety looks like. So, I would urge, with face recognition, with surveillance, with any other new technology: if we want to determine what true safety is, it looks like a system that has been democratically decided upon by the communities on which the technology will be used. And then one more quick point, because we are in a moment where public protest is very alive and well, and I wanted to raise up a quote from the Supreme Court in 1995, in the case McIntyre v. Ohio Elections Commission. And that is the quote that says, “Anonymity is a shield from the tyranny of the majority. It thus exemplifies the purpose behind the Bill of Rights, and of the First Amendment in particular: to protect unpopular individuals from retaliation … at the hand of an intolerant society.” Face recognition is a tool of identification. But to put it another way, face recognition is a tool that can strip that shield of anonymity from those who are out in the streets today and yesterday and the day before, protesting the intolerances of systemic racism in this country. Face recognition essentially enables anybody who’s caught on camera in these protests around the country to be identified or misidentified. And because we lack transparency and rules around its use, we should assume that it will be used to identify peaceful protesters, or any protesters.

This is not a call to turn off cell phones, to turn off video and live streaming, and to stop posting videos. In fact, I come from a transitional justice background, where documenting human rights abuses is one of the most critical elements in ensuring accountability. And we’ve already seen that. We’ve seen six Atlanta police officers charged with excessive force thanks to videos of the incidents. A Florida officer was suspended for similar reasons. A San Jose officer is now under investigation for misconduct over the last few days. But what I guess I’m trying to say is that in order to ensure safety in a world where face recognition is a reality, we need to protect the anonymity of protesters. So please go out, take video, but look at the tools available; witness.org has really, really great tools and resources on how to successfully document what’s happening without removing that shield of safety, of anonymity, from protesters. So, I’m going to leave it there, and stay safe out there, everybody.

00:32:23 HEATHER NOELLE ROBINSON: Thank you, Clare. I want us — all of us — to keep this question that you’ve raised up in our minds: what does it really mean to have public safety if the public hasn’t been informed of, or involved in, that decision? And then also to remember these examples, as you were saying, of what has happened when the public has had input into deciding what that safety means. So, Eric, I want to bring you into the conversation. Can you tell us more about how the Project Green Light surveillance program has affected communities in Detroit? And tell us a little bit about your work at the Detroit Justice Center.

“Ask anyone familiar with Detroit, how would you spend $20 million to make the public safer? I guarantee you the program that now exists would not emerge from anyone’s mouth.”

00:33:10 ERIC WILLIAMS: Sure. So, the Detroit Justice Center is an organization whose name is actually self-explanatory. Our goal is creating a more just city, all right. And that’s not even as straightforward as it seems, because creating justice requires sort of reevaluating what we think of as justice, right? It requires reevaluating the systems that are already in place. And I think one of the most important things to do when it comes to this kind of thing is to remember that law enforcement isn’t the same thing as public safety. It’s not, right? They are just completely different things. And that’s where you have to begin if you’re going to start changing how we approach these things. And you also have to realize that not only are law enforcement and public safety not synonymous; when you ignore that and you have a system like Project Green Light, what you end up doing is doubling down on the inequities that already exist in our system. So, Black folks are more likely to be the target of policing, and the placement of these cameras for Project Green Light is very much along the same sort of lines we’ve already seen. Black people are more likely to be arrested for crimes. With young white people and Black people doing the same thing, Black people are more likely to be arrested, more likely to be overcharged; they are more likely to be held prior to actual trial. They are more likely to be convicted, and they are more likely to receive harsher sentences.

So, I mean, you just have all these problems, and policing is where it all starts, right? And as I said, policing, the whole law enforcement part, is not synonymous with public safety. Once you’ve gotten that mindset, then you can really start to look at how excluding the public from this is really problematic. This — facial recognition technology, surveillance in general — is something that is being imposed on these communities. So real talk, Black folks didn’t play any role in creating this technology, generally speaking, if you look at the folks who were actually behind a lot of these technological developments. Black faces were in fact excluded from the training databases for facial recognition technology, which is one of the reasons why there are so many errors when it comes to dealing with people with darker skin, right? So, when you have that happening, you have people not playing a role in how it’s deployed. Facial recognition technology had been up and running, as Clare mentioned, for almost two years before there was any public input at all; they operated with no policies and procedures in place for that long. And when they finally did come to the Board of Police Commissioners, which is the alleged oversight body, the policies and procedures were a total of three pages. It was the actual protest of people over that, and a lot of hard work by people like Tawana, that got them to extend it. So, the public played no role in developing it and deploying it. It’s been very difficult to get oversight because there’s absolutely no transparency. The public has no actual role in where these cameras are being used, and there’s no oversight or assessment. So, for example, the city of Detroit over the last three and a half years has spent in excess of $20 million of public money on Project Green Light-related programs. That includes everything from the building to the officers, the cameras, the software; it’s over $20 million.

If you were to ask anyone who’s familiar with Detroit, how would you spend $20 million to make the public safer? I guarantee you the program that now exists would not emerge from anyone’s mouth, right? And given how limited the resources are for a lot of places, particularly urban areas like Detroit, every penny that you spend on something like this is a penny you didn’t spend on something that’s been shown to actually create public safety, right? There’s no oversight; the institutions charged with providing oversight are very weak. The Detroit Board of Police Commissioners is so problematic that it’s actually kind of frightening to watch Commissioners sneer at the public when they come to make complaints about the police, and then spend the entire meeting lauding our police chief and their efforts. And I say this as someone whose father was a cop for 20 years. My mother was a state trooper. So I have no sort of inherent animus against the police. But as recent events have shown, culture and supervision play a large role in how the community interacts with the police department. So, that’s where you have to begin. If you really look at how detached the public was from everything associated with implementing this, you can really start to see the problems that Project Green Light creates.

00:38:56 HEATHER NOELLE ROBINSON: Yeah. So as Eric, Tawana, and Clare have all mentioned, Project Green Light was started as this public-private partnership in which, you know, businesses within Detroit signed up to have these cameras, and green lights indicating those cameras, installed at their businesses. And Cierra, I know that public-private partnerships are a special area of research for you. So, I’m wondering if you can tell us about other examples of these kinds of public safety-driven (I’m using air quotes there) public-private partnerships, and what’s happened with other similar programs across the country?

“It’s not police forces hiring people to create these algorithms. We’re seeing more and more of the ways in which the government is relying on multinational corporations for things like surveillance.”

00:39:39 CIERRA ROBSON: Absolutely. Thank you so much for having me here. I’m so grateful to be in conversation with you all. So, Project Green Light is by no means the only surveillance system that’s happening at a citywide level. My own research focuses on Oakland, California, where a similar surveillance center was put in place, and there are numerous others in New York, for example, and in Seattle. They’re all over the country and, quite frankly, all around the world; I mean, I can think of examples in China right now. So, we have to recognize that this is not a singular incident. It’s something that is happening worldwide. But to the point of public-private collaboration, I think it’s really important to recognize that the states are not creating these technologies. It’s not police forces who are hiring people to create these algorithms. Instead, they’re contracted out to largely Silicon Valley-based firms. Companies like Apple, Microsoft, Google: these multinational corporations are those that are creating the technologies that will surveil people. In Oakland, for example, almost $12 million was spent creating the surveillance center, and all that money was given to a military contractor. And these were federal funds, funds from taxpayer dollars. Similarly, in New York, there is a partnership with Microsoft: the NYPD and Microsoft created the surveillance center. And actually, in that case, it’s very interesting. There’s a bit of a coordination here where, if Microsoft sells the surveillance center technology to any other city, New York actually receives 30% of the profit. So, this is a fundamentally different kind of partnership than the things that we’ve seen before. And we’re seeing more and more of the ways in which the government is relying on multinational corporations for things like surveillance. In the age of COVID, for example, companies like Facebook and Google and Apple are taking user data and transitioning it to help with contact tracing. And so, then the question becomes, for me: What is profitable about surveilling people in this way? And what is at stake when we allow these things to be profitable? And also, whose safety is at the center of the existence of these technologies? Is it the companies that the government keeps paying millions of dollars to? Or is it the people who the companies, or the state, are supposed to be protecting?

00:42:20 HEATHER NOELLE ROBINSON: Thank you so much, Cierra. I want to open up the conversation to this question that Cierra just brought up, you know, whose safety is really the priority? But then also, what are the ways that we can build a real definition of public safety that is actually effective? So Tawana or Eric, do you want to respond to those?

“There’s no evidence that [surveillance programs] create safety even as defined by the people implementing these technologies.”

00:42:51 ERIC WILLIAMS: Well, one of the things that I would say is that when we start thinking about the safety that comes from these surveillance programs, even if we analyze them on their own terms, there’s actually no evidence that they create safety, even as defined by the people who are implementing these technologies, right? So, Project Green Light is a perfect example. And just briefly, the program consists of more than just live streaming. It also consists of the police coming through and putting additional lights on the place so that, you know, the cameras can actually see something. In placing the cameras there, the police actually increase their presence at Green Light partners, and then the Green Light partners get something called “911 Prioritization.” So as an initial matter, the idea that you can pay an extra fee and get 911 prioritization from the Detroit Police Department should strike everyone as problematic. But in addition, we know that improved lighting and increased police presence can, at least, deter crime. So, if you were going to assess improved safety, you would have to look at what the crime rate was at Green Light locations, compare it to non-Green Light locations, and then adjust for the improved lighting and increased police presence. And then after that, you’d be able to see whether or not the cameras actually have any impact whatsoever. DPD hasn’t done that, right? So literally, we’ve spent $20 million on a program for which, using its own criteria, there’s no evidence that it actually works, particularly when you look at Detroit crime statistics. So, I mean, it’s clear that creating safety, even as it’s posited by surveillance advocates, isn’t something that Project Green Light actually does. And then you have to take a look at, as you said, who are you trying to keep safe? And that’s a very interesting question when you have a program whose locations aren’t determined by, you know, sort of, oh, we’ve done an analysis and this is where crime is, but through a public-private partnership. That should raise some real questions right there. And I know, Tawana, this is something I’ve heard you speak about quite a lot.
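[Editor’s note: A minimal sketch, in Python with entirely invented data and column names, of the adjusted comparison Eric describes: whether Green Light locations differ in crime once the improved lighting and increased police presence bundled with the program are controlled for.]

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500  # hypothetical locations, some Green Light partners, some not
df = pd.DataFrame({"green_light": rng.integers(0, 2, n)})  # 1 = has a camera

# Partners also get better lighting and more patrols, which confounds
# any naive Green Light vs. non-Green Light comparison.
df["lighting"] = 0.8 * df["green_light"] + rng.normal(0, 0.3, n)
df["patrols"] = 1.2 * df["green_light"] + rng.normal(0, 0.5, n)

# Simulate crime driven by lighting and patrols, NOT by the camera itself.
df["crime_rate"] = 10 - 2.0 * df["lighting"] - 1.5 * df["patrols"] + rng.normal(0, 1.0, n)

naive = smf.ols("crime_rate ~ green_light", data=df).fit()
adjusted = smf.ols("crime_rate ~ green_light + lighting + patrols", data=df).fit()
print(naive.params["green_light"])     # looks like a large "camera effect"
print(adjusted.params["green_light"])  # near zero once co-interventions are controlled
```

Because the simulated data lets lighting and patrols do all the work, the naive comparison attributes the entire crime drop to the camera, while the adjusted estimate is near zero. This is exactly why an unadjusted Green Light vs. non-Green Light comparison, as Eric notes, says nothing about the cameras themselves.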

“I don’t think there’s ever really an effort to truly create safety. There’s an effort to socially control.”

00:45:36 TAWANA PETTY: Yeah, I basically — I want to always touch and agree with what Eric said. And in addition to that, say that there’s evidence all across the globe, and even in the state of Michigan, of what makes us safe, right? So, resourced neighborhoods tend to be safer. Neighborhoods that have viable grocery stores. Neighborhoods that have, you know, affordable, clean, fresh water. Neighborhoods that have schools that don’t have dilapidated books and structures that are falling apart. You know, neighborhoods where you can get food that isn’t coming out of a liquor store, hasn’t been repackaged, and isn’t expired. I mean, it seems to take so much imagination, as I indicated earlier, when it comes to Black, Brown, and Indigenous communities, and even poor, rural white communities, to figure out what creates safety in those neighborhoods. But in neighborhoods where residents feel safe, the common denominator is that those communities have resources. And so, my argument would be that there is no effort to truly create safety in Black, Brown, Indigenous, and poor white communities. Because if there was, then given the evidence showcased in all the communities around them where community members feel safe, you would replicate that model, right? You wouldn’t pour millions upon millions upon millions of dollars into militarized police forces and mass surveillance. In addition to Detroit suffering under the weight of Project Green Light, which has not reduced quality-of-life crimes and which doesn’t even get police responding in a timely fashion to the business owners who are paying extra money for Project Green Light priority response, we’re now under the brunt of the US Attorney General’s Operation Relentless Pursuit program. Just the very title of that initiative, which has now infused an additional $10 million of mass surveillance funding in collaboration with the federal government, lets you know that it is literally in pursuit. It’s not Operation Pursuit of Creating Quality of Life, or Operation Pursuit of Ensuring Resources and Empathy; it is literally titled Relentless Pursuit.

And so, I think that, you know, we just have to have a much deeper conversation about what it’s going to take, for the world that is currently witnessing the brunt of folks who are tired of experiencing anti-Black racism, for imagination to become universal when it comes to creating true safety for folks that have been “othered” for so very long. And so yes, I would say that I don’t think that there’s ever really an effort to truly create safety. There’s an effort to socially control, which Cierra was sort of mentioning. And we think about China and places like that, that regulate how you behave, how you learn, how you comply. In Detroit, we’re pretty much functioning under that control. At the protests last night, a hundred peaceful, nonresistant protesters were circled by hundreds and hundreds and hundreds of cops and tanks and other military vehicles, and arrested. And the leader of their protest was charged with inciting a riot, when no person swung, no person threw anything; all they did was stand and refuse to comply with a curfew that was imposed on them unfairly. And finally, the city has made millions of dollars off of targeting community members using Project Green Light for social distancing tickets, right? So, this is when the rest of the world is showing us beaches and people on the beach and people in restaurants and folks getting haircuts and protesting to go back to work and all these things, and Detroiters were literally targeted using that surveillance technology with thousand-dollar tickets, in a city where the median income was $29,000 a year before COVID-19. And so, there just really has to be a deep analysis around why we allow those things to happen in those communities, why there isn’t an outcry on behalf of those communities until one of us dies. And many of us are dying, and many are suffering in quiet desperation from various forms of violence that we have to speak up about.

00:51:05 HEATHER NOELLE ROBINSON: Thank you so much, Tawana. I want to throw this question also to Cierra and Clare: what are the other ways that we can create real public safety? And I’m wondering if you have seen any examples of that from other communities across the United States. Have communities in other jurisdictions expressed something that they want instead of this kind of surveillance? Cierra?

“People are very explicit about the things that they want and the things that they need. It’s just that governments aren’t really listening.”

00:51:38 CIERRA ROBSON: So, in Oakland, there were massive protests surrounding the construction of this surveillance center, pretty much echoing exactly what Tawana said. I mean, people want resources, people want schools, and people want parks. These are the things that people want and feel will keep them safe. What’s frustrating to me is that at the same time that over 500 people show up to a city council meeting to protest this in Oakland, nothing changed. I mean, people have been protesting for years, and asking very explicitly for simple things: access to education, access to places where you can walk outside. And yet in San Francisco, only a hop, skip away from Oakland, it doesn’t even take protest for there to be a citywide facial recognition ban. And so, while I recognize that things like protest are very important for, you know, advocating for true safety in these communities, I also am very much aware of the fact that what people ask for is often disregarded, and oftentimes the things that we ask for in certain communities are not given, even if they’re very explicit. So, I’m a little bit disturbed by that. And I’m looking for other places to find hope in these conversations because, as Tawana mentioned, as Eric mentioned, people are very explicit about the things that they want and the things that they need. It’s just that governments aren’t really listening, and it’s almost certainly because of where they live.

00:53:34 HEATHER NOELLE ROBINSON: Clare, do you have any thoughts to share with us on this?

“The promotion of safety of the people on whom this technology is used is not part of the equation.”

00:53:38 CLARE GARVIE: I really don’t have much to add, just to put maybe a bit of a point on another actor that we can’t forget about, and that is the companies, as Cierra was mentioning before. So, with face recognition, I would argue it’s a very supply-driven market, where oftentimes it’s not even law enforcement that’s asking for the latest and greatest technology; it’s a company that comes to them and says, hey, you know what we can do? We can identify everybody in a crowd and scan a database of 34 million people and tell you exactly who’s in that crowd. And law enforcement will be like, hey, that sounds great. Companies go further than that, though. What we’ve seen from reviewing public records from law enforcement agencies across the country is that companies are very involved in drafting grant applications to the federal government so that a law enforcement agency can get the money it needs to purchase that company’s equipment. They will also be instrumental in drafting sole-source procurement agreements, which basically make the argument to the city council, or whoever, that we can’t get this technology anywhere else; therefore, we can bypass competitive bidding processes and go straight to purchasing from this company. All of which benefits a given company’s bottom line. So, when we’re talking about public safety: public safety is put on as a veneer on this process, but it is in no way the driving force. The promotion of safety of the people on whom this technology is used is not part of the equation. So, what is the recommendation here? I mean, it takes an overhaul of the procurement process. How does that take place? I don’t know. But it’s as basic as asking who is recommending that these systems be purchased, and why they wind up in the hands of police departments in the first instance. In large part, it’s because of a supply-driven, bottom-line-driven industry.

“Technology really gets ahead of the law. One of the problems we have is the lack of formal recognition of a right to anonymity [and] what will happen is the police will abuse this in some way.”

00:56:12 ERIC WILLIAMS: Yeah. Let me add one other piece, and that is what happens a lot of the time. I’m saying this as a lawyer: technology really gets ahead of the law, right? And one of the problems that we have is the lack of formal recognition of a right to anonymity. It was actually dicta in a Supreme Court case, but there hasn’t been any formal declaration that that is a right that protesters have. Inevitably, what will happen is the police will abuse this in some way, and that is how we will begin to carve out what the limitations are. And it’s inevitable that it will happen. I mean, police and surveillance: you don’t have to imagine this. You can look at, you know, the Ghetto Informant Program, you can look at COINTELPRO, you can look at the actions of the NYPD in the wake of 9/11. Inevitably, this will be abused more prominently than it already has been by the police. And at that point, we will begin to see rules governing its use and hopefully carving out rights for individuals. Of course, how that is going to look will depend on who the judges are that make those decisions. And these are judges that are elected and appointed. So, as attenuated as the connection might seem, who you vote for, either as judge or as the executive who will be appointing a judge, is actually going to be very important. So, it is important that we make this issue prominent for elected officials so that they recognize that how they vote on this is actually something people care about.

00:58:01 HEATHER NOELLE ROBINSON: Tawana, I see you dropped some information in the chat box about the “Green Chairs, Not Green Lights” program. I’m wondering if you can tell us a little bit about that effort.

“‘Green Chairs, Not Green Lights’ is hoping to restore the village mentality where we are making sure that everyone in our community is taken care of.”

00:58:15 TAWANA PETTY: Yeah. So, one of the things that we learned in trying to inform the community about the harms of mass surveillance and the militarization of policing is that a lot of senior citizens particularly were saying, you know, we really need more policing in our neighborhoods, we really need these surveillance cameras in order to feel safe. And we started to have conversations with them about a time when they felt safe, right? And so, I asked them, I said, with all these cameras, you know, do you feel safe? Are you feeling safer? And the answer is no. And so, when we talked about a time when they did feel safe, it was a time when they were sitting on their front porches and young people would come and help them pick peas or, you know, have a dialogue, or when neighbors would sit on the porches and watch children walk to school. And even in more affluent neighborhoods now, you know, homes are being built where you drive straight into the garage and maybe walk in through the side door, versus walking onto your front porch or sitting on your porch; or your home isn’t even built with a true front porch, because there’s really no effort to keep you in the front of the house. Everything is moved to the back of the house, where you may or may not see what’s happening in your neighborhood.

And so, “Green Chairs, Not Green Lights” is symbolic, because it doesn’t mean only people who have a porch that they can sit on can look out for one another. It could be a matter of looking out your window when you hear the kids play. It could be a matter of making sure that you walk downstairs and see the senior citizen that you know comes home at 6 o’clock safely into their home. And so, the campaign is more about making sure that we’re not watching each other, we’re not logging into apps and just reporting people who don’t bring their trash can in on time, you know, and targeting folks who aren’t complying with our versions of what a perfect human being might be. It is more like turning to see each other, learning who our neighbors are, and preventing crime because we’ve built relationships and we’ve taken care of each other. And so, if someone’s suffering down the street from me, then it means that I’m not going to let them go without water, so that they’re not looking for water at my house, maybe in an unhealthy way, you know what I’m saying? A person who’s starving, a person who doesn’t have resources, a person who is thirsty, a person who is not getting a quality education, a person who doesn’t have a viable structure over their head is going to seek refuge in the best way they know how. And a lot of times, the best way they know how, in under-resourced, disinvested communities, is through crime. And so, Green Chairs, Not Green Lights is hoping to restore the village mentality, where we are looking out for each other and making sure that everyone in our community is taken care of.

01:01:27 HEATHER NOELLE ROBINSON: So, I want to throw this final question to all the panelists, and also invite all of you who are watching to submit your questions, because we will have about 15 or so minutes for Q&A. But my last question is: what are the dangers of going forward with this continued approach of surveillance? You know, what are the effects of the fact that these surveillance infrastructures have been built and will continue into the future? What do these dangers hold? And then, are there any sources of hope to look to for changing this and, you know, as Eric spoke about, having a more just city, creating a more just community and a more just society overall? Tawana, do you want to speak to that first?

“We don’t organize with a mindset that says, it’s not going to happen in our lifetime. We always organize to make the change in our lifetime.”

01:02:27 TAWANA PETTY: Yeah. I’ll say one danger that’s already playing out is, as I mentioned, that a community that is already suffering under poverty is going to be irreparably harmed by thousand-dollar tickets issued using Project Green Light surveillance technology. If you were having to choose between your medications and your water bill before, and you’re getting, you know, a thousand-dollar ticket, that could set you back forever, especially during a global pandemic where your income has already been sliced in half or taken to zero. We know that millions of people in this world still haven’t gotten either a stimulus check or unemployment. And so, there are community members that are not going to recover from the tracking and the targeting that comes with being in poverty. And so, you know, one of the organizers who taught me many years ago told me that we don’t organize with a mindset that says it’s not going to happen in our lifetime. We always organize to make the change in our lifetime. And so, I’m going to say that, you know, at 43, I think I might be able to hang on a little bit longer. And so, in the years that I have left, I’m going to organize to rid us of some of these systems that are so pervasive and violent. And I start with dialogue with people who have a hand in creating them.

01:04:02 HEATHER NOELLE ROBINSON: Cierra, could you give us your thoughts next?

“This paradoxical way of living really messes with your head. It makes you feel totally undervalued, and it makes you feel like you are a threat to society when you’re not.”

01:04:08 CIERRA ROBSON: Yeah, absolutely. I think one of the largest risks that I see is ideological actually. At the same time that you have constant surveillance and a kind of hyper visibility of these communities, they’re also being rendered totally invisible by so many other things. People feel — and I mean, we see this in the protests that are happening today across the country. People are not feeling heard, and people are not feeling seen and yet at the same time, they are under constant surveillance. And so, this paradoxical way of living, I think, really messes with your head. I mean, it makes you feel totally undervalued, and it makes you feel like you are a threat to society when you’re not, or at least you’re viewed that way by the people who are purporting to protect you. So that, to me, is one of the biggest risks. We’ve got an entire generation of young people who are growing up watching people die on their iPhones. And yet, the only thing that people can think about is surveilling the protests to make sure that property is not destroyed. So, for me, that’s one of the biggest risks. But I find hope in something that Tawana keeps saying over and over again, which is imagination. The only things that are going to come to fruition are the things that we imagine, the solutions that we imagine are possible. And I’m really hopeful that there are people around the world thinking about these issues. In academia, there are tons of young people and tons of people who have dedicated their entire careers to these issues. And academia has an incredible amount of privilege in policy circles, in governmental circles, and even with police for that matter. So, these things give me hope, the fact that people are talking about these issues, the fact that people are raising concerns in their academic work, trying to bring truth to power. Those things bring me hope during this time.

01:06:24 HEATHER NOELLE ROBINSON: Thanks, Cierra. Eric, we go to you next.

“The biggest danger is the ease with which the surveillance infrastructure being built is shifted from one use to the next. Once built, it is almost impossible to control.”

01:06:27 ERIC WILLIAMS: I would say the biggest danger is kind of what we’re seeing already, which is the ease with which the surveillance infrastructure that’s being built is shifted from one use to the next. This was initially created as something to prevent carjackings, right? And now it’s being used to ensure social distancing. Once the infrastructure is built, it is almost impossible to control. There will always be some new compelling use. And unfortunately, the way the system can be used undermines all the protections we have in place to prevent misuse, right? Our freedom of speech is supposed to be sort of that firewall that prevents things from getting out of hand: the freedom to know what the government is doing, and to comment on it and protest about it. But this infrastructure can so readily be used to undermine that, that it doesn’t take a lot to leap from where we are now to what you see in China, which is specific minority groups being herded through what are essentially facial recognition checkpoints and always having to have their phones on them. It’s not that difficult to see us get there. And it’s a very small leap, in part because the public sector always comes up with some compelling new use for the infrastructure. At the same time, you have private entities making us feel more comfortable with the technology. You know, it’s easier to open your phone that way, right? It’s easier to get duplicates out of your Facebook album that way. And the degree to which people have become accustomed to it presents a real danger. So, it’s important that we realize, as Tawana said, that we’re not fighting about something 50 years in the future; we have to be fighting today. It does not take much to get from here to the worst-case scenario.

01:09:07 HEATHER NOELLE ROBINSON: Clare, any thoughts on this?

“There is a lack of transparency…to the very individuals who are identified using this technology that’s preventing them from mounting an adequate defense in their own case.”

01:09:10 CLARE GARVIE: Well, building on what Eric just said: this is not the future with face recognition. One of my biggest concerns about it, now and since 2001 and into the future, is that this technology has been used in thousands of cases across the country to identify not just suspects, but witnesses and victims. In thousands and thousands of cases, the individuals arrested and charged, whether they pled guilty because they could not afford to go through the court process or were convicted, were identified using face recognition and never knew. We have no idea how often this happens, because with very few exceptions, the defense never finds out whether or not the technology was used to identify them. And this is because law enforcement considers it an investigative lead only. They say, it’s an investigative lead, we will corroborate the identification using other means. But they don’t. What we’ve seen time and again, and this is a real case: face recognition was used to identify an individual who was accused of stealing a pair of socks from a Target in New York. Super serious crime, obviously. And there was a witness to the crime, the security guard at the Target, who was texted that one photo and asked, is this the guy that robbed your store? The witness texted back, yes. That was the identification. That was the extent to which the face recognition match was corroborated. And yet, law enforcement and the prosecution did not feel that they needed to turn information about the face recognition over to the defense.

So, we’re talking about a technology that’s used to identify and to surveil, but information about it is not yet turned over. There is a lack of transparency, not just to the public, but to the very individuals who are identified using this technology, and that’s preventing them from mounting an adequate defense in their own case. And talk about undermining true safety: the inability to mount an adequate defense against a technology that you’re never told about, that we know makes mistakes, and where there are no policies around how it can and can’t be used. That, in my mind, undermines the right to a fair trial. It has for years, and it will continue to until we adequately address whether this technology has a place in our society, and what that place is.

In terms of whether there is room for optimism, I do think there is. I started researching face recognition back in 2016, and back then, there was almost no information available about this; we found four public policies. We know a lot more about this technology now than we did back then. And we’ve seen dozens of jurisdictions at the state and local level introduce legislation to either ban it, press pause on the use of the technology, or regulate it. And that’s incredible. Even in Detroit, and it doesn’t sound like a win, we went from no policy, to a policy that said we’ll use this technology on any video available to us, to a policy that says: we will not use it to surveil, we will not use it on livestream or recorded video, we won’t use it on mobile devices, we won’t use it to assess immigration status, and we need reasonable suspicion. Is it enough? No. But is it a huge step in the right direction? I would say yes.

01:13:08 HEATHER NOELLE ROBINSON: Thank you, Clare. So, we have some excellent questions coming in from our viewers, and I want to say thank you to everyone for submitting those. You can continue to submit them via the Q&A feature here. First, I want to go to this question from Scott, who asks, “Is there a challenge or an opportunity to elevate the issue of the expectation of privacy for a younger generation?” And he goes on to say, “Where the use of social media many times blurs the line between public and private life, have you seen any research on how the expectation of privacy differs among age groups, especially for the broader question of how to generate skepticism of technology’s infringement upon private life from a civil society perspective?” Tawana, maybe, or Cierra, any thoughts on the expectation of privacy from the younger generation?

01:14:06 CIERRA ROBSON: I just posted in the chat one resource from the Berkman Klein Center called the Youth Media Lab. That’s just one of very many research organizations that attempt to address this very important issue. I also think the thing we should remember is that young people are very tech-savvy, and I think we should give them more credit than we often do regarding their ability to comprehend what should and should not go online. Even just in my Twitter and Instagram feeds in the past couple of days, I’ve seen so many young people post things like: block out people’s faces if you’re going to take videos or photos at protests, make sure that you are wearing masks and covering yourself, don’t get into any news cameras, things like that. There’s a consciousness that youth actually have that we often don’t like to give them credit for. One group that I think we should maybe have some more resources for is the older generation, who may not be super used to engaging on social media, or may not have had the very extensive talks that many millennials have had growing up about what is and is not appropriate to post online.

“There has to be more study about [tech companies’] manipulation of our vulnerabilities. And that’s going to be an ongoing protracted struggle.”

01:15:30 TAWANA PETTY: Yeah. And I’d also like to add, you know, one thing that’s gotten my feathers ruffled lately, and it might just be my intuition, I could be wrong. At the height of us having these dialogues, like Cierra said, about blurring faces and being careful about your facial recognition status on Facebook, as an example, there’s a setting that you can turn off. Now, I don’t know how much you’re really turning off, but at least symbolically, you can turn it off on Facebook. But there is a lot of manipulation that happens with these technology companies that plays into our emotions, right? One example is the avatar. There are a lot of people who are very cognizant of having their facial recognition setting off, of not engaging in particular types of photo sharing. But the minute Facebook says, I can give you an avatar that looks just like your face, but if you want it to be perfect, I need you to turn your camera around and face into the camera so it can give you a perfect avatar, we completely adhere to that. And so, I think there has to be more study of their manipulation of our vulnerabilities. That’s going to be an ongoing, protracted struggle. And it actually doesn’t have an age group on it at all. I’m glad Cierra said that, because I’ve talked to people of all ages. Hell, I’ve been trained by some senior citizens in their 90s on shortcuts on a MacBook, and I’ve talked to folks of all ages who have computer literacy or who don’t know a lot about these technologies. Finally, I’ll mention our Data DiscoTechs in Detroit, short for “discovering technology.” These are tech science fairs where we bring community members of all ages together to skill-share and have that dialogue. Those are spaces where we get to learn together what these technologies are infringing on, how we can co-create solutions, and share stories about the impact that data has had on so many community members. Because it isn’t just a matter of tech sharing your face; it’s going to make a decision about you based on what information it gathers, and not just about you, but maybe your family members and potentially your community.

01:18:12 HEATHER NOELLE ROBINSON: Thanks, Tawana. So, our next question here I want to throw to either Clare or Eric. Someone’s asking, “How does a surveillance program like Project Green Light compare to the Patriot Act of 2001?”

01:18:35 ERIC WILLIAMS: Well, in what respect, I guess, is what I’d want to know. The Patriot Act was a lot about authorizing particular kinds of government activity, right? And as a general rule, our rights vis-à-vis the government are created by carving out areas where the government cannot act. Look at the Constitution: “the government shall make no law.” That’s generally the way our rights are phrased. Project Green Light is different in two really important ways. One, it’s voluntary, right? In theory, the people who are putting the cameras up are private citizens, and this is a voluntary program, though there’s actually some question about how voluntary it is. So the restrictions on government action really don’t come into play in a lot of ways, at least during the initial surveillance portion. The other part is that a lot of the actual material that’s collected initially goes to the person who is voluntarily participating in this program. So, with a minimum of government action, you have what is in essence a public-private partnership operating. And the only restrictions that come into play are basically local restrictions on the actions of the DPD, and those are almost nonexistent. Equally troubling is that the DPD, the Detroit Police Department, actually provides access to these cameras basically on demand to any other law enforcement agency. So, whether we’re talking about the FBI, ICE, or the Michigan State Police, they all have access to these cameras too through Project Green Light. So, the two are very different, but they both, in a lot of ways, permit an intrusion into public spaces that I think, in the absence of a terrorist attack, most people really wouldn’t be comfortable with.

01:21:17 HEATHER NOELLE ROBINSON: Absolutely. So, this question I think would be great for all of our panelists. This is from Toussaint, who asks, “How do you envision communities and cities across the country working together to oppose facial recognition surveillance technology? What is needed to create support, solidarity, and coalition across cities? And for instance, how do we transfer the success of the San Francisco ban and apply it to other cities?” Keep in mind that there are different realities in each of these different places. Cierra, do you want to talk about this first?

01:21:55 CIERRA ROBSON: Yeah. I really appreciate the use of the word coalition. I think that no one mechanism can create change. So not only do we need protesters on the ground, but we also need people in policy circles speaking with each other. We need people at the heads of these companies speaking with each other. So, I’m not sure that I really have the answer to this question, but one thing that I would love to hear from the rest of the panelists is, you know, how can academics use their platforms to help further things like on-the-ground protests and other forms of grassroots organizing?

01:22:34 HEATHER NOELLE ROBINSON: That’s a great question. Tawana, can you respond to that?

“One thing I learned during this global pandemic is that there is no shortage of ways that we can organize.”

01:22:41 TAWANA PETTY: Well, I’ll use a perfect example. I just got word that Tristan was released. So, there are all these opportunities to leverage our networks to, at the very least, make phone calls and push back against the systems that are unjustly criminalizing and incarcerating us for practicing our civil liberties. One thing I learned during this global pandemic is that there is no shortage of ways that we can organize. As someone with several pre-existing conditions, I’ve had to reimagine what boots on the ground meant during this moment, which was very hard, because I’m somebody who’s always on the front line of everything. But I had to reimagine what it meant in this moment. And those things have varied from donating to bail supports, to making sure that information is easily accessible and digestible for folks on social media platforms, to making a grocery store run, leaving a leaflet, and passing out flyers.

And so, there have been so many ways that community members have been engaging, even prior to George Floyd’s name being added to the list of police murders, through mutual aid and making sure that folks have food and resources and PPE supplies during the pandemic. And so, yeah, I’m sorry, I just got that word about Tristan in that moment. I know people took to Twitter, people took to social media. And as evil as some of these platforms can be, we infiltrate them with our goodness and our goodwill, and we get folks inside. I see one of my comrades and fellow Digital Civil Society Lab comrades here, Julie, who, you know, is on the team to make sure that Facebook becomes more accountable to community members. Sometimes we have to dip our toes into spaces that aren’t always the best for us. But if we didn’t have organizers doing that, people with just minds and just spirits, then they would just be evil.

“One of the contributions that academics have made and should continue to make is assisting community in making government accountable… That’s why they need to remain a part of every coalition addressing these issues.”

01:25:11 ERIC WILLIAMS: Can I also say that sometimes people think of academics as working in a sort of esoteric field, really distant from what’s happening on the ground, and I think that’s a mistake. I know too many academics who are actively involved, who are, in fact, activists as much as they are academics. And I think one of the wonderful contributions that academics have made, and should continue to make, is assisting community in making government accountable, right? Sometimes that means helping community phrase the question, by identifying the holes in the arguments that are blithely given to activists to tell them to go away: oh, this is working, we have proof. Having academics who are willing to sit there and help us with research, help us break down what’s being given to us so we don’t have to take it at face value, identify the next question that should be answered, and identify the next issue that will arise as the consequence of a particular public policy. The academic community has been invaluable so far, and that’s why they need to remain a part of every coalition addressing these issues.

01:26:43 HEATHER NOELLE ROBINSON: Clare, any thoughts on building coalitions across cities and other communities?

“It’s rare these days to have a subject that cuts across political lines. To leverage unlikely allies is something that can happen if a given community or coalition is open to it.”

01:26:48 CLARE GARVIE: I think Eric hit the nail on the head there, in terms of the role of academics and in answering Cierra’s question as well. I’m with an academic institution that also focuses on advocacy, and we’ve seen our role, as much as I wish it didn’t need to be done, as using the privilege and stature of an academic institution to legitimize what activists on the ground have been saying for years. That was our goal with researching and writing about Detroit and about Project Green Light. I think with coalitions, the single most important thing is to always let them be community-led. Academics and more national organizations may not have as granular and comprehensive an understanding of what is truly needed and what safety means for those on the ground. So any coalition that’s formed must be community-led. Since the question talked about face recognition specifically, a novel and interesting opportunity with face recognition is that, in the legislative efforts we’ve seen, it’s a bipartisan issue: it’s seen as overstepping government authority from a more libertarian perspective, and as posing risks to civil rights, which speaks more to a liberal political bent. And that’s an incredible opportunity. It’s rare these days to have a subject that cuts across political lines. So, to leverage unusual or unlikely allies is something that can happen in this space if a given community or coalition is open to it.

01:28:48 HEATHER NOELLE ROBINSON: Thank you so much for that, Clare. That is actually an excellent note to end our discussion on. I want to sincerely thank all of our speakers today, and please, attendees, join me in thanking them: Tawana, Eric, Cierra, and Clare. Thanks also to the Center for Comparative Studies in Race & Ethnicity and to Digital Impact for serving as co-hosts of today’s event. This has been an absolutely fantastic discussion. We’ll be posting the video of today’s event on our website at pacscenter.stanford.edu and on digitalimpact.io, and I really encourage all of you viewers to share this with your friends, your colleagues, and your networks, so other people can experience this conversation. And like I mentioned at the beginning of our conversation today, we’ll be continuing this series. The next event is titled Protecting the Black Vote During COVID-19, on June 24th at 10:00 a.m. Pacific time, the same time as this one. Please take a look at our website for more information. And with that, I’m wishing you all safety in this time, and again, really deep gratitude for all of your contributions.

About the Series

Race, Tech, and Civil Society: Critical Conversations for Times of Crisis explores questions rooted in our histories, impacting our present moment, and critical to our futures. The global pandemic gives new urgency to conversations about race, technology, and civil society. As we depend on digital communications for every aspect of our daily lives, who is left behind? How are technologies being used to surveil communities of color – and how do communities respond to such surveillance? Why is it critical for people impacted by technology to have a voice in how that technology is regulated and employed by governments? Join scholars, practitioners, activists, and policy experts as they explore these important issues.
