Backlash forces UCLA to abandon plans for facial recognition surveillance on campus
Before the administration reversed its position, we tested facial recognition on UCLA sports teams and faculty. The results were disturbing.
UCLA is known for its strong academics and winning sports teams. But the school almost became known for something far more sinister: being the first university in the United States to actively plan facial recognition surveillance on campus.
Today, in a major victory for the movement against facial recognition, our Deputy Director Evan Greer received a statement directly from UCLA’s Administrative Vice Chancellor saying that the school is abandoning its plan in the face of community backlash, in the lead-up to a national day of action on March 2 to ban facial recognition from campus.
“We are beyond excited by the potential agenda-setting a top school like UCLA might bring about nationwide through the prohibition of facial recognition software and through listening to the students, workers, faculty and local community,” said Matthew William Richard, a third-year Political Science major at UCLA and Vice-Chair of the Campus Safety Alliance. “We hope other universities see that they will not get away with these policies. We hope UCLA continues their progress by specifically codifying this prohibition of software and creating infrastructure to legitimize our voices going forward; together we can demilitarize and democratize our campuses.”
“Facial recognition has no place on college campuses,” added Evan Greer, deputy director of Fight for the Future. “Let this be a warning to other schools: if you think you can get away with experimenting on your students and employees with this invasive technology, you’re wrong. We won’t stop organizing until facial recognition is banned on every campus.”
Here’s the backstory of how this happened:
UCLA’s administration had proposed using invasive biometric surveillance technology in the school’s CCTV cameras, constantly scanning the faces of everyone on campus and checking them against a database. Students at UCLA fought back. But the administration tried to assuage fears by claiming the technology would be used in “limited” ways. The school’s position stood in stark contrast to that of more than 50 prominent institutions, including MIT, Harvard, Brown, and Columbia, which had already issued statements confirming they have no plans to use facial recognition.
Fight for the Future and Students for Sensible Drug Policy have been running a nationwide campaign to stop facial recognition from spreading to college campuses. To illustrate just how dangerous and discriminatory UCLA’s proposal was, we ran a test. We used Amazon’s commercially available facial recognition software, Rekognition, to scan publicly available photos of UCLA’s basketball and football teams, as well as a number of faculty, and compare them to a mugshot database. We began working with a major news outlet to release our findings. That outlet reached out to UCLA’s administrators, and we received the statement from them less than 24 hours later.
The results were disturbing, but predictable. We scanned more than 400 faces; 58 photos of student athletes or UCLA faculty members were falsely matched with images from the mugshot database. The vast majority of incorrect matches were of people of color. In many cases, the software matched two individuals who had almost nothing in common beyond their race and claimed they were the same person with “100% confidence.” UCLA Chancellor Gene Block, who is white, was not matched to a mugshot photo.
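To make the comparison step concrete, here is a minimal, hypothetical sketch of how match results of this kind might be filtered by a confidence threshold. The response shape mirrors the output of Amazon Rekognition’s CompareFaces operation (as exposed by boto3), but the function, sample similarity scores, and face IDs below are invented for illustration and are not from our actual test.

```python
def flagged_matches(response, threshold=90.0):
    """Return the similarity scores a system would treat as a 'match'.

    `response` follows the shape of a Rekognition CompareFaces result:
    a 'FaceMatches' list of candidates, each carrying a 'Similarity'
    percentage. Any candidate at or above `threshold` is flagged.
    """
    return [
        match["Similarity"]
        for match in response.get("FaceMatches", [])
        if match["Similarity"] >= threshold
    ]

# Invented example response: one high-confidence candidate and one weak one.
sample = {
    "FaceMatches": [
        {"Similarity": 100.0, "Face": {"FaceId": "mugshot-042"}},
        {"Similarity": 71.3, "Face": {"FaceId": "mugshot-107"}},
    ],
    "UnmatchedFaces": [],
}

print(flagged_matches(sample))  # only the 100.0 score clears the threshold
```

The danger in practice is that a high similarity score says nothing about whether the match is correct: a system configured this way would flag the 100%-confidence candidate as the same person, which is exactly the kind of false match our test produced.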
This is unsurprising. Even the best facial recognition algorithms exhibit systemic racial and gender bias, which is why security experts and dozens of civil society groups have called for this type of surveillance technology to be banned outright.
What would these mismatches mean in practice? If used on campus, this technology would lead to a range of consequences with varying levels of severity: students could be unable to get into their dorms or other campus buildings; students might be incorrectly marked as ‘absent’ from a class they attended; a student or staff member’s image could be incorrectly matched with a photo of someone marked as a threat, which could result in traumatic interactions with law enforcement or even false arrest.
“Facial recognition has no place on college campuses,” said Kimberlé Crenshaw, a prominent UCLA law professor who was incorrectly matched with a mugshot photo in our test. “I’m glad the administration listened to the community and is abandoning this plan. Other school administrators should follow suit. Racially biased surveillance does not make our communities safer.”
A study by the Daily Bruin showed that UCLA police already disproportionately stop, search, and arrest Black and Latino people. Using facial recognition on campus would have automated and exacerbated that existing discrimination. Beyond students and faculty, many university employees are people of color who would be at high risk of misidentification if biometric surveillance were used at UCLA, and the perspectives of food service and custodial staff are often ignored by administrations.
Rashad Robinson, President of Color Of Change, said, “These findings illustrate the danger facial recognition technology poses for Black people on college campuses nationwide. While numerous universities are looking to deploy this technology on their campuses, study after study, including the latest from Fight for the Future, has confirmed that racial bias is a feature, not a bug, of facial recognition technology. Schools who insist on using these faulty systems, after repeated proof that they are ineffective, should be held accountable for knowingly choosing to put Black people in harm’s way. It’s time for a federal moratorium to protect our communities from this new tool for racist mass incarceration and encourage more productive investments in Black safety and freedom. We applaud UCLA’s decision to follow the recommendations of advocates and students to keep Black students and staff safe by canceling their plans to introduce facial recognition technology on campus.”
Even if facial recognition algorithms improve in the future, this technology is inherently dangerous. The biometric data collected is a target for hackers and stalkers, and many schools are ill-equipped to safeguard this data. In the wrong hands, these systems, and the data they generate, could be used to harm students, faculty, and staff.
The potential for abuse is staggering. Using facial recognition, an administrator could track someone everywhere they go on campus. Students: this means administrators could know where you party, who you associate with, who you hook up with, what protests you go to, and what you do outside of class.
Companies that sell facial recognition software are aggressively marketing their technology to colleges and universities, claiming that it can be used for all sorts of things from campus surveillance to accessing buildings and meal plans to taking attendance to determining whether students are paying attention in lectures. There is little evidence to support their claims.
More than 40 civil liberties, racial justice, immigration, and consumer protection groups have issued an open letter to university administrators calling on them to commit to not using facial recognition on campus. Thousands of students, faculty, and alumni have signed our petition. Student groups across the country are planning a national day of action on March 2nd.
The test we ran shows how dangerous and blatantly racist it would have been if UCLA had gone ahead with its proposal to implement facial recognition on campus. It should also be a warning to other schools that are considering this technology. If administrators care about their community members’ safety and basic rights, they should follow the lead of dozens of other institutions and clearly commit to not using facial recognition on campus.
Media contact: email@example.com or 978–852–6457