
College students call for ban on facial recognition on campus

Fight for the Future is building on its success in pressuring concert promoters to back off of plans to use the technology at festivals.

US elementary and secondary schools have been experimenting with facial recognition. But so far, it hasn’t been widely adopted at colleges – and students are taking part in a nationwide effort to keep it that way.
On Tuesday, the digital rights group Fight for the Future announced that it’s teamed up with Students for Sensible Drug Policy to ban the biometric technology from university campuses. The groups have launched a website and toolkit for campuses that want to join in.
So far, student groups have been organized at George Washington University in Washington, DC, and DePaul University in Chicago. Activists are also reaching out to 40 universities – including Stanford, Harvard, and Northwestern – to determine whether they’re using what Fight for the Future calls “this problematic technology.”
Evan Greer, Deputy Director of Fight for the Future, says that students have a right to know if their schools’ administrations plan to experiment on them with biometric surveillance:

Facial recognition surveillance spreading to college campuses would put students, faculty, and community members at risk. This type of invasive technology poses a profound threat to our basic liberties, civil rights, and academic freedom.
Schools that are already using this technology are conducting unethical experiments on their students. Students and staff have a right to know if their administrations are planning to implement biometric surveillance on campus.

Erica Darragh, board member at Students for Sensible Drug Policy, said that facial recognition doesn’t make anybody safer. Rather, it just tramples on privacy rights, particularly those of minorities.
An oft-cited study from Georgetown University’s Center for Privacy and Technology found that automated facial recognition (AFR) is an inherently racist technology. Black faces are over-represented in face databases to begin with, and FR algorithms themselves have been found to be less accurate at identifying black faces.
In another study published last year by MIT Media Lab, researchers confirmed that the popular FR systems they tested exhibit gender and racial biases.
Among many other privacy rights groups and FR critics, the American Civil Liberties Union (ACLU) has argued that extensive video surveillance systems simply don’t work.


For example, the terrorists behind the June 2007 attempted attacks in the UK weren’t put off by the country’s enthusiastic embrace of surveillance cameras. In fact, suicide attackers may well be attracted by the television coverage that such cameras deliver, the ACLU suggests.

The attacks were fortunately botched, but not for reasons that had anything to do with surveillance cameras. In London, it was human observation and common sense that appear to have thwarted the attack. In Glasgow, it was physical security – airport barriers – that prevented the attack from succeeding.

Nor has the technology cut down on petty crime. Take Notting Hill Carnival, for example: after years of high failure rates trying to implement the technology, London’s Metropolitan Police finally threw in the towel in 2018. In 2017, the “top-of-the-line” AFR system they’d been trialling for two years couldn’t even tell the difference between a young woman and a balding man.
This is only the latest of Fight for the Future’s multiple campaigns to stop the steady march of facial recognition. In July 2019, the group called for a federal ban on FR surveillance, and in October 2019, artists and fans waged – and won – a grassroots war to stop “the Orwellian surveillance technology” from invading live music events, such as the enormous Coachella, Bonnaroo, and SXSW music festivals.
As Greer and the guitarist Tom Morello noted when writing up the victory for Buzzfeed News, the anti-AFR campaign caused concert promoters to back off of plans to use the technology. Ticketmaster, for one, had invested in military-grade facial recognition software it had planned to use to link digital tickets with concertgoers’ images so they could “just walk into the show.”
The technology belongs neither at concerts nor on campuses, Darragh says:

Students should not have to trade their right to privacy for an education, and no one should be forced to unwittingly participate in a surveillance program which will likely include problematic elements of law enforcement. This automation of racial and political profiling threatens everyone, especially students, faculty, and campus guests of color. Students have an obligation to prevent this technology going mainstream, beginning with university campuses, where we have the most power and we know how to win.

The call for a campus ban on FR is part of Fight for the Future’s broader BanFacialRecognition.com campaign, which has been endorsed by more than 30 civil rights groups. They’re calling on local, state, and federal lawmakers to ban government and law enforcement use of facial recognition. Four cities have already banned the controversial technology: San Francisco, Berkeley, and Oakland in California, and Somerville in Massachusetts.

3 Comments

I don’t understand the difference between a security camera and a security camera with AFR; to me the distinction is whether facial recognition happens in real time or not. The same goes for the concerns about “basic liberties, civil rights, and academic freedom”. Maybe the facial “database” is what we should be concerned about, not the camera or the technology.
As for “inherently racist technology” – no, it’s not the “technology” that needs fixing but the usage, setup, configuration, weighting, and so on. A knife can save a life or take one; that’s not the knife’s fault!


Students for Sensible Drug Policy – back when I was in college years ago, we had students who would have fit the name of this group. They were anything but sensible.
