By Alison Macrina
I recently created a Google alert for “facial recognition”. I guess I felt like there just wasn’t enough to worry about these days, but fortunately now the notification pings my inbox daily with five to ten links to dystopian horrors. Some of these are stories from critical reporters, many others are breathless content-mongers who’ve just copy-pasted corporate press releases about this wonderful new technology. But no matter their angle, all of them raise my blood pressure. Here are some of those headlines that reached my inbox last week:
Taser Stun Guns Maker Has Developed Facial Recognition Software
Brownsville Residents Alarmed by Landlord’s Plan to Install Facial Recognition System
FamilyMart Adopts Facial Recognition Checkouts to Cut Need for Staff
Your Social Media Photos Could Be Training Facial Recognition AI Without Your Consent
From law enforcement to app developers, everyone seems to be making use of this technology, yet few people seem to be thinking deeply about the kind of world it’s helping create. So I asked someone who I know thinks deeply about these things, my friend Kade Crockford at the ACLU of Massachusetts. I asked Kade for an example of how this technology is being used to violate our civil liberties.
Kade told me that in Massachusetts, the Registry of Motor Vehicles allows law enforcement to use the driver's license database as a face recognition search tool. This means that if a cop or ICE agent has a photo they want to identify, they send it to the RMV, which runs the photo against the license database and sends the results back to law enforcement. This is done without probable cause or judicial oversight, and the system has never been audited for bias. Kade told me they'd spoken to local public defenders who didn't even know that this technology existed – even though it's been at work for 13 years. If this tech is used in criminal cases without disclosing that information to the defense, that's a huge violation of due process.
Lest you think that the problems with this software are merely about our rights, the tech itself is a hugely inaccurate, discriminatory mess. But the problems don't just go away if we decide to somehow make the algorithms less racist. This tech exacerbates other social issues, like labor rights and our ability to move freely in public spaces without private companies breathing down our necks, and often the images used to train the databases are taken without our consent. And yet, with all these issues, there are many otherwise thoughtful people who maintain that these technologies are inevitable, and that fighting them is futile. This makes me think of Ursula K. Le Guin, the brilliant writer and anarchist, who wrote, "We live in capitalism. Its power seems inescapable. So did the divine right of kings. Any human power can be resisted and changed by human beings."
Technology and inevitability also make me think about the Luddites. They're a historically misunderstood movement, but I think their true history is important in the current political moment. Technology itself was not the Luddites' enemy. When "General Ned Ludd" (a pseudonym employed by the workers to publish manifestos about their cause) urged English weavers to smash the new machines that would put them out of work, it was at a time in the early 19th century when the ruling class was brutally suppressing collective organizing and driving laborers' wages down to starvation levels. The Luddites saw this new technology, designed to be operated by unskilled labor who could be paid even less than their own starvation wages, for what it was — a "fraudulent and deceitful" mechanism to further undermine worker power. In response, the Luddites rose up and smashed textile machinery, and were met with extreme violence from the English government and wealthy mill owners.
I’m not advocating property destruction, because that would be illegal, and 200 years after the Luddites our own ruling class still seems to value property rights over human rights like privacy. But I think we should take a cue from the Luddites about how to view our own modern technology. Is it actually bringing convenience to our lives, or is it also “fraudulent and deceitful”? Does it enrich us, or does it make it easier for police to violate privacy and due process rights, and more efficiently lock people in cages? Is the private sector acting in our best interest, or are they further commercializing and surveilling our public spaces? And how might we resist this new technology, other than pushing the facial recognition robots into the river or smashing the CCTV cameras?
To me, our collective future depends on our capacity to get organized. In our conversation, Kade pointed out the huge numbers of tech workers starting to question what they’re building, and whether they should work for companies aggressively selling this technology to cops and ICE. I think in particular about Google employees who successfully killed the contract for Project Maven, the Pentagon artificial intelligence program. How might librarians join this collective refusal? How can we build power in our communities to say no? How can we use our role to teach the public about what’s happening with facial recognition tech and more? Here are some ideas:
- First, remember that libraries have a mandate to protect privacy. It's in our Code of Ethics and our Bill of Rights; it's inherent to what we do. Then, don't be afraid to stand up for privacy in your community. We librarians tend to be very politically risk-averse, but remember that political is not the same as partisan. Everything is political. Act accordingly.
- Programmatic options are nearly endless, but me, I love a good panel discussion. Invite local lawmakers, ACLU representatives, academics, and community organizers to talk together about the effect of facial recognition on our lives. Ask each panelist to close the discussion with their own ideas for a call to action.
- Run a program on using FOIA (Freedom of Information Act) requests to learn about the use of facial recognition by your local law enforcement. This would be especially fun to do with young people. Use MuckRock to simplify the process, or search their existing results for your own local community, and then…
- Use the FOIA results or recent news reports to organize your patrons around contacting local legislators about the use of this technology. Demand oversight and accountability. This could be a full-on program or a more passive activity, where you set up a display with the legislators’ contact info as well as scripts for emails and phone calls.
- Make sure your privacy policies specify that the use of facial recognition technology on library property is forbidden without affirmative consent. Bear in mind that many new CCTV systems have facial recognition capabilities.
- Remove CCTV cameras on library property, or don’t get them installed in the first place. If this seems extreme to you, read this paper. If this is not possible for you, make sure you know exactly where the feed is going, and insist in your policies that law enforcement can only obtain the footage with a warrant.
- Run a program on creating makeup looks with dazzle camouflage, or experiment with creating dazzle camouflage masks. Anti-surveillance, but make it fashion!
- Invite members of Library Freedom Project to give a talk or training at your library about facial recognition and other surveillance technologies.
- Demand investment in our social environments instead: better education, healthcare, green infrastructure, and public libraries will build a much better world than whatever facial recognition is creating.
Unimaginative people will dismiss these ideas as utopian. But I wonder how many of us became librarians in the first place because we’re more than a little bit utopian? It’s in our power to not only envision a better world, but to create it. Let’s take back the future together.
Alison Macrina is the founder and director of Library Freedom Project. Her work focuses on practical ways that librarians can build collective power to fight surveillance and make the internet more free. She was a Library Journal Mover and Shaker and has received the Free Software Foundation’s Award for Social Benefit as well as the New York Library Association’s Intellectual Freedom Award.