On January 26, 2021, ALA Council approved two privacy-related resolutions at the 2021 ALA Midwinter Virtual conference: the “Resolution on the Misuse of Library Users Through Behavioral Tracking” and the “Resolution in Opposition to Facial Recognition Software in Libraries”.
The IFC Privacy Subcommittee drafted the behavioral tracking resolution (CD #19.3), largely in response to a town hall event at the end of 2020 entitled “Surveillance in Academic Libraries?! A Search for Better Ideas.” The working group included members from the Library Freedom Institute and the Digital Library Federation, and the resolution was endorsed in principle by the Intellectual Freedom Round Table.
The behavioral tracking resolution opposes any type of behavioral data surveillance of library use and users, and urges libraries (and vendors) never to exchange user data for financial discounts, payments, or incentives. The resolution further calls on libraries and vendors to apply the strictest privacy settings by default (without any manual input from the end user), and urges libraries, vendors, and institutions not to implement behavioral data surveillance or use that data to deny services.
The resolution also implies further work: it calls on libraries to employ contract language that prohibits vendors from implementing behavioral data surveillance or using that data to deny access to services, and to oversee vendor compliance with those contractual obligations.
The resolution also calls for libraries to act as “information fiduciaries,” assuring that in every circumstance the library user’s information is protected from misuse and unauthorized disclosure, and ensuring that the library itself does not misuse or exploit that information. I think this point, in particular, is one to ponder more in depth at a local level: to really contemplate all the systems and methods by which information is created and captured at every point in the library, whether through inadvertent or deliberate use.
During council discussion, one councilperson asked whether the resolution should call upon a unit of ALA to monitor trends, developments, and practices and report on them periodically. Another councilperson questioned whether it is fair to ask library workers to act as information guardians without proper tools, suggesting that the responsibility might instead rest with administrators working with vendors; discussion followed about education for library workers on the topic. It is certainly an interesting point, as this adds another layer of expectation in awareness, education, and general wherewithal on the topic of data surveillance and tracking, but it is surely part of the modern world. The point does make me think about the ‘how’ and ‘when’ of such training, and perhaps there is a larger question for ALA to tackle in upskilling library workers en masse.
The other privacy-related resolution, the “Resolution in Opposition to Facial Recognition Software in Libraries” (CD #19.2), was also presented by the ALA Intellectual Freedom Committee (IFC) and stems from work and a survey completed by the IFC Facial Recognition Working Group. The survey summary provides an overview of the Facial Recognition Survey, completed in early 2020, which garnered over 600 responses.
The resolution emphasizes that the use of facial recognition technology is largely inconsistent with the Library Bill of Rights, as well as other ALA policies. ALA advocates for users’ right to access library materials and spaces without having their privacy invaded, an expectation that facial recognition would violate. The resolution also points to recent efforts in Congress to regulate and restrict facial recognition and biometric technology, contrasting them with current law enforcement use of these technologies, which has proceeded without sufficient oversight or standards.
Further, studies have documented extreme gender and racial bias in facial recognition software, with a noted prevalence of racist misidentification. The resolution cites a number of studies on this topic:
- NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software, National Institute of Standards and Technology, May 18, 2020;
- Larry Hardesty, Study finds gender and skin-type bias in commercial artificial-intelligence systems, MIT News, February 11, 2018;
- Erik Learned-Miller, Vicente Ordóñez, Jamie Morgenstern, and Joy Buolamwini, Facial Recognition Technologies in the Wild: A Call for a Federal Office, Algorithmic Justice League, May 29, 2020;
- Nicolás Rivero, The Influential Project That Sparked the End of IBM’s Facial Recognition Program, Quartz, June 10, 2020;
- Alex Najibi, Racial Discrimination in Face Recognition Technology, Harvard University, October 24, 2020;
- Steve Lohr, Facial Recognition Is Accurate, if You’re a White Guy, New York Times, February 9, 2018;
- James Vincent, Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech, The Verge, January 12, 2018.
Conversation during the council meeting included whether the resolution covers only the use of facial recognition technologies inside libraries, or whether it should extend to library grounds as well. (There was also reflection on Twitter about what a library is: does this extend to library services and entities beyond the building itself, like bookmobiles?) A related clarification question was how to refine the scope for facial recognition software used in libraries or by libraries, particularly when the library is part of a larger entity.
And like the first resolution, there was also discussion of the need for education around this technology and topic, particularly for staff, users, trustees, administrators, community organizations, and legislators, covering facial recognition technologies, their potential for bias and error, and the accompanying threat to individual privacy. Another follow-up question was whether this also applies to video monitoring systems in libraries.
Ultimately, ALA has moved to formally oppose the use of facial recognition software in libraries. The resolution asks any library or partner currently using it to desist, and recommends that libraries, partners, and affiliate organizations engage in activities to educate staff, users, and others about facial recognition technologies, particularly their potential for bias and error and the accompanying threat to individual privacy.
These are two timely resolutions that will presumably result in further action and education to implement and execute.
Virginia Dressler is the Digital Projects Librarian at Kent State University. Her specialty areas are project management and digitization, working primarily with the university’s unique collections. She holds a Master of Library and Information Science from Kent State University (2007), a Master of Arts in Art Gallery and Museum Studies from the University of Leeds (2003), and a certificate in advanced librarianship (digital libraries) from Kent State University (2014). Her research areas include privacy in digital collections and the Right to be Forgotten. She is the author of Framing Privacy in Digital Collections with Ethical Decision Making (Morgan & Claypool, 2018).