The potential use of facial recognition technology by the Milwaukee Police Department has sparked controversy. The department is considering a deal with Biometrica, a private software firm, to exchange 2.5 million mugshots for free access to the company’s facial recognition software. This proposed exchange raises significant ethical and privacy concerns.
The Milwaukee Journal Sentinel recently reported on the potential deal, which was discussed at a Fire and Police Commission meeting. The Milwaukee police have previously borrowed access to facial recognition technology from neighboring agencies. Under the new agreement, Biometrica, which already works with law enforcement agencies across the U.S., would grant the department two free search licenses in return for mugshots and jail records spanning decades.
While Biometrica has not confirmed how it would use the mugshots, they would likely be used to train the company’s facial recognition software. The practice is not unique, but it raises ethical concerns, especially given the documented potential for bias in facial recognition algorithms. Biometrica did not respond to MaagX’s request for comment. The use of ethically questionable datasets to train facial recognition systems is a well-documented problem: Clearview AI has faced criticism for scraping millions of photos from social media for its database, and PimEyes has been scrutinized for using images of deceased individuals to build its algorithm. Even the National Institute of Standards and Technology (NIST) maintains databases of mugshots and images of vulnerable individuals for facial recognition testing, raising questions about privacy and consent.
The Milwaukee Police Department confirmed to MaagX that no contract has been signed yet and that further discussion will take place at future city meetings. A representative emphasized the department’s commitment to transparency with the community. Despite the lack of a finalized agreement, the proposal itself has raised significant concerns.
One key issue is the documented inaccuracy of facial recognition technology in identifying individuals with darker skin tones, particularly women and non-binary individuals. This inaccuracy has led to wrongful arrests, predominantly affecting Black individuals, as highlighted by David Gwidt, a spokesperson for the American Civil Liberties Union (ACLU) of Wisconsin, in a statement to MaagX.
Another concern is the lack of provisions for informing individuals that their mugshots are being used, obtaining their consent, or allowing them to opt out. Wisconsin, like many states, has no specific biometric privacy law. Mugshots are generally considered public records in Wisconsin, but using them to train facial recognition software without consent remains a significant point of contention. It also heightens the stakes of any data breach or misuse, because biometric data, unlike a password, cannot be changed.
This proposed deal fits a longer pattern of ethical lapses and the exploitation of marginalized communities in the pursuit of technological advancement. Jeramie Scott, Senior Counsel at EPIC, pointed out to MaagX the irony of using mugshots, which disproportionately depict people of color, to train a surveillance technology likely to be used disproportionately against those same communities. That practice, he argues, exacerbates existing racial inequalities within the criminal justice system.
Comprehensive federal regulation of facial recognition technology is unlikely in the near future. While Madison, Wisconsin’s capital, banned the technology in 2020, the state itself lacks such regulations. Milwaukee also has no regulations governing the police department’s existing surveillance technology. Scott recommends against the proposed deal and urges the Milwaukee police to refrain from using facial recognition technology without strict legal limitations and safeguards.
The ACLU of Wisconsin has called for a two-year moratorium on new surveillance technologies in Milwaukee and the development of regulations for existing technologies, with community input. Although the Milwaukee Police Department has stated it will develop a policy to prevent arrests based solely on facial recognition matches, there are currently no mechanisms for accountability.
This situation highlights the ongoing debate surrounding the ethical and practical implications of facial recognition technology, especially within the context of law enforcement. The potential for misuse and bias underscores the need for careful consideration and robust regulations to protect individual rights and prevent the perpetuation of systemic inequalities.