Contributed by:

Clare Garvie

Training & Resource Counsel, Fourth Amendment Center, National Association of Criminal Defense Lawyers

Julian Wallace

Education & Research Associate, Fourth Amendment Center, National Association of Criminal Defense Lawyers

 

Algorithms, like those used in facial recognition systems, are often touted as bringing us into an era of “modern” policing. Proponents argue they make investigations more efficient, effective, and, most importantly, equitable — by eliminating areas where human decision-making introduces error and bias. But in the two decades since police first used facial recognition, the promise of equity has not been borne out. The technology may introduce unique risks of error and bias into policing while simultaneously entrenching existing racial biases and hiding those biases from the judicial process.   

On January 30, 2019, Nijeer Parks discovered there was a warrant out for his arrest in a New Jersey town he had never visited. When he arrived at the police station to clear up the mistake, police arrested and charged him with assault. He spent ten days in custody before detectives realized their error. During this time, and in the six months it took for the charges to be dropped, Mr. Parks incurred thousands of dollars in legal fees and considered pleading guilty in exchange for a guaranteed six-year sentence rather than risk the up to twenty years threatened should he go to trial.

That initial error? A police facial recognition algorithm mistakenly “matched” his photo to the driver’s license left at the crime scene. Trusting the algorithm, investigators homed in on and arrested an innocent man.[i]

Nijeer Parks is one of at least six people who have been wrongfully arrested following a facial recognition misidentification. While all were ultimately released, each experienced direct, negative, and lasting impacts stemming from the ordeal: lost wages or employment, legal fees, and physical and psychological trauma to themselves and their families. Like Mr. Parks, others faced the difficult choice of whether to plead guilty to something they didn’t do to avoid the “trial penalty,” the harsher sentence they risked by fighting the charges in a lengthy court case.[ii]

It does not appear to be a coincidence that all these individuals are Black. 

We don’t know how many more people have been wrongfully arrested because of this technology or the race of each of the accused. A persistent lack of transparency means that, in many cases, defendants never learn it was an algorithm that led to their identification — or misidentification — as the suspect. But what we do know is that facial recognition systems may perpetuate racial bias in the criminal legal system in several ways. 

First, facial recognition systems risk entrenching historical patterns of over-policing within Black and Brown communities. People of color, particularly Black men, are disproportionately enrolled in many facial recognition databases because of systemic inequities dictating who has been arrested and had their mugshots taken. This means that Black men bear a higher risk of being misidentified as a criminal suspect by a facial recognition search. Black and Brown communities are also likely to be targeted disproportionately by police use of the technology, given pre-existing patrol patterns concentrated on these same communities. In San Diego, for example, a governmental audit found that police used the technology up to 2.5 times more on communities of color than on anyone else.[iii]

Second, facial recognition algorithms may represent a unique, additional source of bias in police investigations. Studies have found that the accuracy of many algorithms varies depending on the race, sex, and age of the person being searched; who you are and what you look like impact your risk that the system will misidentify you. Using country of origin as a proxy for race, a 2019 federal study found higher misidentification rates for people of East and West African descent than for Eastern European subjects, a finding supported by other research as well.[iv]

Third, the increased reliance on algorithms creates systematic barriers to due process across our criminal legal system by hiding crucial decision-making steps — and evidence — from the accused. Police often deploy facial recognition and other advanced systems with little to no notice, transparency, or public oversight. Defendants may never learn that facial recognition or another algorithm was used to select data implicating them in a crime.[v]

And even if they do, many of these algorithms are “black boxes,” systems whose processes cannot be explained or “interrogated” in a courtroom setting. Companies developing the algorithms sometimes assert a proprietary or trade secret interest in their products, seeking to bar disclosure of information about how they work. [vi] In the words of one public defender, “We’ve moved from a system of people-as-accuser to technology-as-accuser, and then we don’t require the technology to explain how it came to its decisions or allow the defense to ask questions.”[vii] This undermines the right to a fair trial for everyone — but particularly those disproportionately policed, arrested, and charged in this country.  

The use of facial recognition and other computer algorithms in the criminal legal system is prevalent and growing, often embraced as a way to bring policing up to date with modern technology. But without careful consideration of how these systems perpetuate bias, we are not moving in the right direction.

“There’s nothing ‘modern’ about a policing system that embraces the same end goals, ongoing racial bias, by using newer and shinier technology,” says Lisa Wayne, the first Black Executive Director of the National Association of Criminal Defense Lawyers. “It is unacceptable that contemporary policing tools perpetuate old harms and introduce new ones that are disproportionately borne by poor and Black and Brown communities.” 

We know enough about the risks of racial bias posed by facial recognition to counsel against its use in policing. These lessons, learned at the expense of the accused over the past twenty years, should also serve as a roadmap for every other algorithm introduced into our criminal legal system. Any algorithm that cannot be tested and explained transparently or that has already been demonstrated to have a propensity for racial bias has no role to play in policing in the 21st century. 

[i] See Anthony G. Attrino, He Spent 10 Days in Jail After Facial Recognition Software Led to the Arrest of the Wrong Man, Lawsuit Says, N.J. Advance Media (December 29, 2020), https://www.nj.com/middlesex/2020/12/he-spent-10-days-in-jail-after-facial-recognition-software-led-to-the-arrest-of-the-wrong-man-lawsuit-says.html.

[ii] See Elisha Anderson, Controversial Detroit Facial Recognition Got Him Arrested for a Crime He Didn’t Commit, Detroit Free Press (July 10, 2020), https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/; see Kashmir Hill, Wrongfully Accused by an Algorithm, N.Y. Times (June 24, 2020), https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html; see Eyal Press, Does A.I. Lead Police to Ignore Contradictory Evidence?, New Yorker (November 20, 2023), https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence; see Thomas Germain, Innocent Black Man Jailed After Facial Recognition Got It Wrong, His Lawyer Says, Gizmodo (January 3, 2023), https://gizmodo.com/facial-recognition-randall-reid-black-man-error-jail-1849944231; see Kashmir Hill, Eight Months Pregnant and Arrested After False Facial Recognition Match, N.Y. Times (August 6, 2023), https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html.

[iii] See Clare Garvie et al., The Perpetual Line-Up: Unregulated Police Face Recognition in America, Center on Privacy & Technology at Georgetown Law (October 18, 2016), https://www.perpetuallineup.org/findings/racial-bias.

[iv] Patrick Grother, Mei Ngan, and Kayee Hanaoka, Face Recognition Vendor Test Part 3: Demographic Effects, National Institute of Standards and Technology (December 2019), https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf; see Cynthia M. Cook et al., Demographic Effects in Facial Recognition and Their Dependence on Image Acquisition: An Evaluation of Eleven Commercial Systems, IEEE Transactions on Biometrics, Behavior, and Identity Science (February 2019), https://mdtf.org/publications/demographic-effects-image-acquisition.pdf; see Krishnapriya K. S. et al., Characterizing the Variability in Face Recognition Accuracy Relative to Race, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2278-2285 (2019), doi: 10.1109/CVPRW.2019.00281.

[v] See Clare Garvie, A Forensic Without the Science: Face Recognition in U.S. Criminal Investigations, Center on Privacy & Technology at Georgetown Law (2022), http://forensicwithoutscience.org/.

[vi] See Moore, Trade Secrets and Algorithms as Barriers to Social Justice, Center for Democracy & Technology (August 2017), https://cdt.org/wp-content/uploads/2017/08/2017-07-31-Trade-Secret-Algorithms-as-Barriers-to-Social-Justice.pdf.

[vii] From an interview conducted by the author. Notes on file with author. 
