Detroit Police Reforms AI Facial Recognition Use Following Lawsuit



Written by: Kiara Fabbri, Multimedia Journalist

Fact-Checked by: Justyn Newman, Head Content Manager

In a landmark settlement this week, Robert Williams reached an agreement with the City of Detroit. The agreement follows his wrongful arrest in 2020 based on faulty facial recognition technology, an incident that highlighted concerns about the technology’s accuracy and potential for racial bias.

On January 9, 2020, Detroit police officers arrested Williams on his front lawn, in front of his wife and two young daughters. “They refused to tell me why, and I had to spend the night sleeping on a cold concrete bench in an overcrowded, filthy jail cell before finally finding out that I was being falsely accused of stealing designer watches from a Detroit boutique,” Williams told TIME.

During interrogation, it emerged that police had relied solely on facial recognition software, which incorrectly identified Williams from blurry security footage. Williams spent 30 hours in jail before the charges were dropped.

“The system somehow returned my expired driver’s license photo as an ‘investigative lead’ that might match the thief,” Williams explained. “Rather than investigate the accuracy of this purported match, police accepted the ‘lead.'”

While the settlement doesn’t ban facial recognition entirely, it imposes stricter controls on the Detroit Police Department’s (DPD) use of the technology. Crucially, photo lineups can no longer be based solely on facial recognition matches: police must now obtain independent evidence before conducting a lineup after using the technology.

The incident highlighted broader issues with facial recognition technology. Past studies have shown that law enforcement agencies using automated facial recognition disproportionately arrest Black people. Factors contributing to this include the lack of Black faces in the algorithms’ training data sets, a belief in the infallibility of these programs, and officers’ own biases magnifying these issues.

Capitol Technology University highlights the systemic biases inherent in historical data used to train algorithms, stating, “If the historical data used to train algorithms and develop technologies reflects systemic biases, they are likely to perpetuate those same historically unequal opportunities and exacerbate inequalities.” Alarmingly, facial recognition systems have been shown to misidentify people of color up to 100 times more frequently than White Americans.

Williams emphasized the human cost of misused technology: “I never thought I’d have to explain to my daughters why daddy got arrested,” he said in a Washington Post interview. “How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway?”

The settlement in Williams’ case represents a step toward preventing future wrongful arrests, but the fight against racial bias in facial recognition software continues.
