The Current

Following wrongful arrest of Black man, advocates uneasy about 'flawed' facial recognition use in policing

The prevalence of facial recognition systems, which use artificial intelligence to match images of people with images of suspects, is growing in the U.S. and Canada. Recent studies have found that while the systems are fairly accurate at identifying Caucasian men, they can be faulty for other demographics.

'If we uploaded our fingerprints as our profile pictures on the internet, we would be horrified': researcher

Critics of facial recognition technology worry that its use in policing could lead to more wrongful arrests of Black people and people of colour, while advocates say that used responsibly, it can be an important investigative tool. (Kirill Kudryavtsev/AFP via Getty Images)

In what may be the first case of its kind, Robert Williams, a Black man in Detroit, was wrongly arrested after facial recognition software misidentified him as a shoplifting suspect. Critics say the case highlights the potential for racial bias when artificial intelligence is used in policing.

"The only reason, in my opinion, that this is the first case that we're hearing about is because this technology is typically used in secret," said Phil Mayor, senior staff attorney with the American Civil Liberties Union of Michigan, which represented Williams.

"What's remarkable about this case, I think, is not that there was a false recognition, but rather that the individual was told," said Mayor, who spoke with The Current guest host Rosemary Barton.

The prevalence of facial recognition systems, which use artificial intelligence to match images of people with images of suspects, is growing in the U.S. and Canada. Several law enforcement agencies, including the RCMP and regional forces such as the Toronto Police Service, have said they have used the technology in the past.

Recent studies have found that while the systems are fairly accurate at identifying Caucasian men, they can be faulty for other demographics. A recent study from the National Institute of Standards and Technology found that false positives are 10 to 100 times more likely among African American and Asian people.

"The science on this is really quite clear: that this technology is flawed and that it is particularly bad at identifying faces of people from communities of colour and Black faces in particular, and Black female faces even more so," Mayor added.

In the Detroit case, first reported by the New York Times, Williams says that he received a phone call telling him to report to police for arrest. Thinking it was a prank call, he told them to serve him a warrant at home. When Williams pulled into his driveway on a Thursday this past January, police surrounded his car and arrested him on his front lawn, in view of his wife and children, without explanation, Mayor said.

"The officers refused to give him any facts about what he was accused of, and he didn't find out anything further about what he was accused of until after he spent a night in a filthy, overcrowded jail and was interrogated the next day," the lawyer added.

On Jan. 23, 2020, the case against Williams was dismissed without prejudice, meaning he could still be charged at a later date. In a statement to the New York Times, the Wayne County prosecutor's office apologized for detaining Williams, and stated that he can have his fingerprint data and his case expunged from his record.

Used as a tool

Daniel Castro, vice-president of the Information Technology and Innovation Foundation in Washington, D.C., who supports the technology, says that while the circumstances around the case of Robert Williams are "inexcusable" and involved "so many human failures," facial recognition systems can be responsibly used to identify suspects.

"When you have police investigating crimes, they're often trying to identify suspects, witnesses, even sometimes victims, and this has traditionally been a very manual and slow process," he said.


Castro adds that facial recognition algorithms are meant to be a tool for police officers to use in addition to traditional policing techniques.

"What the technology does is it simply identifies potential matches, and those potential matches have been given to investigators who should be following up on those leads," he said. 

"They're not given to them and saying this person is guilty."
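Castro's description of the software as a generator of investigative leads, rather than verdicts, can be illustrated with a minimal sketch. Everything here is hypothetical: the embeddings, names, similarity threshold and top-k cutoff are illustrative values, not any vendor's actual system.

```python
import numpy as np

def top_candidates(probe, gallery, names, threshold=0.6, k=3):
    """Return up to k gallery identities whose face embeddings are most
    similar to the probe image's embedding, as leads for follow-up only.

    probe: 1-D face embedding; gallery: 2-D array, one embedding per row;
    names: identity label for each gallery row.
    """
    # Cosine similarity between the probe and every gallery embedding.
    probe = probe / np.linalg.norm(probe)
    gal = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = gal @ probe

    # Rank by similarity and keep only candidates above the threshold:
    # the output is a ranked list of possible matches, never a single
    # "this person is guilty" answer.
    order = np.argsort(sims)[::-1][:k]
    return [(names[i], float(sims[i])) for i in order if sims[i] >= threshold]
```

The threshold is the kind of control Castro alludes to: set too low, weak matches surface as leads; and as the NIST findings suggest, a score that is reliable for one demographic may produce far more false positives for another.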

Technology companies, including Amazon, Microsoft and IBM, have halted or temporarily suspended sales of their facial recognition systems to police forces in light of concerns over racial bias.

With a wide range of systems available on the market, Castro says that the accuracy of facial recognition technology depends on the vendor. Rather than eliminating the systems altogether, Castro believes that there need to be stronger controls on what systems police can use, and greater transparency for citizens.

"If you look at the top performing ones, they're highly accurate. They're highly accurate in ways that surpass human recognition. So they're less biased than humans and they're more accurate than humans," he told Barton.

"What you see, even in Mr. Williams' case, is that it was the human biases there of not seeing the results and not recognizing that it was clearly a different person that led to his arrest," he added.

'It's very sensitive data'

New York University's AI Now Institute tech fellow Inioluwa Deborah Raji says that the infancy of the technology means it may be too soon for police forces to rely on it for arrests — and that its use could lead to more cases like Williams's.

"Given the current prevalence of the tool, we can see already that there's going to be definitely, you know, more false arrests, more mistakes pulling people into these systems," Raji, who researches racial bias in facial recognition algorithms, told The Current.


Asked whether facial recognition could help reduce human bias in police work, Raji pushed back.

"With something like facial recognition providing a lead, it's a justification in many cases for police to target specific minorities," she said, adding that police may use the systems to confirm a hunch or suspicion in absence of evidence.

Raji adds that police forces have been known to misuse the technology. She points to a 2019 report from the Georgetown Law Center on Privacy and Technology that found several examples of misuse, including the matching of potential suspects to a sketch or computer-generated composite image.

Also concerning, she says, is the use of facial recognition technology beyond law enforcement, by governments for intelligence and immigration applications.

"I like to think about it as, if we uploaded our fingerprints as our profile pictures on the internet, we would be horrified," she said.

"A face is that equal level of identifiable biometric…. It's very sensitive data."


Written by Jason Vermes. Produced by Sarah-Joyce Battersby, Alison Masemann and Paul MacInnis.
