I am listening to the soundtrack for Blade Runner 2049. This Orwellian tale right here in California fits the dystopian mood of the music. This technology is terrifying and fascinating.
SACRAMENTO — San Francisco Assemblyman Phil Ting has never been arrested, but facial recognition technology developed by Amazon links his image to a jailhouse mugshot.
Ting is one of 26 state legislators who were wrongly identified as suspected criminals by the technology, according to results of a test released Tuesday by the American Civil Liberties Union of Northern California.
Matt Cagle, a technology and civil liberties attorney at the ACLU, said the organization ran its experiment using Amazon’s Rekognition software and screened 120 lawmakers’ images against a database of 25,000 mugshots.
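The ACLU's reported numbers work out to roughly one false match for every five lawmakers screened, as a quick arithmetic check shows:

```python
# Arithmetic check of the ACLU's reported figures:
# 26 false matches out of 120 lawmakers' images screened.
false_matches = 26
lawmakers_screened = 120

false_match_rate = false_matches / lawmakers_screened
print(f"False-match rate: {false_match_rate:.1%}")  # about 21.7%, roughly 1 in 5
```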
Cagle said the technology poses “a public safety hazard and a threat to our fundamental rights” if used in police cameras.
“Body cameras were promised as a police accountability tool, not as a surveillance system,” he said. “People should be able to walk down the street without having their face logged into a government database.”
The program falsely matched about 1 in 5 of the lawmakers tested, including Ting and two fellow San Francisco Democrats, Assemblyman David Chiu and state Sen. Scott Wiener, the ACLU said.
Critics of the software, including the ACLU, said the findings show the need to block law enforcement from using the technology in officers’ body cameras.
“Clearly, this software is faulty,” Ting said. “It really should not be used by any law enforcement agency at this point. Body cameras are there to build trust, not to tear it down.”
Ting’s bill, which would prohibit police from using facial recognition in body cameras, is opposed by a host of law enforcement groups, including the California Peace Officers’ Association. They argue that the technology could help identify criminals at large events, such as the 2028 Summer Olympics in Los Angeles.
Shaun Rundle, the association’s deputy director, said no California police agencies are using the technology now. But, he said, police should have a chance to show they could use it correctly.
“We’re concerned that (Ting’s bill) is an attempt to wipe out something that could identify repeat offenders, could solve cold cases and old crimes and deter future crime,” Rundle said. Facial recognition technology “has the potential to be a real crime-solving tool.”
Supporters say police could use the technology like a red-flag system, to alert officers if an image captured on a body camera matches that of a suspect or someone in an arrest database.
However, Microsoft announced in April that it had denied an unnamed California law enforcement agency’s request to buy its recognition technology for body cameras.
Amazon said it believes its technology, if used correctly, can help identify criminals and find missing children. The company recommends on its website that law enforcement agencies using its software “manually review the match before making any decision to interview or detain the individual.”
Amazon suggests police agencies set the software’s confidence threshold setting — a measurement of how accurate the software considers its suggested matches — at 99%. Cagle said the ACLU used the program’s default 80% setting in its experiment with California lawmakers’ photos.
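The practical effect of that threshold setting can be sketched in a few lines of Python. This is a hypothetical illustration of how a confidence cutoff filters candidate matches, not Amazon's actual code or API; the mugshot IDs and scores are invented:

```python
# Minimal sketch (not Rekognition's implementation) of how a confidence
# threshold filters candidate face matches. All scores are hypothetical.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold.

    candidates: list of (mugshot_id, confidence_percent) pairs.
    threshold: minimum confidence, in percent, required to report a match.
    """
    return [(mid, score) for mid, score in candidates if score >= threshold]

# Hypothetical scores the software might return for one probe image.
candidates = [
    ("mugshot_0412", 84.2),
    ("mugshot_1187", 91.5),
    ("mugshot_2290", 99.3),
]

# At the default-style 80% setting, all three candidates are reported.
print(filter_matches(candidates, 80))

# At the recommended 99% setting, only the strongest match survives.
print(filter_matches(candidates, 99))
```

Lowering the threshold surfaces more candidates but also more false matches, which is why the gap between the 80% setting the ACLU tested and the 99% setting Amazon recommends matters.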
“We continue to advocate for federal legislation of facial recognition technology to ensure responsible use,” an Amazon spokesman said in an email. The company wants Congress to mandate that government agencies using facial recognition follow guidelines to protect civil rights.
Critics of the technology say it raises civil rights concerns in part because many of the people it misidentifies are people of color.
More than half the 26 California lawmakers who were falsely identified in the ACLU’s experiment are people of color, Ting’s office said. Ting said that makes the technology especially dangerous for African Americans, Latinos and Asian Americans.
“This could lead to more false arrests in those particular communities,” he said.
Last year, the ACLU ran a similar experiment using images of members of Congress. It found that Amazon’s program incorrectly matched 28 of them with suspected criminals.