Jesus' Coming Back

Facial Recognition Technology Test Has Massive Failure, Matches 20% Of California Legislature With Wanted Criminals

Artificial intelligence technologies are going to be used extensively in the future. One such application is identifying criminal suspects' faces in a crowd. But how accurate will these technologies be, and will they unintentionally implicate innocent people?

In an interesting and ironic twist, a test of a criminal facial recognition system run against the California Legislature mismatched one in five faces, wrongly associating 20% of the legislature with people wanted for criminal activity, according to a report:

California Assemblyman Phil Ting has never been arrested, but he was recently mistaken for a criminal.

He’s not surprised.

Ting (D-San Francisco), who authored a bill to ban facial recognition software from being used on police body cameras, was one of 26 California legislators who were incorrectly matched with a mug shot in a recent test of a common face-scanning program by the American Civil Liberties Union.

About 1 in 5 legislators was erroneously matched to a person who had been arrested when the ACLU used the software to screen their pictures against a database of 25,000 publicly available booking photos. Last year, in a similar experiment done with photos of members of Congress, the software erroneously matched 28 federal legislators with mug shots.
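The "1 in 5" figure can be checked with simple arithmetic. As a back-of-the-envelope sketch, assume the ACLU screened photos of all 120 members of the California Legislature (80 Assembly members plus 40 senators); the article itself does not state the exact number of photos used:

```python
# Rough check of the "1 in 5" claim, assuming all 120 members
# of the California Legislature were screened (not stated in the article).
members_screened = 120   # 80 Assembly + 40 Senate (assumed)
false_matches = 26       # legislators wrongly matched to booking photos

false_match_rate = false_matches / members_screened
print(f"False match rate: {false_match_rate:.1%}")  # roughly 1 in 5
```

Under that assumption the rate comes out to about 21.7%, consistent with the article's "about 1 in 5."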

The results highlight what Ting and others said is proof that facial recognition software is unreliable. They want California law enforcement banned from using it with the cameras they wear while on duty.

“The software clearly is not ready for use in a law enforcement capacity,” Ting said. “These mistakes, we can kind of chuckle at it, but if you get arrested and it’s on your record, it can be hard to get housing, get a job. It has real impacts.”

Ting’s proposal, Assembly Bill 1215, could soon be on the governor’s desk if it passes the Senate. The bill is sponsored by the ACLU, and the civil rights organization hopes its recent test will grab attention and persuade legislators to put the technology on hold.

There is little current federal regulation of facial recognition technology. Recently, members on both sides of the aisle in Congress held oversight hearings and there has been a strong push by privacy advocates for federal action. But concrete measures have yet to materialize.

That has left states and local jurisdictions to grapple with the complex technology on their own. New Hampshire and Oregon already prohibit facial recognition technology on body-worn cameras, and San Francisco, Oakland and Somerville, Mass., also recently enacted bans for all city departments as well as police.

“I think it’s extremely important for states to be regulating the use of technology by police,” said Barry Friedman, a privacy expert and professor of law at New York University. “It is the Wild, Wild West without a regulatory scheme. Regulation is what we need.”

Friedman serves on an ethics committee for Axon, one of the largest manufacturers of body-worn cameras. The company has publicly said it will not put facial recognition technology on its cameras because it doesn’t have confidence in its reliability. Microsoft, which makes a facial recognition product, also recently said it had refused to sell it to a California law enforcement agency. The moves mark an unusual position from corporations seeking boundaries for their products.

“The body camera technology is just very far from being accurate,” Friedman said. “Until the issues regarding accuracy and racial bias are resolved, we shouldn’t be using it.”

But other companies are moving ahead with facial recognition, including Amazon, developer of Rekognition, the software used in the ACLU tests. Government agencies including ICE have also reportedly used the technology, culling through databases of driver’s licenses.

Proponents of the technology contend it could be an important law enforcement tool, especially when policing large events or searching for lost children or elderly people. The bill is opposed by many law enforcement groups.

Amazon said it could not immediately comment on the most recent ACLU test, but has previously disputed that the Rekognition software was unreliable, questioning the group’s methods of scanning members of Congress. In its developer guide, Amazon recommends using a 99 percent confidence threshold when matching faces, and criticized the ACLU for using a lesser bar — the factory setting for the software, according to Matt Cagle, an attorney with the Northern California chapter of the ACLU — when testing it.
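The dispute above turns on a single parameter: the confidence threshold below which candidate matches are discarded. A minimal, purely illustrative sketch of that trade-off follows; the scores and mug-shot names are invented for illustration and are not real Rekognition output or API calls:

```python
# Illustrative only: how a match-confidence threshold changes results.
# Scores are made-up similarity percentages, not real Rekognition output.
candidate_matches = [
    ("mugshot_0412", 99.2),
    ("mugshot_1187", 93.5),
    ("mugshot_2044", 81.0),
]

def accept_matches(matches, threshold):
    """Keep only candidate faces at or above the confidence threshold."""
    return [face for face, score in matches if score >= threshold]

# At a lower, default-style threshold, all three faces "match".
print(accept_matches(candidate_matches, 80.0))
# At the 99 percent threshold Amazon recommends, only the strongest match survives.
print(accept_matches(candidate_matches, 99.0))
```

The point of contention is exactly this: a lower default threshold returns more matches, and therefore more false positives, than the 99 percent bar Amazon recommends for law enforcement use.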

The Ting proposal would make California the largest state to ban the software, potentially having a “ripple” effect, Cagle said. The bill would ban not just facial recognition, but other “biometric surveillance systems” such as those that analyze a person’s gait or log tattoos.

Critics contend that the software is particularly problematic when it comes to identifying women, people of color and young people. Ting said those demographics were especially troubling to him, since communities of color have historically often been excessively targeted by police, and immigrant communities are feeling threatened by federal crackdowns on illegal immigration.

Police body cameras, he said, have gained popularity in recent years as a police accountability measure in the wake of shootings of black and brown men across the country, including the 2014 death of Michael Brown in Ferguson, Mo., which garnered national attention for the issue.

Transforming body cameras from an accountability measure to a surveillance tool would undermine their purpose, Ting said.

“Body cameras were really deployed to build trust between law enforcement and communities,” said Ting. “Instead of trust, what you are getting is 24/7 surveillance.”

If these are the results of a controlled test, what will happen when such systems are put to public use?

One can only imagine.

