
Body cam maker to study facial recognition risks and benefits

Posted at 7:43 PM, Apr 27, 2018; last updated 9:43 PM ET, Apr 27, 2018

DENVER -- Body cameras are a critical but controversial tool for police departments nationwide.

That controversy is likely to ramp up as the technology improves, specifically facial recognition, which is already in use in China.

Here in the U.S., one of the top manufacturers, Axon, has formed an ethics board to look at the risks and benefits of using facial recognition on body cams.

That move is raising some concerns.

"It's unavoidable that it will make some mistakes," said Laura Moy, Deputy Director at the Center on Privacy and Technology at Georgetown Law. "If you have software identifying people out on the street mistakenly, or erroneously, to an officer who is armed with a lethal weapon, then that could result in lethal mistakes being made."

Moy told Denver7 by phone that the biggest problem with facial recognition is that "it is not 100 percent accurate."

The Center on Privacy and Technology has joined 38 other civil rights and civil liberties groups that signed a letter urging Axon's ethics board to assert that:

  • Certain products are categorically unethical to deploy
  • Robust ethical review requires centering the voices and perspectives of those most impacted by Axon’s technologies
  • Axon must pursue all possible avenues to limit unethical downstream uses of its technologies
  • All of Axon’s digital technologies require ethical review

The letter states that "law enforcement in this country has a documented history of racial discrimination. Some agencies have routinely and systematically violated human and constitutional rights. Some have harassed, assaulted, and even killed members of our communities. These problems are frequent, widespread and ongoing."

The letter goes on to say, "Because Axon's products are marketed and sold to law enforcement, they sometimes make these problems worse. For example, Axon's body-worn camera systems, which should serve as transparency tools, are now being reduced to powerful surveillance tools that are concentrated in heavily policed communities."

Several police departments in Colorado use body cams.

Moy said the software doesn't tell the user that the person on the camera is the person they're looking for; it tells them the person on camera looks like the person they're looking for.
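
Put another way, systems like this typically return a similarity score that is compared against a threshold, not a yes-or-no identification. The short Python sketch below is purely illustrative and describes no particular vendor's product; the embeddings, names, and threshold are invented for the example.

```python
# Illustrative sketch only: this is not Axon's software. It shows, in the
# simplest terms, why a face matcher reports that someone "looks like" a
# person of interest rather than confirming who they are.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def candidate_matches(probe, gallery, threshold=0.8):
    """Return enrolled identities whose embeddings score above the threshold.

    The result is a ranked list of (name, similarity) pairs: possible
    look-alikes for a human to review, not a confirmed identification.
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted([(n, s) for n, s in scored if s >= threshold],
                  key=lambda pair: pair[1], reverse=True)

# Made-up three-dimensional "embeddings"; real systems use hundreds of dimensions.
gallery = {"person_a": [0.9, 0.1, 0.2], "person_b": [0.1, 0.8, 0.5]}
probe = [0.85, 0.15, 0.25]
print(candidate_matches(probe, gallery))  # [('person_a', 0.996...)]
```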

Moy also mentioned the possibility of bias.

"There could be differences in error rates across different demographic groups," she said. "In a recent MIT study guessing whether the person in a photo was a man or a woman, there were much higher error rates when looking at faces of dark-skinned women than for any other demographic group."

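One way to see the kind of disparity Moy describes is to break classification errors out by demographic group. The sketch below uses made-up records, not the MIT study's data, to show the basic calculation.

```python
# Illustrative sketch with invented records, not data from the MIT study:
# compute the misclassification rate separately for each demographic group.
from collections import defaultdict

def error_rate_by_group(records):
    """records: dicts with 'group', 'predicted', and 'actual' keys."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["predicted"] != r["actual"]:
            errors[r["group"]] += 1
    return {group: errors[group] / totals[group] for group in totals}

sample = [
    {"group": "darker-skinned women", "predicted": "man",   "actual": "woman"},
    {"group": "darker-skinned women", "predicted": "woman", "actual": "woman"},
    {"group": "lighter-skinned men",  "predicted": "man",   "actual": "man"},
    {"group": "lighter-skinned men",  "predicted": "man",   "actual": "man"},
]
print(error_rate_by_group(sample))
# {'darker-skinned women': 0.5, 'lighter-skinned men': 0.0}
```
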
Axon's Director of Artificial Intelligence and Machine Learning, Moji Solgi, issued a statement saying: "At this point in time, we are not working on facial recognition technology to be deployed on body cameras. While we do see the value in this future capability, we also appreciate the concerns around privacy rights and the risks associated with misidentification of individuals. Accordingly, we have chosen to first form an AI Ethics Board to help ensure we balance both the risks and benefits of deploying this technology. At Axon we are committed to ensuring that the technology we develop makes the world a better and safer place."