Amazon’s latest update to its Rekognition software is renewing an outcry from civil rights groups over facial recognition technology. Amazon said Tuesday that its algorithm can now detect fear from the expression on a person’s face.
Fear detection, along with detection of seven other emotions, is now part of the default set of capabilities for customers who use Amazon Rekognition to identify people from databases of images and videos. Those customers include several law enforcement agencies, which is a key concern for civil rights groups.
The American Civil Liberties Union is aggressively lobbying for legislation to slow or stop police adoption of facial recognition technology. The ACLU has conducted several studies that show Rekognition misidentifying people of color and women more frequently than white men.
Amazon said in the announcement this week that it has “improved accuracy for emotion detection (for all 7 emotions: ‘Happy’, ‘Sad’, ‘Angry’, ‘Surprised’, ‘Disgusted’, ‘Calm’ and ‘Confused’) and added a new emotion: ‘Fear’.”
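Rekognition returns its emotion predictions as a list of labels, each with a confidence score. The sketch below, which uses a hand-written sample payload rather than real API output, shows how a client might read those predictions; in practice the service is called through the AWS SDK (for example, boto3's Rekognition client, requesting full face attributes).

```python
# Hand-written sample modeled on a Rekognition DetectFaces-style response;
# the scores and face below are hypothetical, not real API output.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 62.3},
                {"Type": "FEAR", "Confidence": 21.7},
                {"Type": "CONFUSED", "Confidence": 9.1},
            ]
        }
    ]
}

def top_emotion(face_detail):
    """Return the emotion label with the highest confidence score."""
    return max(face_detail["Emotions"], key=lambda e: e["Confidence"])["Type"]

for face in sample_response["FaceDetails"]:
    print(top_emotion(face))  # CALM
```

Note that even the top-scoring label here is far from certain, which is part of what critics object to when such scores inform policing decisions.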
Amazon’s update to Rekognition comes one day after the ACLU released a new study tied to a bill under consideration in the California legislature. The ACLU says that facial recognition software falsely matched 26 members of the California legislature with mugshots from a public database. The goal of the study was to encourage lawmakers to approve a bill that would ban facial recognition software in police-worn body cameras.
“These cameras were promised to communities for officer accountability and transparency, not for surveillance,” said ACLU of California attorney Matt Cagle at a press conference.
An Amazon spokesperson disputed the ACLU’s findings and provided the following statement:
“The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines. As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we’ve shared our specific suggestions for this both privately with policy makers and on our blog.”
The confidence threshold is a setting that Rekognition customers can adjust based on how accurate they want the match to be. If, for example, a Rekognition user wanted to winnow a field of 1,000 images to 100, she might use a lower confidence threshold. Amazon recommends a 99 percent confidence threshold for use cases where civil rights might come into play, like law enforcement. Amazon says the ACLU deliberately used a low threshold to advance its political agenda.
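The effect of the threshold can be illustrated with a minimal sketch (this is not Amazon's implementation, and the candidate matches below are hypothetical): lowering the threshold admits more, and less reliable, candidate matches, while the 99 percent setting Amazon recommends keeps only the strongest ones.

```python
# Hypothetical candidate matches with confidence scores (0-100).
SAMPLE_MATCHES = [
    {"face_id": "a1", "confidence": 99.4},
    {"face_id": "b2", "confidence": 97.8},
    {"face_id": "c3", "confidence": 85.2},
    {"face_id": "d4", "confidence": 80.1},
]

def filter_matches(matches, threshold):
    """Keep only matches at or above the given confidence threshold."""
    return [m for m in matches if m["confidence"] >= threshold]

# A permissive 80% threshold returns all four candidates;
# the recommended 99% threshold returns only one.
print(len(filter_matches(SAMPLE_MATCHES, 80.0)))  # 4
print(len(filter_matches(SAMPLE_MATCHES, 99.0)))  # 1
```

This is the crux of the dispute: the same gallery and the same algorithm produce very different numbers of "matches" depending on the threshold the operator chooses.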
But Shankar Narayan, Technology and Liberty Project Director for the ACLU of Washington, says the organization would take issue with facial recognition even if it were flawless.
“Even perfectly accurate face surveillance systems are dangerous, supercharging the government’s ability to track and control people without transparency or accountability,” he told GeekWire.
Narayan laid out the following scenario for how facial recognition could be dangerous in the hands of law enforcement:
“Imagine, for example, a police body camera equipped with the kind of fear and anger detection now included in Amazon’s product, on the basis of which an officer may have to make a split-second decision on whether or not to use deadly force. Given the many studies showing that face surveillance products are biased both for identifying people and gauging their emotions, such an application would be dangerous and would reinforce existing biases in policing.”
The ACLU isn’t the only group sounding the alarm over Rekognition’s new fear-detection abilities. Evan Greer of the digital rights advocacy group Fight for the Future accused Amazon of building “the dystopian surveillance state of our nightmares.”
“Facial recognition already automates and exacerbates police abuse, profiling, and discrimination,” she said in a statement. “Now Amazon is setting us on a path where armed government agents could make split-second judgments based on a flawed algorithm’s cold testimony. Innocent people could be detained, deported, or falsely imprisoned because a computer decided they looked afraid when being questioned by authorities.”
Concerns surrounding facial recognition have already compelled some governments to step in. San Francisco, Somerville, Mass., and Oakland, Calif., have passed laws banning some uses of facial recognition technology.
A bill in Amazon’s home state, Washington, would have implemented new guardrails for facial recognition technology, but it died in the state legislature last session. While the bill had support from Microsoft and Amazon, the ACLU fought it, claiming the regulations were too watered-down and permissive.
Amazon says it wants federal regulations to govern the nascent technology, but activists say they’re concerned that a federal law would be too weak to protect civil liberties.
Editor’s note: This story has been updated to correct Shankar Narayan’s name.