Amazon defended the law enforcement use cases for its facial recognition technology Friday in an effort to throw cold water on fears raised by civil rights activists.
Matt Wood, a leader on the Amazon Web Services machine learning team, published a blog post in response to criticism from the American Civil Liberties Union and other advocacy groups, which have been demanding the company stop selling its Rekognition software to police. In the post, Wood cautions that we “should not throw away the oven because the temperature could be set wrong and burn the pizza.”
Wood expressed skepticism about an experiment the ACLU conducted using Rekognition to compare headshots of the members of Congress with a database of 25,000 mugshots. According to Amazon, the test configured Rekognition to report matches at an 80 percent confidence threshold. The ACLU says Rekognition incorrectly matched 28 members of Congress to people pictured in arrest photos.
“The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus,” the ACLU said in a blog post.
In his response, Wood says Amazon recreated the ACLU experiment, comparing photos of members of Congress to a database of 850,000 faces with a 99 percent confidence threshold. Amazon says it saw a 0 percent misidentification rate, “despite the fact that we are comparing against a larger corpus of faces.”
“The default confidence threshold for Rekognition is 80%, which is good for a broad set of general use cases (such as identifying objects, or celebrities on social media), but it’s not the right one for public safety use cases,” Wood wrote. “The 80% confidence threshold used by the ACLU is far too low to ensure the accurate identification of individuals; we would expect to see false positives at this level of confidence.”
Amazon recommends a 99 percent confidence rating for use cases like law enforcement, where accurate facial recognition is critical. An Amazon spokesperson told GeekWire that “Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes.”
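The dispute comes down to how the threshold filters candidate matches: lowering it lets weaker, likelier-false matches through, while raising it keeps only the strongest. A minimal sketch of that filtering logic, using entirely hypothetical names and confidence scores (not real Rekognition output; in the actual AWS API the threshold is passed as a request parameter such as `SimilarityThreshold` on `CompareFaces`):

```python
# Hypothetical candidate matches a face-matching system might return,
# as (identity, confidence-percent) pairs. These scores are invented
# for illustration only.
candidate_matches = [
    ("person_a", 99.2),
    ("person_b", 91.5),
    ("person_c", 84.3),
    ("person_d", 80.7),
]

def matches_above(candidates, threshold):
    """Return only candidates whose confidence meets the threshold."""
    return [name for name, confidence in candidates if confidence >= threshold]

# At the 80% default threshold, every candidate above is reported,
# including the weaker matches most likely to be false positives.
print(matches_above(candidate_matches, 80.0))  # all four names

# At the 99% threshold Amazon recommends for law enforcement,
# only the strongest candidate survives.
print(matches_above(candidate_matches, 99.0))  # ['person_a']
```

The point of contention is that the threshold is a caller-supplied parameter: nothing in the system itself forces a police department to raise it from the 80 percent default to the 99 percent Amazon recommends.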
Update: The ACLU issued the following statement in response to Amazon’s blog post:
In its five stages of grief over its dangerous face surveillance product, Amazon is clearly stuck at denial. In a matter of 48 hours, Amazon has gone from its own system default of an 80 percent match rate to saying yesterday it should be 95 percent, and then saying today it should be 99 percent. At no time has Amazon taken any responsibility for the very grave impact that their face surveillance product has on real people.
Instead, Amazon is grasping at straws in an attempt to distract from critical civil rights issues. Amazon should take steps to fix the damage its ill-advised face surveillance product may have already caused and to prevent further harm. Amazon should respond to members of Congress. It should disclose every government agency that has already purchased this technology. And it should heed the calls of organizations and its own customers, employees, and shareholders and stop selling face surveillance to the government. The fact that Amazon has refused to address the very real threats its technology poses, let alone take these necessary actions, is further evidence of its disappointing state of denial – and the need for Congress to quickly step in with a moratorium.
The ACLU began sounding the alarm about police use of Rekognition in May, claiming that the technology can amplify racial biases. In June, leaders of various civil and immigrant rights groups delivered 150,000 signatures to Amazon’s Seattle headquarters from people demanding that Amazon stop selling Rekognition to police.