Amazon became the latest tech giant to let users of its voice assistant opt out of human review of their voice recordings, after similar announcements from Apple and Google.
The move Friday afternoon followed revelations from Bloomberg and others about an Amazon team consisting of thousands of people who listen to Alexa voice recordings as part of a program designed to improve the company’s voice assistant.
It’s the latest sign of growing public awareness of the listening and recording capabilities of Amazon Echo speakers and smart home devices from other tech companies.
Amazon rolled out the change Friday in the settings of the Alexa app. Previously, users were able to change a privacy setting to prevent the company from using voice recordings to help develop new Alexa features. Now, that same opt-out also lets users prevent humans from listening to the recordings to improve existing Alexa features. Here’s the company’s statement on the issue.
“We take customer privacy seriously and continuously review our practices and procedures. For Alexa, we already offer customers the ability to opt-out of having their voice recordings used to help develop new Alexa features. The voice recordings from customers who use this opt-out are also excluded from our supervised learning workflows that involve manual review of an extremely small sample of Alexa requests. We’ll also be updating information we provide to customers to make our practices more clear.”
To reach the opt-out, open the menu in the Alexa app, go to the privacy settings, and select “Manage How Your Data Improves Alexa.” Here’s what the setting looks like, including the new language about opting out of “manual review,” aka people listening to what you say.
Amazon didn’t address GeekWire’s question about whether it has been contacted by regulators regarding human review of voice recordings. The changes come amid heightened government scrutiny of tech giants, by the U.S. Justice Department and others, over issues including privacy and competition.
Apple made a similar announcement this week after the Guardian reported that contractors reviewing Siri recordings for quality control “regularly hear confidential medical information, drug deals, and recordings of couples having sex.” Apple says it’s working on a feature to let Siri users opt out of human review, and that it has suspended the program in the meantime.
Along the same lines, Google said it “paused” human review of Google Assistant recordings after a contractor leaked more than 1,000 recordings to VRT News in Belgium.