Photo by Tyler Lastovich from Pexels

Siri won’t be so sneaky about snooping anymore.

That’s the gist of a new policy announcement and accompanying apology from Apple this morning. It’s a move that could pressure its fellow tech giants Amazon and Google to follow suit.

Apple says it will no longer retain audio recordings of users interacting with its Siri voice assistant unless they opt in. And when they do, only Apple’s own employees, not contractors, will review the audio samples as part of the company’s efforts to monitor and improve the quality of Siri’s responses.

The announcement follows a series of reports revealing that Apple, Google and Amazon had assigned teams of people, in some cases contractors rather than direct employees, to review audio clips of users interacting with their voice assistants, unbeknownst to those users. The resulting outcry over the privacy invasion led each company to reconsider its policies. Apple and Google have both suspended their human-review programs pending internal reviews.

Amazon addressed the issue previously by giving users the ability to opt out of voice recording and “manual review” of their interactions with its Alexa voice assistant, while subtly discouraging them from doing so with a warning that “voice recognition and new features may not work well” if they opt out.

Apple, with this morning’s announcement, goes further by saying it will no longer retain audio recordings by default, instead requiring users to opt in if they want to participate. Here is Apple’s summary of the changes it plans to make:

As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:
  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.

Apple suspended the program after the Guardian reported in July that contractors reviewing Siri recordings for quality control regularly heard “confidential medical information, drug deals, and recordings of couples having sex.”
