Apple has formally apologized for keeping recordings of accidentally triggered Siri voice assistant sessions and for letting its human contractors listen to them.
From the statement it published on Wednesday:
We realize we haven’t been fully living up to our high ideals, and for that we apologize.
To make it up to us, Apple says it’s making three changes to its Siri privacy policy. The company will:
- No longer hang on to audio recordings of users talking with their Siri voice assistants. Instead, it plans to rely solely on computer-generated transcripts to improve its speech recognition, work that includes identifying recordings triggered by mistake. Unfortunately, as a whistleblower revealed in July, the rate of accidental Siri activations is quite high, particularly on the Apple Watch and the company’s HomePod smart speaker. Siri’s mistakenly hearing a “wake word” has led to recordings of private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and more.
- Make analysis of recordings an opt-in function. Apple says it hopes that many people will choose to help Siri improve by learning from the audio samples of their requests, “knowing that Apple respects their data and has strong privacy controls in place.” Those who opt to participate can change their minds at any time, Apple says.
- Guarantee that “only Apple employees” get to listen to audio samples of the Siri interactions. It’s unclear who else Apple had been using to analyze those recordings.
Over the past few months, news has emerged about human contractors working for the three major voice assistant vendors – Apple, Google and Amazon – and listening to us as they transcribe audio files.
We’ve also heard about Microsoft doing the same thing with Skype, Cortana and Xbox, and Facebook doing it with Messenger.
In the case of Messenger, we know that Facebook was getting help from at least one third-party transcription service until it put the work on pause. It may be that Apple was likewise using third parties and now plans to cut them out of the work. I’ve asked Apple for clarification and will update the story if I hear back.
Apple also says its team will work to delete any recording that’s determined to have been made by an inadvertent trigger of Siri. Those accidental recordings were the main source of sensitive material, according to The Guardian, which initially reported on the issue.
Update on the upshots
With this apology and these privacy changes, Apple joins the roster of companies that are, more or less, changing how they go about improving their voice recognition. So far, we’ve got:
- Earlier this month, in the aftermath of media reports, both Google and Apple suspended contractors’ access to voice recordings and the grading programs through which those recordings were reviewed. “We are committed to delivering a great Siri experience while protecting user privacy,” Apple told news outlets at the time. Under its previous policy, Apple kept random Siri recordings for up to six months, after which it would strip identifying information from a copy that it kept for two years or more.
- Amazon said earlier this month that it will let users opt out of human review of Alexa recordings, though users still have to go in and periodically delete those recordings themselves. Here’s how.
- Facebook said that it had “paused” its voice program with Messenger. It didn’t say if or when it might resume.
- After the reports about Skype and Cortana recordings, Microsoft updated its privacy policy to be more explicit about humans potentially listening to recordings. It’s still getting humans to review that audio, however. The company’s privacy policy now reads…
Our processing of personal data for these purposes includes both automated and manual (human) methods of processing.
Microsoft also has a dedicated privacy dashboard page where you can delete voice recordings.
- As far as the Xbox listening goes, a Microsoft spokesperson told Motherboard last week that the company has, for the most part, recently stopped listening to Xbox audio, but that it has always been upfront about the practice in its terms of service:
We stopped reviewing any voice content taken through Xbox for product improvement purposes a number of months ago, as we no longer felt it was necessary, and we have no plans to re-start those reviews. We occasionally review a low volume of voice recordings sent from one Xbox user to another when there are reports that a recording violated our terms of service and we need to investigate. This is done to keep the Xbox community safe and is clearly stated in our Xbox terms of service.
Apple said in its statement that it plans to resume grading Siri recordings later this autumn, after it’s included the new opt-in option.