Sophos News

Apple apologizes for humans listening to Siri clips, changes policy

Apple has formally apologized for keeping accidentally triggered Siri voice assistant recordings and letting its human contractors listen to them.

From the statement it published on Wednesday:

We realize we haven’t been fully living up to our high ideals, and for that we apologize.

To make it up to us, Apple says it’s making these three changes to its Siri privacy policy. Apple will:

- No longer retain audio recordings of Siri interactions by default. Instead, it will use computer-generated transcripts to help Siri improve.
- Let users opt in to help Siri improve by learning from audio samples of their requests. Those who opt in will be able to opt out at any time.
- Allow only Apple employees – not contractors – to listen to audio samples from customers who opt in.

Over the past few months, news has emerged about human contractors working for the three major voice assistant vendors – Apple, Google and Amazon – and listening to us as they transcribe audio files.

We’ve also heard about Microsoft doing the same thing with Skype, Cortana and Xbox, and Facebook doing it with Messenger.

In the case of Messenger, we know that Facebook was getting help from at least one third-party transcription service until it put the work on pause. It may be that Apple was likewise using third parties and now plans to cut them out of the work. I’ve asked Apple for clarification and will update the story if I hear back.

Apple also says its team will work to delete any recording that’s determined to be made by an inadvertent trigger of Siri. Those accidentally made clips were the main source of sensitive recordings, according to The Guardian, which initially reported on the issue.

Update on the upshots

We can add Apple’s apology and privacy changes to the growing roster of companies that are, more or less, changing how they go about improving their voice recognition.

Apple said in its statement that it plans to resume grading Siri recordings later this autumn, once the new opt-in option is in place.
