
Google and Apple suspend contractor access to voice recordings

Apple and Google have announced that they will limit the way audio recorded by their voice assistants, Siri and Google Assistant, is accessed internally by contractors.

Let’s start with Apple.

Apple’s privacy headache began a week ago when The Guardian ran a story revealing that contractors “regularly hear” all sorts of things Apple customers would probably rather they didn’t, including sexual encounters, business deals, and patient-doctor chats.

Despite Apple’s protestations that such recordings are pseudonymised and only accessed to improve Siri’s accuracy, the whistleblower who spoke to the newspaper was adamant that in some cases:

These recordings are accompanied by user data showing location, contact details, and app data.

Apple now says it has suspended the global programme under which voice recordings were being accessed in this way while it conducts a review.

It’s not clear how long this will remain in force, nor whether the company will adjust the time period it keeps recordings on its servers (currently between six months and two years).

By an interesting coincidence, Google finds itself in a similar fix. Germany’s privacy regulator recently started asking questions after Belgian broadcaster VRT ran a story last month on contractors listening to Google Assistant recordings. Google’s privacy fig leaf:

We don’t associate audio clips with user accounts during the review process, and only perform reviews for around 0.2% of all clips.

Nevertheless, Google now says it has also suspended access to recordings in the EU for three months.

It was Amazon that started this ball rolling in April, when a Bloomberg report revealed that – yes – recordings stored by its Alexa voice assistant were being accessed by contractors.

Spot the pattern?

There’s a tangle of issues here, the first of which is the way these companies explain how they store and access voice recordings.

They say they only access fractions of a percent of the recordings stored on their servers, but that could still be a lot of recordings. It’s also not clear that these recordings are always as anonymised as the companies claim.

Perhaps the fundamental issue is that the only way to improve the accuracy of voice assistants is to manually tune how they understand what users are asking them to do, or not to do.

That requires company staff – including contractors – to access real voice interactions taken from a growing range of devices, including smart speakers, smartphones, and the Apple Watch.

It’s an inherent part of developing this type of device and there’s no obvious way around an issue that was always likely to catch tech companies out at some point.

The most likely response from Apple, Google and Amazon is some kind of re-drafting of how they explain all of the above to a public that is growing more sceptical about the ethics of grabbing lots of personal data on the assumption that companies will pay attention to privacy.
