Naked Security

Listening in: Humans hear the private info Siri accidentally records

Apple Watch and HomePod have the highest rate of inadvertent recordings, a whistleblower says.

Have you ever asked Apple’s personal voice assistant, Siri, if it’s always listening to you?

If so, you presumably got one of its cutesy responses. To wit:

I only listen when you’re talking to me.

Well, not just when you’re talking to Siri, actually. These voice assistant devices get triggered accidentally all the time, according to a whistleblower who’s working as a contractor with Apple.

The contractor told The Guardian that the rate of accidental Siri activations is quite high, particularly on Apple Watch and the company’s HomePod smart speaker. Of all the audio captured by accidental triggers and sent to Apple, those two devices account for the most sensitive material, which human contractors then listen to and analyze: all manner of recordings that include private utterances of names and addresses.

The Guardian quoted the Apple contractor:

The regularity of accidental triggers on [Apple Watch] is incredibly high. The watch can record some snippets that will be 30 seconds – not that long, but you can gather a good idea of what’s going on.

The whistleblower says there have been “countless” instances of Apple’s Siri voice assistant mistakenly hearing a “wake word” and recording “private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters” and more. Those recordings often reveal identifying information, they said:

These recordings are accompanied by user data showing location, contact details, and app data.

If you aren’t muttering, “So, what’s new?” by this point, you haven’t been paying attention to the news about how much these devices are overhearing and how little the vendors are worrying about the fact that it’s a privacy violation.

Over the past few months, news has emerged about human contractors working for the three major voice assistant vendors – Apple, Google and Amazon – listening to us as they transcribe audio files. As a series of whistleblowers have reported, Google Assistant, Amazon Alexa and Siri have all been capturing audio that owners never intended to record, after their devices get triggered by acoustic happenstance: word sound-alikes, say, or people chattering as they pass by in the street outside.

Accidental recordings: “technical problem” or “privacy invasion?”

It’s all done to improve the vendors’ speech recognition capabilities, and identifying mistaken recordings is part of that. However, the whistleblower said, Apple instructs staff to report accidental activations “only as a technical problem”, with no specific procedures to deal with sensitive recordings. The contractor:

We’re encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content.

All the big voice assistant vendors are listening to us

First, it was whistleblowers at Amazon who said that human contractors are listening to us. Next it was Google, and now Apple has made it a trifecta.

Earlier this month, Belgian broadcaster VRT News published a report that included input from three Google insiders about how the company’s contractors can hear some startling recordings from its Google Assistant voice assistant, including those made from bedrooms or doctors’ offices.

With the help of one of the whistleblowers, VRT listened to some of the clips. Its reporters managed to hear enough to discern the addresses of several Dutch and Belgian people using Google Home, in spite of the fact that some of them never said the listening trigger phrases. One couple looked surprised and uncomfortable when the news outlet played them recordings of their grandchildren.

The whistleblower who leaked the Google Assistant recordings was working as a subcontractor to Google, transcribing the audio files for subsequent use in improving its speech recognition. He or she reached out to VRT after reading about how Amazon workers are listening to what you tell Alexa, as Bloomberg reported in April.

They’re listening, but they aren’t necessarily deleting: in June of this year, Amazon confirmed – in a letter responding to a lawmaker’s request for information – that it keeps transcripts and recordings picked up by its Alexa devices forever, unless a user explicitly requests that they be deleted.

“The amount of data we’re free to look through seems quite broad”

The contractor told the Guardian that he or she went public because they were worried about how our personal information can be misused – particularly given that Apple doesn’t seem to be doing much to ensure that its contractors are going to handle this data with kid gloves:

There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

Apple is subcontracting out. There’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].

The contractor wants Apple to be upfront with users about humans listening in. They also want Apple to ditch those jokey, and apparently inaccurate, responses Siri gives out when somebody asks if it’s always listening.

This is the response that Apple sent to the Guardian regarding the news:

A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.

Apple also said that a very small, random subset of daily Siri activations, less than 1%, is used for “grading” (in other words, quality control), and that the clips used are typically only a few seconds long.

For its part, Google also says that yes, humans are listening, but not much. Earlier in the month, after its own whistleblower brouhaha, Google said that humans listen to only 0.2% of all audio clips. And those clips have been stripped of personally identifiable information (PII) as well, Google said.

You don’t need an Apple ID to figure out who’s talking

The vendors’ rationalizations are a bit weak. Google has said that the clips its human employees are listening to have been stripped of PII, while Apple says that its voice recordings aren’t associated with users’ Apple IDs.

Those aren’t impressive privacy shields, for a few reasons. First off, Big Data techniques mean that data points that are individually innocuous can be enormously powerful and revealing when aggregated. That’s what Big Data is all about.

Research done by MIT graduate students a few years back to see how easy it might be to re-identify people from three months of credit card data, sourced from an anonymized transaction log, showed that all it took was 10 known transactions – easy enough to rack up if you grab coffee from the same shop every morning, park at the same lot every day and pick up your newspaper from the same newsstand – to identify somebody with a better than 80% accuracy.
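To make the aggregation point concrete, here is a minimal, hypothetical sketch. The pseudonyms, merchants and dates below are invented for illustration and have nothing to do with the MIT study’s actual dataset or methods; it simply shows how a handful of known observations can single out one person in an “anonymized” log:

```python
# Hypothetical illustration only: the pseudonyms, merchants and dates below
# are invented; this is not the MIT study's data or code.

# "Anonymized" log: pseudonym -> set of (merchant, date) transactions.
anonymized_log = {
    "user_4821": {("CoffeeCo", "2019-07-01"), ("ParkLot", "2019-07-01"),
                  ("Newsstand", "2019-07-02")},
    "user_7733": {("CoffeeCo", "2019-07-01"), ("GroceryMart", "2019-07-02")},
    "user_1109": {("ParkLot", "2019-07-01"), ("Newsstand", "2019-07-03")},
}

# A few things an observer happens to know about the target, e.g. they were
# seen buying coffee and paying for parking on 1 July.
known_observations = {("CoffeeCo", "2019-07-01"), ("ParkLot", "2019-07-01")}

# Keep only the pseudonyms whose history contains every known observation.
candidates = [pseudonym for pseudonym, transactions in anonymized_log.items()
              if known_observations <= transactions]

if len(candidates) == 1:
    print("Re-identified:", candidates[0])   # prints: Re-identified: user_4821
else:
    print(len(candidates), "candidates remain; more observations needed")
```

In a real dataset with millions of rows the same filtering logic applies: each additional known data point shrinks the candidate set, which is why even a short accidental recording plus location and contact metadata can be enough to pin someone down.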

But why get all fancy with Big Data brawn? People flat-out utter names and addresses in these accidental recordings, after all. It’s the acoustic equivalent of a silver platter for your identity.

Getting hot and sweaty with your honey while wearing your Apple Watch, or near a HomePod? Doing a drug deal, while wearing your watch? Discussing that weird skin condition with your doctor?

You might want to rethink such acoustic acrobatics when you’re around a listening device. That’s what they do: they listen. And that means that there’s some chance that humans are also listening.

It’s a tiny sliver of a chance that humans will sample your recordings, the vendors claim. It’s up to each of us to determine for ourselves just how much we like those vague odds of having our private conversations remain private.

8 Comments

In future news: Landmark court case approves automatic permanent search warrants for past, present and future data (including audio, video, and other files) if a person is suspected of a crime. People become a suspect when an analytics AI detects specific keywords. Mainstream media reports: civil rights activists call this the best improvement in security ever, stating: “If you ever joked about a murder, and that person dies, we have a lead. Police will be able to break into your house and stop domestic abuse before it goes too far.”
Opponents worry that TV shows and movies will trigger false warrants; defenders argue those racist bigots aren’t thinking of the children.
Regulators were quoted as saying “This was the goal back in 2022 when enacting the law requiring all subjects to keep and maintain an approved data device” – cartoon image of protester wearing phone on ankle –


Isn’t it time that we all look after ourselves? Big Business doesn’t look after people; it uses people for its own ends. Why have all these listening devices turned on when they are not being used? Turn them off and put a piece of tape over the microphone and camera. Mark Zuckerberg does, for a very good reason. On your phone, turn off Siri and turn it on only when you want to ask Siri something. Why anyone wants to believe anything a machine tells them is still beyond my comprehension. Sure, it looks like “neat stuff” on Star Trek, but even there the Computer has been corrupted in some episodes and put the Enterprise in jeopardy many parsecs from help. There is an off switch: use it. The danger may be that the off switch doesn’t do anything. Let’s not even imagine that kind of conspiracy exists (I think it does).


This is a complete invasion of privacy. Since when did it become legal to record and save audio?!


It became legal when J. Edgar Hoover said so. That was many years ago. Fewer and fewer people remember that.


If you scroll down to the end, past the pages of legalese in the User Agreement that you don’t read, and just click “OK” or “Install”, then it’s perfectly legal.


Just do not trust Siri or the fruit… ever.
Also goes for goo, ms and any other company that offers this “Feature”.


When stuff like this finally drops, thanks to one whistleblower or another, there is a knee-jerk reaction from a healthy portion of the people who willingly seek out hardware, technology, and services from tech firms that have ALWAYS included some means of informing the customer that a product will be collecting user data in some fashion by default. Read the fine print: trust me, there isn’t a EULA, SLA, or any other legal disclaimer that a team of lawyers hasn’t made as iron-clad as possible when it comes to a company’s revenue streams being affected by customers who honestly believe they are “always right”.

There is one golden rule of the internet that I fear most people are willingly ignorant of: nothing on the internet is free. If you don’t pay for the product/service, then you are the product/service the provider is selling.


Is there a possibility that Siri still listens for the trigger phrase when the screen is off or the phone is on the lock screen, even when it has been explicitly disabled from functioning while locked?

