Apple has long been either a privacy hero or headache, depending on which side of the divide you sit.
And based on the beta versions of iOS 11, which gets its grand, official introduction on Tuesday at the Apple Special Event, along with the newest iPhone, Apple Watch and more, it will likely now be even more of a hero to privacy advocates and more of a headache to law enforcement.
Not that Apple has ever made it easy for the cops and intelligence services. Nicholas Weaver, a security researcher at the University of California, Berkeley, noted on the Lawfare blog last week that, “unlike Google or Facebook, which use advertising to extract value from users’ personal information, Apple focuses on selling things that protect a user’s data from all unauthorized access – including by Apple”.
That led, among other things, to the famous clash last year between Apple and the FBI over the agency’s demand that the company provide a way to unlock the iPhone of the deceased San Bernardino terrorist.
While that was “resolved” when the FBI “bought a tool”, according to former FBI director James Comey, it didn’t resolve the overall conflict over whether device makers like Apple should be required to provide a backdoor into their products for law enforcement.
And that conflict is likely to get more intense, now that iOS 11 is increasing protections against “unauthorized access”.
Until now, once an iPhone was unlocked – and law enforcement could require a person to use the Touch ID feature to do so without running afoul of the Fifth Amendment – there was no further barrier to, as Weaver put it, “connect the device to a computer running forensics software, or even just iTunes, direct the device to ‘trust’ the new computer when prompted, and download a backup that contains almost all of the relevant information stored on the phone”.
All of which, relevant or not, they could then analyze for as long as they wished, back at the office.
No more. The new iOS will now require the six-digit passcode before allowing it to sync with – or “trust” – a different computer. And giving up that number does have Fifth Amendment protection. Greg Nojeim, director of the Project on Freedom, Security and Technology at the Center for Democracy & Technology, said speaking the passcode is considered “testimonial”, while providing a fingerprint is not.
So law enforcement could still manually browse through what they can find on the unlocked phone, but that amount of data will be vastly less than what they could gather from a backup analyzed with forensic software (iOS backups store messages and call logs in SQLite databases), which would in most cases include thousands of deleted messages and call logs.
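To see why a backup is so much richer than manual browsing, it helps to know that the message and call stores in an iOS backup are ordinary SQLite files: once forensic software has the file, every record is a plain SQL query away, and because SQLite often leaves deleted rows sitting on free pages rather than scrubbing them, "deleted" messages can frequently be carved back out. The sketch below is a simplified mock, not the real backup format (actual backups hash their file names and use a far richer schema); the table and column names are illustrative assumptions only.

```python
import sqlite3

def build_mock_message_db(path=":memory:"):
    # Hypothetical, simplified stand-in for the kind of SQLite message
    # store (e.g. an iOS backup's SMS database) that forensic tools parse.
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE message (id INTEGER PRIMARY KEY, text TEXT, date INTEGER)"
    )
    conn.executemany(
        "INSERT INTO message (text, date) VALUES (?, ?)",
        [("meet at 5", 1504000000), ("running late", 1504000300)],
    )
    conn.commit()
    return conn

def dump_messages(conn):
    # Once the file itself is readable, there is no per-message
    # protection: every row comes back with a single query.
    return list(conn.execute("SELECT text, date FROM message ORDER BY date"))

conn = build_mock_message_db()
for text, date in dump_messages(conn):
    print(date, text)
```

The point of the passcode change is that this step never gets to run: without the "trust" handshake, the backup file containing the database can't be pulled off the phone in the first place.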
Weaver said the passcode requirement would be especially significant at border searches, where a legal “exception” allows US Customs and Border Protection to copy all the contents of electronic devices without any probable cause or even “reasonable articulable suspicion.”
Again, while agents would still be able to demand that an owner unlock an iPhone and then manually look through it, they would not be able to make a backup copy without the passcode.
Nojeim applauded the impending change.
We have long said that there has to be reasonable suspicion to access everything on a phone. These devices carry your life – they’re a treasure trove of private information.
In addition to the passcode barrier, iOS 11 also provides an “SOS” feature – press the power button five times rapidly and it will let the user make an emergency call, but also disables the fingerprint reader. To unlock the phone would then require the passcode. The feature is, of course, being sold as a way to get help quickly in an emergency, but it obviously could be used to lock the phone down to prevent law enforcement access.
And, as software forensic firm Elcomsoft noted in a blog post last week, law enforcement can’t tell if a potential suspect used that feature to disable Touch ID:
There is no way to tell that Touch ID has been disabled by using the SOS feature. Once the sequence is completed and the user cancels the menu, the iPhone prompts for a passcode in the same manner it uses after Touch ID naturally times out.
Weaver doesn’t see that as a big deal, saying:
There are already a number of ways to rapidly disable the fingerprint reader, such as powering off the phone, using the wrong finger four times, or just waiting long enough for the feature to disable itself. So this is more hype than substance.
Even a locked iPhone doesn’t lock everything out, as Naked Security’s Maria Varmazis noted when she took the beta iOS 11 for a test drive. In fact, it actually allows a bit more access than iOS 10:
iOS 11 adds viewing the Control Center (the menu that you can pull up from the bottom of the screen) and returning missed calls to options that work despite the lockscreen, in addition to features that were already available on iOS 10. All of these options are turned on by default.
Of course, a user can turn them off as well. But the bottom line is that the personal privacy vs protection-of-society debate is likely to get more intense, and make its way into the courts.
Comey, back in March when he was still FBI director, said at a conference in Boston that while “I love privacy,” there has always been a “bargain” in the US that government can invade privacy, “with probable cause and a warrant … The general principle is that there is no such thing as absolute privacy.”
Weaver would agree only in part. He wrote that the iOS 11 upgrades “will have some impact on lawful investigations”. But he added: “That isn’t necessarily a problem – the benefits here outweigh the costs.”
Nojeim agreed with that last part, saying:
We are in the golden age of surveillance. There has never been larger or richer collection of data about the private activities and thoughts of people who have committed no crime and done nothing to bring suspicion. Something like this starts to level the playing field just a bit.
JD
“the personal privacy vs protection-of-society debate”
Except, there isn’t one. Sacrificing privacy doesn’t protect society, it endangers the individual. Also, “protection of society” smacks loudly of “protection of the status quo,” which rarely leads to positive results – the German government circa 1936 being a prime example of the far more likely negative results.
Brian T. Nakamoto
Under US law, can one be compelled to use Face ID on the new iPhone X?
Paul Ducklin
Can’t see how. There’s no law forcing you to use it on other devices with that sort of feature. (Samsung has talked up its version of face-based identification for some time already.)
Brian T. Nakamoto
I guess the key for anyone concerned about having Face ID or Touch ID used against their will is to disable them if one gets the opportunity to do so before their device is seized. Kudos to Apple for providing button shortcuts to quickly disable them.
Max
“speaking the passcode is considered “testimonial”, while providing a fingerprint is not”
That distinction is so illogical to me. And if “speaking” is the only thing considered testimonial, then how about not asking for the passcode but rather ordering them to type it in (without looking if need be)? What is the difference between telling someone “put your finger on here” and “type in your code here” in that context? Both actions unlock the phone. If either is legal/illegal, both should be, whichever it is. What if I use the wrong finger on purpose? Is revealing which finger is the one that unlocks it considered testimonial? So silly.
Paul Ducklin
When I look back through Naked Security at our coverage over the years of fourth and fifth amendment issues relating to passcodes and data searches, the one thing that strikes me (this is US-specific, considering that the “fourth-and-fifth” issues are specific to the US Bill of Rights) is that US jurisprudence ebbs and flows, and that there simply isn’t agreement throughout the many parts of the US judiciary about how the Bill of Rights applies to things like decryption and passwords, and what constitutes testimony.
What complicates the issue yet further – to me, at any rate, given that IANAL – is exactly the point you make: *using* the passcode doesn’t sound “testimonial” at all if its only purpose is to unlock the data. In particular, if the court orders you to submit to search and seizure of your data, you don’t have to “say your password out loud” (or even to reveal it by writing it down) in order to unlock the data.
For example, if a court issued a search warrant for a specific cupboard (closet) in your house, and the key to that closet also opened others in your house that weren’t covered under the warrant, I assume you’d be within your rights to refuse to hand over the key, but would be in contempt of the court if you refused to open the door to which the warrant referred, given that you could easily do so.
I agree that some sort of consistency is surely needed here. Things could get very silly otherwise. For example, if you can’t force me to “testify” my password by speaking it, what if you ask me to point to it on a printed list instead – that could be a long, long list if either of us wanted to prove a point.
Max
I do get that the general topic of opening digital locks might be controversial among judges and such. What I don’t get is the inconsistency within that topic. You can argue whether all digital locks are protected, or none of them are. There really shouldn’t be anything in between.
Anonymous
It’s not the “lock” that is at issue here in terms of the Fifth.
The court could compel you to open the lock, but they can’t compel you to reveal what’s in your head and incriminate yourself.
It’s the right against forced self-incrimination that protects you from having to give up the passcode in your head; that right has nothing to do with what the passcode would be used for.
In the case of your house, let’s say the police have a warrant to search your house and everything in it. They have the right to ask you to open the door, give them the key, or kick down your door. Inside the house there are two safes, one opened with a key and the other with a passcode. They could do the same thing to your key safe as to your door. However, with the passcode safe, the only option is to crack it open. They can’t compel you to give up the passcode in your head.
Does that make more sense now?
Max
The information about which key actually unlocks a certain lock and where that key is is also in my head. By compelling me to unlock a lock with a key, they are inherently compelling me to reveal where the key to that lock is. There is no difference between a passcode and a key that they don’t have. Both require information on my part, whether it be implied information or explicit information. If the fifth amendment protects you from revealing any self-incriminating information, then revealing the right key and its location would be protected as well. If that is not protected because you don’t have to show them where it is, you just have to unlock the safe, then typing in a code (without revealing it) would be the same. Exactly that is the nonsensical contradiction in this situation. Either all of it is protected, or none of it is. Anything in between doesn’t make sense in this case.
Paul Ducklin
Does the Fifth Amendment really “protect you from revealing any incriminating information”? My understanding is that it says (amongst other stuff like double-jeopardy and the confiscation of property) that no one shall be “compelled in any criminal case to be a witness against himself”.
That is a very different legal kettle of fish than “giving you the right to suppress evidence”, which is more where you are going.
Max
If not revealing a physical key is considered “suppressing evidence” then not revealing a code should be the same. Let’s make the example even more clearly absurd: you have a piece of paper in your house with the code written on it. So now they can compel you to get that piece of paper and reveal it to them, but they can’t compel you to tell them the code from memory. They are asking for the same information, yet in one case it is considered being a “witness against oneself” and therefore protected, and in the other case it’s not?
Paul Ducklin
My understanding is that something you already wrote down is *not* protected by the Fifth Amendment, and that’s exactly as the Founding Fathers intended it to be in the Bill of Rights.
It’s no good complaining to me that it’s absurd – you need to write to your Congressperson :-)
Max
My point is you can compel people to reveal certain information (location of a key, what finger unlocks device, paper with code on it,…) but can’t compel them to reveal other information. Even if the information revealed that way is essentially the same. I don’t think the Fifth Amendment is the problem, I think the interpretation of it is. It’s fine if the code itself isn’t protected if I wrote it down, but the fact that I wrote it down or where the piece of paper is should be as protected as the code in my mind.
Anonymous
The difference is the knowledge. If they have the physical key they can use it to unlock the cabinet. If they find the key on you, they can use the key. If you have hidden the key, you can’t be forced to tell them where it is. They’d be forced to try to crack the lock on their own. Translated to phones, the physical key could be your finger or face and they can use the physical key because it’s present. Passcode is the equivalent of hiding the key and they can’t force you to tell them where it is.