
Apple says NO to iPhone backdoor in terror case

Apple is refusing to comply with a judge's order to unlock an encrypted iPhone used by one of the San Bernardino terrorists.

Apple CEO Tim Cook has issued a bold refusal to comply with a judge’s order requiring the company to unlock an encrypted iPhone used by one of the terrorists responsible for the attack in San Bernardino, California.

The Justice Department had sought a court order to compel Apple to unlock the iPhone 5c used by Syed Rizwan Farook, accused of killing 14 people at a holiday party at the Inland Regional Center in San Bernardino on 2 December 2015.

Both Farook and his wife, Tashfeen Malik, were killed in a gun fight with police, but the FBI says it needs access to Farook’s iPhone to determine who he was communicating with prior to the attack.

On Tuesday (16 February), Judge Sheri Pym of the US District Court for the Central District of California granted the US Attorney’s request for an order compelling Apple to unlock the iPhone, saying Apple should provide “reasonable technical assistance” to the FBI.

However, Cook contends in an open letter to customers, published yesterday on Apple’s website, that such an order would require Apple to create a new version of the iOS operating system – one with a backdoor.

The device Farook was using – an iPhone 5c – is, like other Apple devices running iOS 8 or higher, encrypted by default and cannot be decrypted without the user’s passcode.

Making things even more difficult for investigators, or anyone else attempting to access a locked iPhone without the passcode: entering 10 incorrect passcodes will automatically wipe the device if the “Erase Data” setting has been turned on.
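To see why that setting matters, here is a rough sketch (illustrative only, with assumed behaviour, not Apple’s actual code) of the arithmetic facing an attacker: a four-digit passcode has 10,000 possibilities, but with “Erase Data” turned on only 10 of them can ever be tried.

    # Illustrative sketch (assumed behaviour, not Apple's code): why a
    # 10-guess limit defeats brute force against a 4-digit passcode.

    import itertools

    MAX_ATTEMPTS = 10  # assumed "Erase Data" threshold

    def brute_force(check_passcode):
        """Try 4-digit passcodes until one works or the device wipes itself."""
        guesses = ("".join(d) for d in itertools.product("0123456789", repeat=4))
        for attempt, guess in enumerate(guesses, start=1):
            if check_passcode(guess):
                return guess
            if attempt >= MAX_ATTEMPTS:
                raise RuntimeError("Device wiped: encryption keys destroyed")

    # Simulated locked device with an unknown passcode:
    try:
        brute_force(lambda guess: guess == "7391")
    except RuntimeError as err:
        print(err)  # only 10 of the 10,000 possible guesses are ever allowed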

Creating a backdoor to bypass Apple’s encryption would “[threaten] the security of our customers,” Cook says, and a backdoored version of iOS – which “does not exist today” – would be “too dangerous to create”:

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software – which does not exist today – would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Cook’s position is wholly consistent with his past statements on backdoors, and Apple has resisted court orders in at least two other cases, saying it could not comply because it could not break its own encryption.

However, Apple has been willing to cooperate with law enforcement in general and in the San Bernardino case specifically.

According to Cook’s letter, Apple complies with “valid subpoenas and search warrants,” and has provided data “in our possession” to the FBI when asked.

Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

Apple may be able to turn over data stored unencrypted in iCloud, for example, as it reportedly did in a criminal case last year when the FBI requested access to encrypted communications between suspects in a drug case.

However, accessing data stored on an encrypted iPhone would require Apple’s engineers to build the equivalent of a “master key,” according to Cook.

And once a master key that can unlock any iPhone is created, there’s no way to guarantee that knowledge of how to exploit such a backdoor would not get out and threaten the security of every Apple user.

Ultimately, that’s too great a risk, Cook says:

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

This latest court battle represents another major setback for law enforcement in the ongoing “crypto wars.”

FBI Director James Comey has repeatedly said that law enforcement investigations are thwarted by criminals and terrorists “going dark” through the use of encrypted devices and apps.

Lawmakers from the US Congress, on down to state legislators in California and New York, have proposed laws compelling technology companies to create backdoors for law enforcement access.

But, so far, the tech companies and privacy advocates are winning the fight against backdoors that would weaken encryption.

The fate of unbreakable encryption, and our collective security, may depend on how Apple’s legal fight plays out.

SOPHOS STATEMENT ON ENCRYPTION

Our ethos and development practices prohibit “backdoors” or any other means of compromising the strength of our products for any purpose, and we vigorously oppose any law that would compel Sophos (or any other technology supplier) to weaken the security of our products.

Full statement ►

Image of Tim Cook courtesy of JStone / Shutterstock.com.

18 Comments

Good for Apple and good for Sophos. While I realize the importance of the data that may or may not be on the iPhone in question, I realize even more the importance of both Apple and Sophos living up to – or at least striving to the nth degree to adhere to – their oath of maintaining the highest level of security in their products. Apple and Sophos have put the security of millions of users worldwide, both now and in the future, ahead of a what-if scenario, and for that I applaud them.

Reply

Good for Apple! What is it that law enforcement wants to know, anyway? What do they expect to find that they couldn’t discover in other ways? The crime has already been committed.

Reply

You seriously can’t imagine how the information stored on a smartphone would be useful to law enforcement during an investigation? This isn’t about “big government” trying to do evil. Stupid maybe, but not evil. I don’t think they’re suggesting being able to do this without jumping through the normal legal hoops. The problem is it has too much potential to be abused and put others at risk.

Reply

While I agree that the information obtained would probably be of some use, and I also agree that the government isn’t likely asking for blanket access in the future, I don’t think abuse by unscrupulous individuals is the only concern here. The government has shown through all of the Snowden revelations that it is more than happy to create a secret court and secret NDA letters in an effort to half-heartedly “police” itself. Give them the keys this time around and they’ll never stop asking (or potentially abusing) whenever it suits them, unfortunately.

Reply

John,

Several articles circulating refer to a firmware flash that would disable the passcode input delay after multiple failed attempts and the data erasure after 10 consecutive failed attempts. This would allow the FBI to brute-force the iPhone.

The main barrier is that Apple would need to sign the firmware.

Can Sophos confirm this and, if so, what are your thoughts?

Thank you for your time and consideration,

Reply

You can read the court documents, including what the FBI are after at a minimum, from the link in the article.

Quite how Apple could or would do that if it wanted to isn’t clear, nor is it clear whether there’s a middle way in which the “cracking” would be done in a neutral location and all the FBI would get is the passcode.

I guess part of the problem is that Apple would have to hand the phone back afterwards…whether this would leave behind some re-usable “backdoor” code I don’t know…

Reply

Even if Apple could produce what the police want without giving anyone outside Apple anything useful, the “backdoor” software would exist at Apple and would be a rich target for hackers or unhappy employees. Even if kept secure, it would be a rationale for the police (local, state, federal, international, Chinese – how would you rationalize denying the Chinese???) to come back in the future with another iPhone to unlock. Even if Apple destroyed the software after this current incident, the knowledge would still be out there and someone would eventually recreate it.

Reply

I hear you. But you can argue that this problem simplifies to “Apple has to sign the code”, and *that’s* the bridge the company is not willing to cross, for the #nobackdoors reasons Tim Cook gave.

In other words, the crown jewels are already there inside Apple to serve as a rich target, namely the private key material for code signing.
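As a minimal sketch of why that signing key is the gatekeeper (illustrative only, using the third-party Python cryptography package, not Apple’s actual mechanism): the device only boots firmware whose signature verifies against the vendor’s public key, so no modified build can be loaded without the private key.

    # Conceptual sketch (not Apple's implementation): a device that refuses
    # any firmware image not signed with the vendor's private key.
    # Requires the third-party "cryptography" package.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    vendor_private_key = ed25519.Ed25519PrivateKey.generate()  # held only by the vendor
    device_trusted_key = vendor_private_key.public_key()       # baked into the device

    def device_will_boot(firmware: bytes, signature: bytes) -> bool:
        """Boot only if the signature verifies against the trusted public key."""
        try:
            device_trusted_key.verify(signature, firmware)
            return True
        except InvalidSignature:
            return False

    official = b"iOS build with lockout and auto-erase intact"
    modified = b"modified build with lockout checks removed"

    print(device_will_boot(official, vendor_private_key.sign(official)))  # True
    print(device_will_boot(modified, b"\x00" * 64))                       # False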

Reply

It seems like the ideal solution would be to ensure future generations of iPhone either have the encryption key embedded with the firmware, so that it is removed when the phone is flashed, or add a safeguard so the phone cannot be flashed while it is locked… if that is feasible. Smartphone architecture is not my forte.

The interesting part will be what happens when Apple tries to implement that solution in future products now that the federal courts are trying to set a precedent.

Reply

I will let an American Founding Father speak for me. Ben Franklin: “They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.”

Reply

Why wouldn’t they go after the network to get a log of who they have been calling and receiving calls from? Even if there were a backdoor for the iPhone, I wouldn’t consider this a valid case to use it. This highlights EXACTLY why such a backdoor is a bad idea. These investigators need to get their act together. Searching someone’s private information because it MIGHT have some evidence is just lazy police work.

Reply

I believe it is because they are using this to “prove” to the public that encryption is evil and they must have built-in access to do their jobs. In other words, I believe it is a public relations maneuver using a tragedy in an attempt to get the public to demand back doors so politicians will have to respond with legislation. Rahm Emanuel has been quoted as saying something along the lines of “Never let a good crisis go to waste” and this is more evidence of that mentality.

Reply

Data they can get without accessing the phone itself: Call history (from the Telcos), Browser search history (if google was used), Possibly location history.
What they don’t get: Locally stored data and encrypted cloud storage.
If I were an FBI guy, I would start with the call history, interview the contacts, and check their data storage after persuading them – by warrant if need be – to unlock their phones/PCs themselves. All of which I expect they did within hours of identifying the killer.
I see this as only an excuse for a backdoor to everyone’s equipment.

Reply

From what I’ve been reading, it probably is possible for Apple to write code in an iPhone software upgrade that would ignore the switch to erase the contents of the phone after 10 failed attempts. If it is possible, and it is possible for someone to make a profit from it, it will probably happen. If those two premises are correct, what is the best way for the feds and for Apple to move forward with this?

Reply

The Feds have thrown up a thinly veiled “red herring,” in that there is very likely zero information on that phone. After all, the jihadist perps destroyed both of their personally owned cell phones and destroyed any computer storage they had. Obviously the county-owned cell phone held no interest to the perps themselves. The goal of the feds is all about forcing Apple to provide a backdoor or, at least, establishing a precedent.

Reply

“SOPHOS STATEMENT ON ENCRYPTION
Our ethos and development practices prohibit “backdoors” or any other means of compromising the strength of our products for any purpose, and we vigorously oppose any law that would compel Sophos (or any other technology supplier) to weaken the security of our products.”

Well, after David Cameron has had his say on this, let’s see how much your ethos will really be worth in the future ;)

Reply
