Naked Security

FBI might have a way to unlock shooter’s iPhone without Apple’s help

The government asked a judge to delay deciding whether Apple must build an iPhone backdoor while the FBI tests its own backdoor.

The FBI may have found a way to break into the locked iPhone at the center of the US government’s court battle with Apple.

Apple and the US Department of Justice (DOJ) were scheduled to appear in federal court today, for a judge to decide whether Apple must assist the FBI in breaking into the iPhone that belonged to one of the shooters in last December’s terrorist attack in San Bernardino, California.

But in a surprising move yesterday, the DOJ filed a motion to delay the court hearing until 5 April, in order to give the FBI more time to test a newly discovered method for unlocking the iPhone.

In the DOJ’s motion, the government said that an unnamed third party came forward on Sunday (20 March) to demonstrate for the FBI a “possible method” for unlocking the iPhone of San Bernardino shooter Syed Rizwan Farook, who was killed in a gun fight with police after the attack.

Farook’s passcode-protected iPhone has iOS 9, which includes a security feature that erases all data on the device after 10 failed passcode guesses.

However, if the method revealed to the FBI can get around the passcode protection in iOS 9 without erasing the iPhone’s data, it would “eliminate the need for assistance from Apple,” the government’s motion said:

US motion in Apple iPhone case

On Sunday, March 20, 2016, an outside party demonstrated to the FBI a possible method for unlocking Farook’s iPhone. Testing is required to determine whether it is a viable method that will not compromise data on Farook’s iPhone. If the method is viable, it should eliminate the need for the assistance from Apple Inc. (“Apple”) set forth in the All Writs Act Order in this case.

Last month, Judge Sheri Pym ordered Apple to create special software that would disable the limitation on passcode guesses and the auto-erase feature, allowing the FBI to make unlimited guesses of Farook’s passcode until finding the right combination to unlock the device.
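To see why removing those two protections matters so much, here is a minimal sketch (my illustration, not the FBI's actual tooling) of exhaustive search over a 4-digit passcode once unlimited guesses are allowed. The `check` function stands in for the phone's passcode verification, which in reality runs on-device:

```python
# Sketch: why disabling the retry limit defeats a short numeric passcode.
# "check" stands in for the phone's on-device passcode verification.

def crack_pin(check, digits=4):
    """Try every possible PIN of the given length until one verifies."""
    for guess in range(10 ** digits):
        pin = str(guess).zfill(digits)
        if check(pin):
            return pin
    return None

# Hypothetical example: a 4-digit PIN falls after at most 10,000 tries.
secret = "7294"
found = crack_pin(lambda pin: pin == secret)
assert found == secret
```

With the auto-erase feature off and no guess limit, the entire 4-digit space is only 10,000 attempts.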

Apple appealed, arguing that to do so would create a “backdoor” that could potentially jeopardize the security of every iPhone.

If the FBI is able to unlock the iPhone without Apple’s assistance, it could end this particular court battle but it would leave the central disagreement about encryption backdoors unresolved, and raise a number of new and very discomforting issues.

Firstly, it would tip the scales in favor of law enforcement without the issue being tested in court.

Secondly, it would suggest that there is a method for cracking iPhone encryption or rate limiting, perhaps a 0-day exploit, that’s in-the-wild and unknown to Apple.

If the FBI is in possession of a 0-day exploit and it hands over details of that exploit to Apple, then it won’t be able to use it again in future – improving the security of private citizens and criminals alike but robbing the FBI of a capability it clearly thinks is important.

If the FBI doesn’t give the details to Apple, it could leave iPhones with a de-facto backdoor with unknown provenance that at least one “outside party” has already discovered and which anyone else could find, abuse or sell on.

Whatever the outcome, it surely won’t put an end to the feud between Apple and the US government over iPhone encryption.

Apple CEO Tim Cook has forcefully opposed the government’s demands for what he calls a backdoor into Apple’s products.

Cook addressed the backdoors issue at yesterday’s Apple town hall event.

“We need to decide as a nation how much power the government should have over our data and over our privacy,” Cook said.

In his opening remarks at the event, Cook noted that Apple will be celebrating its 40th anniversary next week, and talked about how Apple’s products are “an important part of people’s daily lives.”

The iPhone is “a deeply personal device,” Cook said, and Apple has a “responsibility to protect your data and your privacy.”

Cook continued:

We did not expect to be in this position at odds with our own government. But we believe strongly we have a responsibility to help you protect your data and protect your privacy. We owe it to our customers and we owe it to our country. This is an issue that impacts all of us and we will not shrink from this responsibility.

With neither Cook nor the FBI backing down, the judge’s decision in the San Bernardino case could have far-reaching implications in other legal cases, and in the broader encryption debate.

The world is watching and waiting to find out what happens next.



> In the DOJ’s motion, the government said that an unnamed third party came forward on Sunday (20 March) to demonstrate for the FBI a “possible method” for unlocking the iPhone of San Bernardino shooter Syed Rizwan Farook, who was killed in a gun fight with police after the attack.

I’ve been wondering what John McAfee was doing these days….


If you think John McAfee would ever go unnamed then I guess you must be thinking of a different John McAfee than the John McAfee I’m thinking of!


Instead of unknown providence how about:

noun: provenance
the place of origin or earliest known history of something.
“an orange rug of Iranian provenance”
synonyms: origin, source, place of origin; More
birthplace, fount, roots, pedigree, derivation, root, etymology;
“the provenance of the paintings”
the beginning of something’s existence; something’s origin.
“they try to understand the whole universe, its provenance and fate”
a record of ownership of a work of art or an antique, used as a guide to authenticity or quality.
plural noun: provenances
“the manuscript has a distinguished provenance”


In short, people really shouldn't be relying on 4-digit PINs if they are genuinely concerned about their privacy. They should instead use a full password, at least 8 characters long, on their mobile device. That's why Apple created Touch ID: why limit your passcode to 4 digits when it takes a thumb to unlock your phone? You would only need to enter your passcode again every 3 days.
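The gap between a 4-digit PIN and an 8-character password is easy to put in numbers. A rough back-of-the-envelope comparison (my figures, assuming roughly 95 printable ASCII characters and the commonly cited ~80 ms per-guess hardware delay, not official Apple numbers):

```python
import math

# Rough keyspace comparison: 4-digit PIN vs. 8-char printable-ASCII password.
pin_space = 10 ** 4            # 10,000 possibilities
password_space = 95 ** 8       # roughly 6.6 * 10**15 possibilities

print(f"PIN entropy:      {math.log2(pin_space):.1f} bits")      # ~13.3 bits
print(f"Password entropy: {math.log2(password_space):.1f} bits") # ~52.6 bits

# Assuming an ~80 ms enforced delay per guess, exhausting the PIN space
# takes under 15 minutes; the password space would take millions of years.
seconds_per_guess = 0.08
print(f"PIN worst case: {pin_space * seconds_per_guess / 60:.0f} minutes")
```

That is the whole argument for a longer passcode in one arithmetic step: the defender's cost is a few extra keystrokes, the attacker's cost grows by eleven orders of magnitude.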

The version of the iPhone in question doesn't have full hardware-based encryption enabled, so there is a vulnerability. Even the newer iPhones do not use full hardware-based encryption. This is technically not a design flaw, but it creates an opportunity to decrypt the phone when you have physical access to it and can directly read the flash memory.

What the iPhone does offer is protection from someone picking up your phone and accessing your data simply by copying the memory off it. This is what forensic analysts and adversaries used to do. By encrypting the contents of the memory, that is no longer directly possible. But if someone is able to forensically copy the phone's memory, there is a small but technically feasible attack vector against the standard 4-digit PIN.

According to Apple's documentation, a set of keys is placed in an effaceable section of flash storage. This is a section of non-volatile memory (memory that persists without power) that can be made to behave like volatile memory (memory that loses its contents without power): when the erase signal is applied to that memory circuit, it instantly resets to zero. This effectively cryptographically wipes the iPhone, provided the device has not been physically compromised (in this case, by law enforcement wanting to hack it). If a full forensic copy of the memory had been taken first, the adversary or forensic analyst would simply need to reload the (still encrypted) contents of the memory, and the phone's state would be reset.
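The idea of a "cryptographic wipe" is worth making concrete. A toy model (my own sketch using a hash-based stream cipher, not Apple's actual scheme): all user data is encrypted under a key kept in a tiny effaceable region, so zeroing just that region renders the whole, still-physically-present ciphertext useless.

```python
import hashlib, os

def keystream(key, length):
    """Derive a pseudorandom stream from the key (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(data, key):
    """Encrypt/decrypt by XORing data with the key-derived stream."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

effaceable_key = os.urandom(32)          # lives in the effaceable region
flash = xor_crypt(b"user photos and messages", effaceable_key)

# Normal unlock: the key is present, so the data decrypts.
assert xor_crypt(flash, effaceable_key) == b"user photos and messages"

# Wipe: erase only the 32-byte key region; gigabytes of ciphertext
# remain on flash but can no longer be decrypted.
effaceable_key = bytes(32)
assert xor_crypt(flash, effaceable_key) != b"user photos and messages"
```

Erasing 32 bytes is instant, which is the whole point: you wipe the key, not the data.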

Law enforcement, or another adversary, cannot clone the memory onto other devices to try passcodes in parallel or to "hide" the attempt. (Scenario: an adversary clones the memory of your phone to work on it while you remain in possession of the phone, perhaps cloning it at a coat check.) They would need physical possession of the device, because a hard-coded encryption key is stored in the main processor. This key is immutable; it cannot be wiped or changed. It is also accessible only by the processor's crypto function, either in the encryption co-processor or, in Windows-based laptops, in a protected module in the main processor.
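The reason a cloned flash image is useless can be illustrated with a toy key derivation (illustrative only, not Apple's real algorithm): the passcode key is "entangled" with a device-unique UID fused into the processor, so the same passcode yields different keys on different chips, and any brute-force attempt must run on the original hardware.

```python
import hashlib

def derive_key(passcode, device_uid, rounds=100_000):
    """Toy UID-entangled derivation: every iteration mixes in the chip's UID,
    so the computation cannot be reproduced off-device."""
    key = passcode.encode() + device_uid
    for _ in range(rounds):
        key = hashlib.sha256(key + device_uid).digest()
    return key

uid_a = b"\x01" * 32    # fused into phone A's silicon, never readable out
uid_b = b"\x02" * 32    # a different device entirely

# Same passcode, different chips -> different keys: guessing has to happen
# on the original processor, one slow attempt at a time.
assert derive_key("7294", uid_a) != derive_key("7294", uid_b)
```

The iteration count also matters: making each derivation deliberately slow caps the guess rate even on the real device.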

The version of the iPhone in question is not stateful: the phone has no durable notion that the failed-attempt delay has been triggered; instead it stores that state in the device's flash memory. This allows an adversary to reset the delay timer simply by rebooting the phone. It also means that, by restoring the previously saved flash contents, the phone can be made to believe that no failed attempts were made. This may require a restart, as the state is also maintained in volatile RAM. The newer iPhones are protected from this attack vector because the Secure Enclave (playing the role of a TPM) maintains the state information, as evidenced by the fact that you cannot bypass the failed-attempts timeout simply by rebooting the phone.
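That replay weakness is easy to model. A toy simulation (my own, loosely inspired by the NAND-mirroring idea; the class, field names, and PIN are invented): if the failed-attempt counter lives in ordinary flash, an attacker who snapshots the flash can restore it just before the wipe threshold and guess forever.

```python
import copy

class Phone:
    """Toy phone whose failed-attempt counter sits in clonable 'flash'."""
    def __init__(self, pin):
        self._pin = pin
        self.flash = {"failed_attempts": 0}   # counter stored in flash

    def try_pin(self, guess):
        if self.flash["failed_attempts"] >= 10:
            raise RuntimeError("device wiped")
        if guess == self._pin:
            return True
        self.flash["failed_attempts"] += 1
        return False

phone = Phone("7294")
snapshot = copy.deepcopy(phone.flash)         # forensic copy of the flash

for guess in (str(n).zfill(4) for n in range(10_000)):
    if phone.try_pin(guess):
        break
    if phone.flash["failed_attempts"] >= 9:   # restore just before the wipe
        phone.flash = copy.deepcopy(snapshot)

assert guess == "7294"                        # counter never reached 10
```

Moving the counter into a separate tamper-resistant processor, as the newer phones do, is exactly what defeats this restore-and-retry loop.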

Full hardware-based encryption would store only the non-critical encryption keys outside of a hardware security module (HSM). An HSM is the type of device in which Apple, other large software vendors, and certification authorities protect their signing keys: essentially the master keys for trusted software and website identity proofs. In high-security HSMs, the keys are stored in volatile memory; the device has a long-life battery to maintain that memory, and if the device loses power for sufficiently long, the keys are wiped and must be restored from multiple backup key components. The HSM also has tamper-resistance features: if the casing is opened, the memory is wiped. It detects opening through changes in gas composition, humidity, air pressure, or physical contacts, any of which indicates that the interior of the device has become accessible.

This particular approach would be problematic for users of consumer phones. Ask yourself how many times your phone has run out of power and shut off while you couldn't get to a charger for a while. If that triggered a key wipe, all your content, photos, and so on would effectively be gone. From a customer-satisfaction and usability standpoint, this is too much security for a mobile device. Also, how many times have you had to take your phone in to get a screen repaired, or to dry out the insides after it fell into a pool? Taking your phone in for repair and having it opened would result in a wiped phone. So adding physical tamper resistance to a consumer device may not be a valid option. For a company- or agency-issued phone it may be viable, since the company or agency often has backup controls and would prefer to have the device wiped rather than risk exposure of compromised data.

A possible design enhancement would have been to embed the effaceable memory inside the crypto co-processor, to hold the critical keys such as the file-system key (the second most critical key) and the class keys (the fourth most critical). The crypto co-processor could then have tamper-resistance features of its own: if the chip suspected an attempt to compromise it, it would wipe the keys. The memory was probably kept outside the crypto co-processor due to patent issues, and tamper-resistance features are only partially effective anyway, so there may be no cost benefit to building them into the chip itself. Under laboratory conditions, and with test devices to practice on, one can recreate the environment precisely enough to thwart tamper-resistance features. For instance, if the detection mechanism relied on circuit conductivity at a particular resistance level plus air pressure, then working on the device in an airlock at the same pressure would bypass the air-pressure trigger; and using cooled wires would add negligible resistance to the circuit, making it possible to close the circuit while staying within the resistance threshold. (A threshold is needed because of temperature changes.)

The most critical key is still the master key hard-coded into the processor: the one that provides the equivalent of full-disk encryption, since anything written to the flash memory gets processed by it. The file-system key resides in flash memory, protected only by the master key. The passcode generates a key using a combination of the master key and a key-generation algorithm run over the passcode; this is the third most critical key, as it encrypts the class keys. The class keys encrypt the metadata for each file, depending on the file's access requirements; the metadata tells the phone where the file resides, how big it is, and what the file encryption key is. The file-system key is used to decrypt the file encryption key from the metadata, and the file encryption key then unlocks the actual file contents. This complexity is probably the reason law enforcement originally needed Apple's help: there is a lot of complexity in the encryption scheme, and it takes a while to understand the weak points in the system.
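The chain of keys described above can be walked through in code. This is a toy model only (the derivation function, labels, and passcode are all invented for illustration; Apple's real scheme uses AES key wrapping, not hashing), but it shows how each key unwraps the next:

```python
import hashlib

def kdf(*parts):
    """Stand-in key derivation: hash the concatenated inputs.
    (Illustrative only; the real scheme uses AES-based key wrapping.)"""
    return hashlib.sha256(b"|".join(parts)).digest()

uid_key      = b"\xaa" * 32                   # 1. fused into the processor
fs_key       = kdf(uid_key, b"filesystem")    # 2. wrapped by the UID key
passcode_key = kdf(uid_key, b"7294")          # 3. entangles UID + passcode
class_key    = kdf(passcode_key, b"class-A")  # 4. unwrapped at unlock
file_key     = kdf(class_key, fs_key, b"photo.jpg")  # per-file, in metadata

# Every hop depends on the one before it: a wrong passcode yields a wrong
# class key, so every per-file key downstream stays sealed.
assert kdf(kdf(uid_key, b"0000"), b"class-A") != class_key
```

Breaking any single layer is not enough; an attacker needs the whole chain, which is why the weak point ends up being the shortest human-chosen input: the passcode.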

The actual documentation does not clearly indicate whether the per-file key is encrypted by the class key as well as the file-system key. If it is not, then an attack vector that bypasses the passcode is viable: one would have to read the burned-in "master key" through physical interaction with the processor, or tap the circuits between it and the co-processor; then decrypt the file-system key, ignore the passcode key and class key entirely, locate the metadata, apply the file-system key to the metadata, and apply the resulting key to the rest of the memory. This would take immense processing power, but could succeed using massively parallel computing. It would have to be done for each file, but would result in successfully recovering the data.


> it would leave the central disagreement about encryption backdoors unresolved

I’m likely splitting too fine a hair, but no matter which side wins when this goes to the highest court, this disagreement holds too many valid concerns (and more importantly, earnest supporters) on either side to ever resolve.

