In the heated debate over encryption, backdoors, and a locked iPhone at the center of a terrorism investigation, facts have sometimes been overwhelmed by rhetoric.
Yesterday, we got to hear a lot of facts (and some rhetoric too) in a Congressional hearing about whether the US government can legally compel Apple to defeat its own security measures to unlock an iPhone at the request of law enforcement.
FBI Director James Comey and Apple General Counsel Bruce Sewell appeared before the House Judiciary Committee, where they presented prepared testimony and answered questions from lawmakers for several hours.
Cyrus Vance, Jr., district attorney for New York, and Dr. Susan Landau, a respected expert on cryptography, also testified before the committee.
Comey, who has talked for almost two years about the problem of criminals and terrorists “going dark” through the use of encrypted communications and devices, acknowledged in his testimony that “we’ve been talking past each other in the tech community and the government.”
In the interest of finding facts we can agree on, let’s review what’s happened in the past few weeks and look at what new information we learned from yesterday’s Congressional hearing in Washington.
Quick overview – how we got to this point
Governments have been demanding backdoors to defeat encryption for decades, but in recent months the temperature of the debate has risen, following the terrorist attacks in Paris last November and the mass shooting in San Bernardino, California, by self-proclaimed followers of Islamic State.
In pursuit of its investigation into the San Bernardino terrorist attack, the FBI wants to access data on an iPhone that was possessed by one of the shooters (who is now dead), although the device actually belonged to his employer, San Bernardino County.
The iPhone is protected by a passcode, which means the data on it is encrypted and cannot be read without that passcode, and investigators do not know what it is.
The US prosecutors in this case sought a court order, which was granted two weeks ago, compelling Apple to create a new version of iOS that the FBI could load onto the shooter’s iPhone.
This new code would allow the FBI to “brute force” the passcode by making unlimited guesses until it finds the right one.
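To make concrete what brute-forcing a short numeric passcode involves, here is a minimal Python sketch; the check_passcode function is a hypothetical stand-in for the device’s own verification, and nothing here reflects Apple’s actual interfaces:

```python
# Illustrative sketch only: enumerates every possible 4-digit passcode in order.
# check_passcode is a hypothetical stand-in for the device's own verification logic.
def check_passcode(guess):
    return guess == "7297"  # the "unknown" passcode, hard-coded here purely for the demo

def brute_force(length=4):
    for n in range(10 ** length):          # 10,000 candidates for a 4-digit code
        guess = str(n).zfill(length)       # "0000", "0001", ... "9999"
        if check_passcode(guess):
            return guess
    return None

print(brute_force())  # found almost instantly when nothing slows down the guesses
```

On real hardware the limiting factor is not the enumeration itself but how fast guesses can be submitted and whether the device pushes back, which is exactly what the next section covers.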
Why the FBI can’t get data off the iPhone
The San Bernardino shooter’s device is an iPhone 5c running the most recent version of Apple’s mobile operating system, iOS 9, and it is protected with a four- to six-digit passcode on the lock screen.
Since iOS 8, released in September 2014, encryption has been enabled by default, and Apple does not store the encryption key.
As Apple made clear in its response to the court order, it has no way to unlock the device without creating new code to defeat its own security.
Because the iPhone is running iOS 9, there are two protections in place that Apple designed to prevent the compromise of the passcode: an auto-erase function that deletes the device’s data after 10 failed passcode attempts, and a timing delay between passcode guesses.
Without these limitations, the FBI could brute force the passcode in 26 minutes, Comey said.
The court order asks Apple to remove both of those protections; the third capability the FBI is seeking is a way to submit passcode attempts electronically, rather than tapping each guess on the touchscreen.
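A back-of-the-envelope calculation shows why those three items matter together; the guess rate below is an assumption chosen to match Comey’s 26-minute figure, not a measured number:

```python
# Rough arithmetic only; the guess rate and keyspace sizes are illustrative assumptions.
GUESSES_PER_SECOND = 6.4   # roughly what "all 10,000 four-digit codes in ~26 minutes" implies
FOUR_DIGIT_CODES = 10 ** 4
SIX_DIGIT_CODES = 10 ** 6

def minutes_to_try_all(keyspace, rate=GUESSES_PER_SECOND):
    """Worst-case time (in minutes) if guesses can be submitted electronically with no delays."""
    return keyspace / rate / 60

print(f"4-digit code, no protections: about {minutes_to_try_all(FOUR_DIGIT_CODES):.0f} minutes")
print(f"6-digit code, no protections: about {minutes_to_try_all(SIX_DIGIT_CODES) / 60:.0f} hours")

# With Apple's protections left in place, the attack looks very different:
# each guess must be tapped in by hand, delays grow after repeated failures,
# and the 10th failure can erase the device's data outright.
ATTEMPTS_BEFORE_WIPE = 10
print(f"Guesses available before auto-erase: {ATTEMPTS_BEFORE_WIPE} "
      f"out of {FOUR_DIGIT_CODES} possible codes ({ATTEMPTS_BEFORE_WIPE / FOUR_DIGIT_CODES:.1%})")
```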
In yesterday’s hearing, Comey admitted that the FBI made a “mistake” in the hours after the attack when it told San Bernardino County officials to change the shooter’s Apple ID password in an attempt to access the iCloud account.
By doing so, they inadvertently cut off any possibility of the iPhone automatically backing up its data to iCloud over a known Wi-Fi network.
Apple said it could have assisted the FBI in accessing that data through iCloud if it had first asked for technical assistance, which Apple has been providing in this and other cases.
Is the FBI seeking access to just this one iPhone?
Apple says this is about more than just a single iPhone in the San Bernardino case – and that is undeniably true.
The FBI has also asked for an iPhone to be unlocked in a New York drug case, although a judge ruled earlier this week that the government does not have the legal authority to compel Apple to do so.
According to the judge’s ruling in that case, there are a total of 12 pending cases in federal courts where the US government is seeking access to a locked and encrypted iPhone.
Yet there are potentially thousands more cases at the state and local level where law enforcement is in possession of locked iPhones.
Vance, the New York district attorney, testified that his office alone has 205 devices it wants unlocked; there are another 100 in Texas and many more in other states.
Apple believes that going forward with the FBI’s solution for unlocking one iPhone would set a precedent that other courts would use to require a similar solution in other cases.
When asked whether the San Bernardino case would set such a precedent, Comey appeared to agree, saying: “sure, potentially.”
Comey also said the code the FBI is seeking could only be used on an iPhone 5c, and would not work on later iPhones such as the iPhone 6 or 6s.
But Sewell said that “nothing would preclude it from being used on any iPhone that’s out there.”
Why is Apple refusing to write the code?
In its legal brief to the California court, Apple argues that code is equivalent to speech, and believes that the court’s order to create code that does not exist is a violation of the First Amendment, because it would be “compelled speech.”
Apple also argues that creating the code would amount to “forced labor” and a violation of the Fifth Amendment.
Beyond that, Apple says creating the code is “too dangerous” because it creates a backdoor that could be exploited by hackers or nation states, potentially putting millions of innocent iPhone users at risk.
Comey said in his testimony that he has “a lot of faith” that Apple can protect the code from falling into the wrong hands.
But in her testimony, encryption expert Landau said it is impossible to guarantee the security of the code, because it would be “the target of organized crime and nation states.”
Landau co-authored an influential paper with other cryptographers and security experts laying out a detailed technical case against the idea that backdoors can be implemented safely and effectively.
She said the government’s request for a tool to get around the passcode and encryption on an iPhone is “a security mistake” for several reasons.
One reason: unlocking iPhones for law enforcement on a regular basis would force Apple to build a new process, one that uses the unique ID of each phone and digitally signs the custom code before loading it onto each device.
“It becomes a routine process and routine processes get subverted,” Landau said.
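Landau’s point about routinization is easier to see with a toy model of per-device signing; this is a conceptual sketch, not Apple’s actual code-signing scheme, and the key handling and function names are invented for illustration:

```python
# Toy model of the per-device signing workflow Landau describes.
# Not Apple's scheme: the key, IDs, and function names are invented for illustration.
import hashlib
import hmac

SIGNING_KEY = b"example-secret-signing-key"   # stand-in for the vendor's closely held signing key

def sign_unlock_image(firmware, device_unique_id):
    """Bind an unlock image to a single device by signing the firmware plus that device's ID."""
    return hmac.new(SIGNING_KEY, firmware + device_unique_id.encode(), hashlib.sha256).digest()

def device_accepts(firmware, device_unique_id, signature):
    """A device boots the image only if the signature matches its own unique ID."""
    expected = sign_unlock_image(firmware, device_unique_id)
    return hmac.compare_digest(expected, signature)

firmware = b"custom passcode-bypass image"
sig_for_a = sign_unlock_image(firmware, "DEVICE-A")
print(device_accepts(firmware, "DEVICE-A", sig_for_a))   # True: signed for this one phone
print(device_accepts(firmware, "DEVICE-B", sig_for_a))   # False: the signature does not transfer
```

The per-device binding is what keeps a single signed image from unlocking every phone; Landau’s warning is that once such images are being signed as a matter of routine, the signing step itself becomes the attractive target.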
Where do we go from here?
Comey said he believes there can be a solution that will satisfy law enforcement and the tech companies, but such a solution does not exist today.
The FBI could look for its own backdoor to the iPhone without Apple’s assistance, but Comey said it hasn’t found a way in and other US government agencies (such as the NSA) did not offer a solution either.
In the meantime, these cases will continue to play out in the courts.
Congress could pass a law that requires Apple and other technology companies to create code such as what the FBI is seeking; or it could pass a law that does the opposite, prohibiting the creation of any backdoors.
At this point, however, a political solution seems to be as far off as a technological one.