Apple filed a motion in a California court yesterday, asking the judge to throw out the order compelling Apple to assist the FBI in unlocking an encrypted iPhone, and calling the US government’s demands a “dangerous” overreach of its constitutional powers.
Apple’s motion comes after a district court judge ordered Apple last week to create special software that would allow the FBI to pull data from an iPhone belonging to Syed Rizwan Farook – one of the shooters in the December terrorist attack in San Bernardino, California.
The company had until today (26 February) to respond to the court order.
Apple has been using the court of public opinion to argue its case for more than a week – saying that unlocking the iPhone would require Apple to create a backdoor to defeat its own security.
Tim Cook, Apple’s CEO, said in a note published on the company’s website that Apple would not comply with the court’s order.
To do so would put millions of Apple customers at risk, and undermine security features designed to protect iPhone users from hackers and government surveillance, Cook said in his letter and in media interviews.
In its legal motion to vacate the judge’s order, Apple contends that the case is not merely about a single iPhone, but rather about a government grab for power that would violate the Constitution, set a dangerous precedent, and run counter to the will of Congress.
Apple’s motion to vacate is a 36-page document that lays out a multi-faceted argument, including an explanation of the technical issues involved, the legal precedents, and a detailed unraveling of what Apple calls the government’s flawed understanding of the law.
Ultimately, this case hinges on the court’s interpretation of a 1789 law called the All Writs Act, which gives courts the authority to issue writs (orders) “necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”
The All Writs Act does not give the government the authority to force Apple into creating code that does not exist in order to do the government’s bidding, the company says.
The iPhone in question in this case, an iPhone 5c running a recent version of the Apple iOS operating system, is locked with a passcode and the only person who knows the passcode – Farook – is dead.
The FBI wants Apple to create a new version of iOS that would allow it to “brute force” the passcode – using software to cycle rapidly through possible passcode combinations until it finds the one that unlocks the device.
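The brute-force approach described above can be sketched in a few lines. This is an illustration only: the `check` function here is a hypothetical stand-in for the device’s passcode verification, which on a real iPhone is rate-limited in hardware and is exactly what the FBI wants Apple’s special software to bypass.

```python
def brute_force(check):
    """Try every 4-digit passcode (10,000 combinations) until one passes."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if check(guess):
            return guess
    return None

# Toy demo: the lambda stands in for the device's passcode check,
# which on a real iPhone enforces delays and can trigger auto-erase.
secret = "4951"
print(brute_force(lambda g: g == secret))  # prints 4951
```

With only 10,000 four-digit combinations, unthrottled guessing finishes almost instantly – which is why iOS imposes escalating delays and an optional auto-erase after repeated failures.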
From a technical perspective, Apple argued in another court case that unlocking an iPhone running recent versions of iOS (iOS 8 or higher) is “impossible,” because Apple stores neither the passcode nor the unique device ID from which the encryption key is derived.
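The reason Apple cannot simply compute the key off-device can be illustrated with a rough sketch. Here PBKDF2-HMAC-SHA256 stands in for Apple’s actual hardware-entangled key derivation (which runs on the device itself), and the UIDs are made-up values for illustration:

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stands in for Apple's hardware-entangled KDF;
    # on a real iPhone the UID is fused into the silicon and never
    # leaves the device, nor is it stored by Apple.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

# Hypothetical UIDs: the same passcode yields different keys on
# different devices, so the key cannot be recomputed without the UID.
uid_a = bytes(16)  # all-zero UID, illustration only
uid_b = bytes.fromhex("00112233445566778899aabbccddeeff")
assert derive_key("4951", uid_a) != derive_key("4951", uid_b)
```

Because the key depends on both the passcode and the per-device UID, guessing has to happen on the device itself – which is what makes the passcode limits, and the software the FBI wants, matter.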
Now Apple concedes that unlocking the iPhone is possible, but to do so would require Apple to create special software to bypass the iPhone’s security, taking engineers and other Apple staff weeks to accomplish.
Creating the software would open a Pandora’s box, Apple says.
Apple would need to take exceptional measures to protect all knowledge of the backdoor from getting out and being exploited by criminals and foreign governments.
This backdoor is “too dangerous to build,” Apple says.
Creating a backdoor to the terrorist’s iPhone should not even have been necessary, Apple says.
If the FBI had consulted Apple, the company could have provided technical assistance to retrieve a backup of all the data on Farook’s device from his iCloud account. Instead, by resetting Farook’s iCloud password, the FBI foreclosed the chance of the phone automatically backing up its data to iCloud when connected to a known Wi-Fi network.
From a legal perspective, Apple argues that Congress has passed a law – the Communications Assistance for Law Enforcement Act (CALEA) – that excuses companies like Apple from aiding the government in cases where the company does not hold a copy of the encryption key.
If the court follows the government’s interpretation of the All Writs Act to compel Apple to create a backdoor in this case, it would set a dangerous precedent.
Apple said that not only could the government then demand assistance in thousands of cases, most having nothing to do with terrorism, but it could also demand that Apple develop other kinds of software to track suspects – for example, code to remotely switch on a device’s microphone or camera.
In the end, Apple says, these issues should not be decided by a judge behind closed doors but settled through a robust public debate.
You can read Apple’s motion in full here.