
The FBI said on Monday that it figured out how to unlock the iPhones of the shooter who killed three young US Navy students and injured eight at a Pensacola, Florida naval base in December 2019.
No thanks to you, Apple, Attorney General William P. Barr said in a news release:

Thanks to the great work of the FBI – and no thanks to Apple – we were able to unlock Alshamrani’s phones.

Barr has repeatedly issued public calls for encryption backdoors.
On Monday, the AG joined FBI Director Christopher Wray in a virtual press conference. Barr used the opportunity to once again call for a “legislative solution” to the roadblock of Apple’s encryption, while Wray referred to the FBI’s “Apple problem.”
Both gave FBI workers a pat on the back for the months they spent working to unlock the damaged iPhones.
In January, following the shootings, the bureau had asked Apple to help it unlock two iPhones that belonged to murderer Mohammed Saeed Alshamrani. Also in January, the Department of Justice (DOJ) said that its investigations showed the incident was an act of terrorism, motivated by jihadist ideology. On 2 February, al-Qaeda in the Arabian Peninsula (AQAP) claimed responsibility for the shooting spree.
The FBI had gotten a subpoena allowing it to search content on the iPhones, both of which were password-protected and one of which Alshamrani put a bullet hole through, further complicating forensics on the device and its data.
An FBI press release related to Monday’s conference included a photo of the hole in one iPhone and of an iPhone alert saying “Authorized Service Provider Only.”

Photos of Pensacola shooter’s iPhones. IMAGE: FBI

Wray said that FBI agents had spent months trying to crack the iPhones – hours, days and weeks of hard work that would otherwise have been spent protecting Americans from terrorists, time lost to what he called law enforcement’s “Apple problem.”
He praised the FBI staff’s hard work…

I want to thank and congratulate the men and women at the FBI who devoted months of hard work to accessing these devices. They successfully tackled a problem that required tenacity, creativity, and technical expertise.

…and then lashed out at Apple for not making the job any easier:

The technique that we developed is not a fix for our broader Apple problem.

The “broader Apple problem” refers to apps with end-to-end, warrant-proof encryption: apps that, Wray said, Alshamrani and his AQAP associates deliberately used in order to evade law enforcement while communicating their plans.
Wray didn’t give details on the technique used to crack Alshamrani’s iPhones. What he did say was that the FBI tried every encryption-bypass tool and technique out there, but that none of them worked:

We canvassed every partner, and every company, that might have had a solution to access these phones. None did, despite what some claimed in the media. So we did it ourselves.

When it asked for Apple’s help in January, the FBI said that it had tried the same tactics it used when it was trying to unlock the iPhone of San Bernardino terrorist Syed Farook. Namely, it asked for help from other federal agencies – it sent the iPhones to the FBI’s crime lab in Quantico, Virginia – and from experts in other countries, as well as “familiar contacts in the third-party vendor community.”
The last was seen as a possible reference to the tool that the FBI used to finally break into Farook’s encrypted phone and thereby render moot the FBI versus Apple legal battle over encryption.

Apple: Backdoors weaken security for all

Throughout the San Bernardino encryption battle and up until the current battle over Alshamrani’s locked phones, Apple CEO Tim Cook has taken a firm stand, publicly stating the company’s refusal to break its own encryption. Apple has steadfastly maintained that weakening encryption so as to give law enforcement a backdoor would weaken security for all, giving bad actors a foothold into getting at innocent people’s data.
We’ve just as steadfastly concurred with that line of thinking: saying #nobackdoors and agreeing with the Information Technology Industry Council that “Weakening security with the aim of advancing security simply does not make sense.”
In response to the FBI’s request for help in January, Apple had said that, short of breaking its own security technology, it was helping the government in any way that it could:

We have the greatest respect for law enforcement and have always worked cooperatively to help in their investigations. When the FBI requested information from us relating to this case a month ago, we gave them all of the data in our possession and we will continue to support them with the data we have available.

On Monday, Apple’s response to Wray’s wrath was in keeping with everything that it’s already said on the matter. Apple’s statement:

The false claims made about our company are an excuse to weaken encryption and other security measures that protect millions of users and our national security. It is because we take our responsibility to national security so seriously that we do not believe in the creation of a backdoor – one which will make every device vulnerable to bad actors who threaten our national security and the data security of our customers.
There is no such thing as a backdoor just for the good guys, and the American people do not have to choose between weakening encryption and effective investigations.

Apple blamed for delaying investigation

The DOJ succeeded in cracking the iPhones. Why, then, does it feel the need to berate Apple? This time around, it’s blaming the company’s encryption for causing a delay in the related national security investigation. The company’s refusal to weaken encryption meant that FBI agents didn’t know what to ask or what to look for, Wray said. As a result, they wasted time tracking down anything and everything, unable to zero in on the likeliest leads:

In the aftermath of the attack, we and our Joint Terrorism Task Force partners worked urgently to collect and analyze evidence. In the weeks immediately following December 6, we conducted over 500 interviews of witnesses, base personnel, and the shooter’s friends, classmates, and associates – among many other efforts. Because the crucial evidence on the killer’s phones was kept from us, we did all that investigating not knowing what we do now: valuable intelligence about what to ask, what to look for. If we had, our round-the-clock, all-hands-on-deck effort would have been a lot more productive.

The terrorists have been able to use that time to their advantage, Wray said, concocting and comparing stories with co-conspirators:

As a result, there’s a lot we just can’t do at this point that we could have done months ago.

For his part, Barr reiterated his belief that things can’t go on this way:

…if not for our FBI’s ingenuity, some luck, and hours upon hours of time and resources, this information would have remained undiscovered. The bottom line: our national security cannot remain in the hands of big corporations who put dollars over lawful access and public safety. The time has come for a legislative solution.

Carving up privacy one chunk at a time

Those “legislative solutions” are already in the works. The FBI’s latest fuming over Apple’s encryption comes amid several legislative moves to carve up Americans’ privacy. One such is the EARN IT Act, or the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act. The bill would require tech companies to meet safety requirements for children online before obtaining immunity from lawsuits. You can read the discussion draft here.
To kill that immunity, the bill would strip the protections of Section 230 of the Communications Decency Act (CDA) from certain apps and companies so that they could be held responsible for user-uploaded content. Section 230, considered the most important law protecting free speech online, states that websites aren’t liable for user-submitted content.
Strip away Section 230, and platforms like Facebook, Reddit, and even end-to-end encrypted apps like WhatsApp and Signal would essentially have to give the government a backdoor to any and all customer information.
In other “let’s carve chunks out of privacy” moves, the Senate voted last week to reauthorize the Patriot Act while also renewing warrantless searches of web histories. The USA Freedom Reauthorization Act restores government powers that expired in March with Section 215 of the Patriot Act.
The bill has already been approved by both the House and Senate. All that’s left to make it law is for Congress to iron out the differences between the two chambers’ drafts and for President Trump to sign the completed version.

“The technique that we developed is not a fix for our broader Apple problem.”
Replace the word Apple with Constitution or Human Rights, and that’s the fact of the matter.
I don’t doubt that eventually only criminals and politicians will have encryption, since there are so many dictator types in politics these days, thinking they alone should make ill-informed decisions for everyone else. When that happens, should we call them Criminicians? Poliminals? Polcriticianls… Dick-tators?


A backdoor for the government is only a backdoor for the government until one person leaks it. Then it’s a backdoor for everybody. And EARN IT is NOT a good idea… how about USPS, UPS, or FedEx getting charged any time they deliver items or correspondence that is either illegal or aids in illegal activity? They are just as responsible for what people “post” in the real world as facebook is for what people “post” online… just a clever way for the government to pass the burden of censorship (and thus public ire as well) onto corporations.
That said, I think that if the government can get a warrant for information on an encrypted device, and the parent company already has a way to get in, then there’s a case to be made for opening the device up. After all, that’s what warrants are for, isn’t it? But intentionally putting a security vulnerability into every single encryption technology (and then, presumably, giving insider knowledge of the vulnerability to the government, where it will absolutely never be abused) just defeats the purpose of encryption in the first place…


> They are just as responsible for what people “post” in the real world as facebook is for what people “post” online
Except in this case (when dealing with Apple), it’s not even that. It’s like charging the makers of a safe because they designed it to be difficult to open and someone decided to use it to store drugs.


And why do you think Ajit Pai moved to do away with Net Neutrality?
Nothing good for the end user to be sure.


Would this be the same Apple that happily records conversations of its customers, and of whoever the customer is talking to, even when they have not invoked Siri? Or the same Apple that hides its huge profits in tax havens?


Exactly – the same ones who will steal and use (aka sell) your data for themselves, but for appearances’ sake stand up for the public demanding security.
Idiots fall for this.


In order to work, any of the “personal assistants” (Siri, Hey Google, and Alexa) needs to listen for its keyword. If you don’t want these apps listening, simply turn off that option.
Apple automatically erases any recordings after a short time.


The rest of the world can safely ignore this USA crap and get on with designing and selling “secure” appliances.
Want a secure phone? Just get it from “somewhere else”… Maybe Apple will have US and International iPhone models :-)


We only need to look to one of the Founding Fathers for his thoughts:
“He that would make his own liberty secure must guard even his enemy from oppression; for if he violates this duty he establishes a precedent that will reach to himself.”
― Thomas Paine
“When we are planning for posterity, we ought to remember that virtue is not hereditary.”
― Thomas Paine


I’m afraid our founding fathers are spinning at light speed from what we’ve done to our constitution… :(


So… the FBI took less than six months to ‘crack’ encryption that should take thousands of years? Clearly the iPhone isn’t as secure as we want to think, regardless of *how* the thing was cracked.


I highly doubt that the FBI actually gained entrance. It’s more likely that they hired a subcontractor to do it. Which means they can get it done, but don’t want to. Whom do you trust, Apple or the FBI? I don’t trust either. Apple might also be considering the fact that if/when they put a backdoor in, their phone sales will tank. Will they prosecute GNU or me if I encrypt my laptop/desktop disk with open-source encryption code? Another good reason for Android to move to more of an open Linux platform with open source, at least for the basic system. They would then have to prosecute all of us…
Of course, if you look at the history of the FBI, they have been trying to mangle the Constitution for decades.


They likely cloned the device virtually (so there’s no limit on tries) onto multiple super-fast systems running cracking/guessing tools at the login, resetting every time the device gets locked. Not going to take long at a few thousand guesses per minute, running it on multiple systems, starting at a different combination on each instance – attacking the virtual devices in parallel. That’s what I would do with unlimited resources, anyway.
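For illustration only, here is a minimal Python sketch of the parallel-guessing idea described in the comment above. Everything in it is hypothetical: try_passcode() stands in for whatever check a forensic tool might run against a cloned device image (here it simply simulates a known answer so the sketch runs end to end), and the six-digit passcode space is split evenly across worker processes, each hammering its own copy of the clone so no lockout counter is ever tripped.

# Illustrative sketch only - not any real forensic tool's API.
from multiprocessing import Pool

def try_passcode(image_path: str, passcode: str) -> bool:
    # Hypothetical stand-in: a real tool would test the guess against the
    # cloned image; here we just simulate a known answer for demonstration.
    return passcode == "031337"

def search_range(args):
    image_path, start, stop = args
    # Each worker sweeps its own slice of 000000-999999 against its own
    # copy of the image, so there is no shared lockout counter to trip.
    for n in range(start, stop):
        guess = f"{n:06d}"
        if try_passcode(image_path, guess):
            return guess
    return None

if __name__ == "__main__":
    image = "clone.img"              # hypothetical cloned device image
    workers = 8
    step = 1_000_000 // workers      # split the passcode space evenly
    chunks = [(image, i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        hits = [hit for hit in pool.map(search_range, chunks) if hit]
    print(hits[0] if hits else "not found")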


Apple: OK, we’ll give you the code, but you have to pretend that you guys cracked it yourself, not that we gave you the code, yeah? Like, for real.
FBI: Yeah, yeah, no problem. We’ll do it real good. Make y’all look like assholes for not giving in. *giggles*
Apple: K. Here’s the code.
FBI: Schweeet!


Apple is right; DOJ is wrong (again). It is not Apple that should help the DOJ; it is DOJ that should help Apple make its products and services more secure. Step up DOJ!!

