Naked Security

FBI asks Apple to help it unlock iPhones of naval base shooter

This could signal a renewed war between Apple and law enforcement over breaking encryption.

The FBI has asked Apple to help it unlock two iPhones that belonged to the murderer Mohammed Saeed Alshamrani, who shot and killed three young US Navy students in a shooting spree at a Florida naval base last month.
Alshamrani also injured eight others before he himself was shot to death.
Late on Monday, FBI General Counsel Dana Boente sent a letter to Apple’s general counsel. The letter hasn’t been made public, but the FBI shared it with NBC, which first reported on it.
In the letter, the FBI said that it’s got a subpoena allowing it to search content on the iPhones, both of which are password-protected (and one of which Alshamrani reportedly shot and damaged, further complicating forensics on the device and its data). But so far, investigators haven’t had any luck guessing the passcodes, the letter said.
And yes, the FBI has tried the tactics it used when it was trying to unlock the iPhone of San Bernardino terrorist Syed Farook. Namely, the bureau says that it’s asked for help from other federal agencies – it sent the iPhones to the FBI’s crime lab in Quantico, Virginia – and from experts in other countries, as well as “familiar contacts in the third-party vendor community.”
That could be a reference to the tool that the FBI used to finally break into Farook’s encrypted phone and thereby render moot the FBI versus Apple legal battle over encryption.
Though the killer was believed to have been acting alone, the FBI said in its letter that it’s not ruling anything out before the investigation is complete:

Even though the shooter is dead, [agents want to search his phones] out of an abundance of caution.

Apple sent a statement to NBC saying that it’s helping the government:

We have the greatest respect for law enforcement and have always worked cooperatively to help in their investigations. When the FBI requested information from us relating to this case a month ago, we gave them all of the data in our possession and we will continue to support them with the data we have available.

Why a letter and not the usual court battle?

Last month, responding to Apple and Facebook reps who testified about the worth of intact encryption, Sen. Lindsey Graham had this to say about the government’s ongoing quest for a backdoor:

You’re going to find a way to do this or we’re going to do this for you.

The letter might be part of a strategy to “do this for you.” As in, it could well be yet another way to get Apple to put in a backdoor that will bypass the iPhone encryption that stymies agents’ investigative work. A backdoor is a product-crippling move that Apple has declined to make, in spite of the FBI’s many demands to do so since the case of the San Bernardino terrorists.
Throughout that encryption battle and beyond, Apple CEO Tim Cook has taken a firm stand, publicly stating Apple’s refusal to break its own encryption. So why is the FBI asking now, when it already knows the answer?
The Register reports that some are seeing the letter as part of a new strategy. In a cogent commentary, it noted that there are four things that have changed since the 2015 San Bernardino case:

  1. The state where the crime happened. Florida now has case law forcing password disclosure. In 2017, Florida ordered reality TV star Hencha Voigt to unlock her iPhone in an extortion case. Miami-Dade Circuit Judge Charles Johnson ruled that she and an alleged accomplice could be forced to hand over their smartphone passcodes without violating their constitutional right against self-incrimination.
  2. The country’s top prosecutor. US Attorney General William Barr has made it clear that he wants encryption backdoors. In October, he and officials from the UK and Australia signed an open letter calling on Facebook to back off from its “encryption on everything” plan until it figures out a way to give law enforcement officials backdoor access so they can read messages. (Facebook said no.)
  3. Tim Cook’s charm offensive. Apple’s head has been deliberately making nice with the GOP, touting the company’s contributions to the US economy during Donald Trump’s tenure.
  4. This time, the FBI really did try everything. In March 2018, the US Department of Justice’s internal inspector general came out with a rather damning report about the San Bernardino investigation, noting that the FBI “did not pursue all possible avenues in the search for a solution” before contacting Apple.

This time, as the FBI letter makes clear, it has indeed explored all possible avenues to crack iOS encryption. That leaves only one possible next step: backdoors. Meanwhile, the stars are aligned with Florida case law, the country’s top legal brass is itching for an encryption fight, and it could well be that to the DOJ’s way of thinking, Apple has taken on the aroma of appeasement.
The bricks have been carefully laid. The US has learned its lessons from the 2015 San Bernardino tussle over encryption and won’t make the same mistakes. The atmosphere, both legally and politically, has shifted since 2015.
What could come next: a court fight to force Apple to unlock Alshamrani’s iPhones.
Readers, your thoughts: would the FBI have a better chance to win this time?

15 Comments

“Readers, your thoughts: would the FBI have a better chance to win this time?”
Nope. If the well-funded-by-the-US crackers in Israel couldn’t break into it, Apple isn’t going to be able to either.
If Apple did put in a back door for future situations, then people would start using readily available, free third-party encryption that can use even longer keys. The anti-encryption drum beaters will stop criminals from using these tools no better than any other anti-X law stops criminals from doing X.
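To illustrate how readily available that sort of encryption already is, here’s a minimal sketch, assuming Python 3 and the free third-party “cryptography” package (the message text below is made up), of a 256-bit AES-GCM round trip with no backdoor anywhere:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a random 256-bit key and a fresh 96-bit nonce.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)

aes = AESGCM(key)

# Encrypt, then decrypt, to show the round trip works.
ciphertext = aes.encrypt(nonce, b"meet at the usual place", None)
plaintext = aes.decrypt(nonce, ciphertext, None)
assert plaintext == b"meet at the usual place"

The library is free to download, which is the point: a backdoor mandate on big vendors would not take tools like this out of circulation.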


I’m pretty sure that I’ve heard this quote from somewhere:
“If you outlaw something, only outlaws will be able to get it.”
If they did indeed undermine encryption in software in general, my personal take is that it would make it easier for criminals to steal from people (because the general population’s encryption would be backdoored), while making it more difficult for law enforcement to catch said criminals, because they would most likely be the only ones using adequate, backdoor-free encryption to protect themselves.


Once everyone knows there is a backdoor, bad guys won’t use iPhones anymore.


Am I missing something? Even if Apple were forced to add a backdoor to their encryption for new iPhones, and “upgrade” existing iPhones with the backdoor, how would that help the FBI break the encryption on an existing iPhone which doesn’t have that backdoor?


Money spent on trying to get access would be moot in the future if backdoors were already there. They could just snoop anywhere for anything. We know from the early days of computing that any kind of access like this is a major risk.


When the government gets access, people will go elsewhere. This would be a crippling blow to our technology industry.


That’s what the US public service did, at least in part, to its fledgling cryptographic software industry in the 1990s – the EXPORT_GRADE restrictions of that era (e.g. 40-bit symmetric key maximum for exported ciphers) did US software vendors no favours.
IIRC, big US companies could *import* strong crypto, e.g. from UK companies, and then use it in both their US and overseas offices because the software they deployed overseas was never exported from the US. But if they “bought local” they would be stuck with weak crypto for their offices outside the US… so it was easier just to get an overseas office to buy non-US crypto code for the whole company.
And anyone who bought US software got stuck with code that supported deliberately weak ciphers – which they had to remember never to leave enabled, in case of “downgrade” attacks. Of which there were many…
https://nakedsecurity.sophos.com/2015/03/04/the-freak-bug-in-tlsssl-what-you-need-to-know/
https://nakedsecurity.sophos.com/2015/05/21/anatomy-of-a-logjam-another-tls-vulnerability-and-what-to-do-about-it/
The laws of unintended consequences…
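If you’re wondering what “never leave the weak ciphers enabled” looks like in practice, here’s a minimal sketch, assuming Python’s standard ssl module (Python 3.7 or later) and a hypothetical hostname, that pins a client to TLS 1.2 or newer and explicitly excludes export-grade and other weak cipher families:

import socket
import ssl

HOST = "example.com"  # hypothetical server, for illustration only

# Start from modern defaults...
ctx = ssl.create_default_context()

# ...then refuse anything older than TLS 1.2, so legacy suites
# cannot be negotiated at all.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Belt-and-braces: exclude export-grade and other weak ciphers
# using OpenSSL cipher-string syntax.
ctx.set_ciphers("HIGH:!aNULL:!eNULL:!EXPORT:!DES:!3DES:!RC4:!MD5")

with socket.create_connection((HOST, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        print("Negotiated:", tls.version(), tls.cipher())

Recent OpenSSL versions no longer include the EXPORT suites at all, so the exclusions above are mostly redundant on an up-to-date system, but being explicit is exactly the habit those downgrade attacks taught us.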


Yeah, this also reminded me of that fiasco. Naturally hindsight is 20/20 (or 6/6?), but you’d think enough experts would have been vocal enough that it wouldn’t have passed.
Clearly *now* the lesson has not been learned, as we’re discussing the same idiocy yet again.
I respect the (ostensible) intent, but these guys are saying no one can have good running shoes in case a cop needs to run after someone, someday, somewhere.


IMO it gets worse than that, because there are other parts of “the government” saying that you’d jolly well better take some steps, such as buying decent running shoes, to stay ahead of those self-same crooks.
This is a dilemma we have in the UK, where one part of the public service is instructing companies like ISPs to keep huge amounts of metadata about all their customers… while another part is urging us all to embrace the spirit of GDPR and not to collect data unless there is an obvious purpose.
In particular, companies are no longer supposed to collect and keep data just because they might figure out something useful to do with it in the future. Not just to prevent the overcollection and subsequent misuse that happened in the Cambridge Analytica case, but to reduce the amount of unused data left lying around in cupboards – cupboards that crooks know are well worth searching, perhaps for years.

