Naked Security

Resetting terrorist’s Apple ID password wasn’t a screwup, says FBI

The FBI first blamed county employees for resetting the password and foreclosing the option of fresher backups. Now, it says it was in on it.

No, the FBI says, changing the password on the San Bernardino terrorist’s iCloud account was not a screwup.

Prior to Saturday night, when the FBI released a “we did it on purpose” statement, some media outlets were reporting that county workers had gone ahead and changed the password on their own.

They did not, the FBI said. Rather, it was done by the county and the FBI working together.

The reason why this is all so important is that by resetting the password, the investigators got access to Syed Rizwan Farook’s backups, but only as recently as 19 October: weeks before he and his wife, Tashfeen Malik, allegedly carried out a mass shooting that killed 14 people and seriously injured 22 at the Inland Regional Center in San Bernardino, California.

In doing so, the bureau rendered it impossible to access later backups that could have contained information posted closer to the date of the attack.

The pair died in a shootout with law enforcement.

The Justice Department and Apple are in agreement over at least one thing: resetting the iCloud password means that getting a more recent backup – one that followed the 2 December incident – is out of the question.

Senior Apple execs said on Friday that were it not for the password change, the company may have been able to get at more recent backups of the information the government was after.

As Buzzfeed News reports, Apple execs told reporters that the company had proposed four different ways to recover the information the government wants, all without building a backdoor into its iOS encryption.

One method would have involved connecting the iPhone to a known Wi-Fi network and triggering an iCloud backup that might have delivered information stored on the device between 19 October and the date of the mass shooting.

The execs said that Apple had sent trusted engineers to try it.

But the engineers found that the Apple ID password associated with Farook’s iPhone had been changed sometime after his death: within 24 hours of the government getting its hands on it, in fact.

Changing the password obviated the chance to get at a fresh copy of the device data via the known-Wi-Fi-network method Apple had planned.

On Friday, the FBI blamed the resetting on the phone’s owner – Farook’s employer, the San Bernardino Health Department.

The FBI wrote in a court filing that somebody at the department had reset the password remotely:

The owner, in an attempt to gain access to some information in the hours after the attack, was able to reset the password remotely.

Not so, the San Bernardino County’s official Twitter account promptly stated.

Rather, “The County was working cooperatively with the FBI when it reset the iCloud password at the FBI’s request.”

In the statement put out late Saturday night, the FBI said that Farook’s iPhone was already locked when it was seized during a search on 3 December, making it “a logical next step” to get at iCloud backups and whatever evidence they may have held.

The FBI now says that it worked with San Bernardino County to reset the password on 6 December so as to get immediate access to the backup data.

Could investigators have gotten more evidence from fresher backups than 19 October?

As of Saturday, the FBI was shrugging at the idea:

It is unknown whether an additional iCloud backup of the phone after that date – if one had been technically possible – would have yielded any data.

As Naked Security’s Paul Ducklin remarked, forensic analysis of computer devices is hard enough if you have controlled offline access to a copy that can be replaced at will.

It’s way harder when you have the cloud in the loop, as well.

In such a situation, investigators are over a barrel: do they cut the phone off and shut it down in the hope of preserving where things were, in case the crooks try to wipe it?

(Of course, the phone user in this case was already dead, taking the chance of wiping his phone with him. But what if an accomplice had wiped it?)

Or do they leave it turned on and connected, in case they can’t ever get into it again?

Or then again, do they wait for a hardware engineer from the company in question to show up and try a method that might well have worked to get them the information they were after?

At any rate, the FBI still wants to unlock that iPhone, regardless of iCloud backups, it said, given that the phone itself will likely offer more data than a backup could:

Through previous testing, we know that direct data extraction from an iOS device often provides more data than an iCloud backup contains.

Even if the password had not been changed and Apple could have turned on the auto-backup and loaded it to the cloud, there might be information on the phone that would not be accessible without Apple’s assistance as required by the All Writs Act order, since the iCloud backup does not contain everything on an iPhone.

As the government’s pleadings state, the government’s objective was, and still is, to extract as much evidence as possible from the phone.

As far as Apple is concerned, the government is seeking an “unprecedented use” of the All Writs Act, a 1789 law that authorizes US federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”

Gizmodo’s Kate Knibbs notes that the government has leaned several times on the act to compel Apple and other communications companies to help with investigations.

Not all judges are sympathetic to uses of the writ, which, Knibbs notes, can be like handing judges a blank check if interpreted broadly.

But it has been used against communications companies in the past.

In 1977, the Supreme Court set a precedent for allowing its use to compel a telecom’s cooperation with the government as it conducted surveillance to set up a racketeering sting.

In a separate statement published on Lawfare, FBI director James Comey told people to stop freaking out about what he called the government’s pleadings, which constitute an order that Apple unlock Farook’s encrypted iPhone.

This isn’t about breaking encryption or setting loose some master key, Comey said.

Rather, it’s about developing software to enable brute-force password attacks.

Specifically, a US judge last Tuesday – 16 February – ordered Apple to develop a way to skirt an iPhone auto-erase security feature that kicks in after 10 failed password attempts.

The government simply wants that security feature disabled, Comey said:

We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land.
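Comey’s “decade” figure is easy to sanity-check. Apple has documented that the hardware ties each passcode guess to a key derivation taking roughly 80 milliseconds, so a back-of-envelope estimate (which ignores the escalating lockout delays and the 10-try erase – the very features the order asks Apple to disable) looks something like this:

```python
# Rough brute-force timing for iPhone passcodes, assuming Apple's
# documented ~80 ms per-guess key-derivation cost. The escalating
# lockout delays and 10-try auto-erase are deliberately ignored,
# since disabling them is exactly what the court order demands.
SECONDS_PER_GUESS = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits                       # all-numeric passcodes
    worst_case_s = keyspace * SECONDS_PER_GUESS   # try every combination
    print(f"{digits}-digit passcode: {keyspace:,} combinations, "
          f"worst case ~{worst_case_s / 3600:.1f} hours")
```

A four-digit passcode falls in well under an hour at that rate; only a long alphanumeric passphrase gets anywhere near Comey’s “decade”.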

Apple CEO Tim Cook has said that there’s nothing simple about it.

The government is after an encryption backdoor, Cook says: a deliberately programmed weakness that enables computer security to be sidestepped whenever it suits you.

“You” being the government, or maybe a rogue employee, or crooks. Where there’s a backdoor, there are inevitably people who know about it.

Tech companies have lined up to back Apple’s refusal to break its encryption.

Sophos is one of those companies, and it’s put out a #nobackdoors pledge that explains why.

Sophos has given the pledge its own page.

Comey writes that he hopes people will “take a deep breath and stop saying the world is ending, but instead use that breath to talk to each other.”

I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.

That’s a heart-rending argument.

But as we wrote when Sophos made its #nobackdoors pledge, this case goes far beyond the phone of one terrorist, and that’s Apple’s point:

Apple is determined to stand its ground, arguing that to create a programmatic backdoor, even in a dramatic case like this, would open a password-cracking Pandora’s Box.

To backdoor one iPhone would effectively betray all of Apple’s many millions of law-abiding customers, and pave the way for similar writs against other American companies and their customers.

Image of iPhone passcode courtesy of ymgerman / Shutterstock.com

22 Comments

If Apple has its own backup of its user database with salted-and-hashed passwords from before the reset, it seems to me it could just go get the previous hash and put it back in, effectively resetting the password to what it was before. Then the phone could back itself up. Surely I’m not the only one who thought of that, so there must be some reason why they can’t, but it would be interesting to know why not.
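The mechanics the commenter is assuming look something like this simplified sketch. (Apple’s real account-database scheme is not public; the PBKDF2 parameters and record layout here are purely illustrative.)

```python
import hashlib
import os

# Simplified salted password hashing, as the comment imagines Apple's
# account database working. Entirely illustrative -- Apple's actual
# scheme, parameters and storage format are not public.
def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 is a standard salted, iterated hash for stored passwords.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
old_record = hash_password("original-passphrase", salt)

# A "reset" just overwrites the stored record with the new password's hash:
record = hash_password("reset-passphrase", salt)
assert record != old_record

# The commenter's idea: restore the pre-reset record from backup, and the
# old password verifies again -- without anyone knowing its plaintext.
record = old_record
assert hash_password("original-passphrase", salt) == record
```

Note that this only restores the server-side account state; as a later comment points out, the keys that actually encrypt the device are local to the phone.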

That is a pretty good idea. In theory, anyway. In practice, assuming that Apple does have such a backup (they’d be crazy not to make backups, but old ones might be discarded by now), a few things off the top of my head might mess it up.
TL;DR: Apple has to make a LOT of considerations for its millions of other users and devices, and these complicate what would otherwise be a straightforward database edit.

1. Finding the backup. Which datacenter? Which tape? Are the DCs all redundant? (Tape drives are still insanely fast, much faster than hard drives or SSDs, so most enterprise-level businesses use those for backups. You can write to an entire tape in a few seconds.)
2. If the database backup takes up multiple tapes, they need to find all of them.
3. Backup integrity. Is it still usable? Most likely, it hasn’t been that long. But magnetic storage media (floppy drives especially, which they won’t be using here, thankfully) are infamous for losing data after sitting idle. I have *no* experience in database programming (database people think WEIRD), but I’d imagine that one bad tape, or a few bad records, could cause retrieval/reading/parsing issues.
4. Apple might not have the architecture in place to do a per-user restore like that manually.

1 and 2 can be rather easily dealt with by having labels and organization for tape storage. A big company like Apple would be insane not to have this, so these concerns can likely be ignored.

3 and 4, however, are legitimate concerns. Like I said in point 3, it hasn’t been that long since the change was made (~4 months), so the tapes are most likely usable. I’d be surprised if improvements in data retention haven’t been made since the days of 3.5″ floppies. As for point 4, it could be a “web portal only” sort of thing, where they can only change it based on user input, or just hit a “reset password” button that assigns something random. They might *need* to edit the database manually, and that might cause all kinds of trouble for everyone else.

With #4, they *could* theoretically set up a server specifically for that iPhone, and redirect only its traffic to that server, but that would take a lot of time to set up and test. You don’t want anyone else’s iPhone trying to authenticate against a machine running from months-old backups.

What kind of backup tape are you writing in a few seconds? Multiple TBs on tape in LTO5, LTO6 or LTO7 would take a lot longer than a few seconds. Backup to (SSD) disk, then backup to tape afterwards, would be superior. It also depends on the backup job type, etc. I would bet $19 billion that there are no single-user backups to individual tapes; that would be insane.

Anyway, I think you may have made tapes sound way better than they are. I really question your knowledge of enterprise backups…

At the end of the day, I would be highly surprised to find out Apple uses tape media at all.

The FBI screwed up. They should figure out how to write a custom update for the phone themselves, not Apple. They’re lazy.

Tape drives are not fast; they never have been, and I’ve been using them since the late 70s. Disk speeds are orders of magnitude faster.

There are several stories/reports on exactly why that is not possible. Passwords/encryption keys are local to the device and non-recoverable.

I’ve never liked Apple as a company, but I’m starting to see them in a new light now that they’re standing their ground on this one – and good on them! Well done to everyone at Apple who is standing their ground and fighting this.

So what if some drug dealers, terrorists, thieves and murderers use encryption… we must protect the innocent data!

Ahem. The regulators also *insist* that we innocents use strong encryption to protect data we hold on other people – and rightly so!

As has been posted on many sites, encryption tools that are just as unbreakable are available from all over the world. The only thing that would come of allowing the Apple hack is people moving to other products that no government can compel to be unlocked. It would just make Apple unsafe, and it wouldn’t solve any future locked-data problem. Who would buy a product that everyone knows has backdoors in it? Do any online banking…

Comey said: “We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land.”
But that is exactly what you want! You want Apple to make a software program that will do all that, then GIVE it to you, so you can use it whenever you please. This country, THIS government, has become the axis of evil in the world: it spies on the populace, it demands that every software program have a back door so Uncle Sugar can crawl through to get an eyeful of everything that we do. Get out, stay out! Apple should leave this country, lock, stock and barrel, and tell the FBI and that stupid judge to go pound sand. What I really, personally, would tell them is to stick the “All Writs Act” up their ass, and if they cannot find the room, give me a call and I will help them!

The Government is trying to Force Apple to produce something that doesn’t exist, and then give it to them.

It sounds like indentured servitude to me. The FBI isn’t offering to pay Apple to develop something. It is trying to use force to make Apple cooperate and give them something.

The court order said Apple is to do everything “reasonable” to help the FBI. Writing and testing brand-new software isn’t reasonable. No one knows how long it would take to do what the FBI wants. Is it a 10-minute change by one guy, or will it take 100 programmers a year? The FBI certainly doesn’t know. Therefore, the request is unreasonable on its face.

I have long said that if the only evidence of a crime exists on an encrypted cell phone, then the government doesn’t really have enough evidence to make a case. If the only evidence they can find of other parties’ involvement exists on this phone, I am doubtful there are actually any co-conspirators to find.

Can we get simple facts correct, please?

“(Of course, the phone owner in this case was already dead, taking the chance of wiping his phone with him. But what if an accomplice had wiped it?)”

That’s not what you wrote in an earlier paragraph: “On Friday, the FBI blamed the resetting on the phone’s owner – Farook’s employer, the San Bernardino Health Department.”

I think Apple should comply on one condition; that the phone not be out of their control with the custom firmware on it. There would then be no chance it could leak.

So if the FBI had just consulted Apple before having the passcode changed, all of this might have been avoided? Sounds to me like questionable competence and/or motives on the part of the FBI.

So the FBI admits that doing a backup before changing the password may not have yielded any data. But now they’ve burnt that bridge, they’re certain that hacking into the phone via Apple writing custom firmware will yield data?
That’s either hypocrisy, stupidity or they’re manipulating this case into getting a result they want to reuse elsewhere.

With regard to the “known” network and the hazards of leaving the phone on: both are solvable, with varying degrees of effort. To avoid being wiped, the phone should have been placed in an RF isolation enclosure. The “known” network should have been a clone of the San Bernardino Health Department network, its signal limited to said RF enclosure. The “network” should have been firewalled from the general network, so that only iCloud was reachable. As attorneys say: “shoulda, coulda, woulda”.

