
FBI expert calls Apple ‘jerks’ as encryption tension simmers


Apple has been called many things in its time but never, as far as anyone can remember, “jerks” by an FBI employee speaking publicly at the International Conference on Cyber Security in Manhattan this month.
The man who made these remarks – senior FBI forensic expert Stephen R. Flatley – reportedly followed this up by describing the company as “pretty good at evil genius stuff.”
We don’t have the full context of these remarks – was Flatley perhaps being humorous? – but the seriousness of the conflict that prompted the barbs is not in doubt.
It began on the day in September 2014 when Apple launched iOS 8, after which the company said it could no longer access data on an encrypted iOS device – even when a government agency presented it with a warrant.
The technical backdoor that had always been there as a last resort for investigators was sealed. As the company explained this new world:

Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.

As far as the FBI was concerned, shutting out investigators was an obstructive decision by Apple, while from Apple’s point of view, it had no choice. It was following the logic of encryption, which is that a security design in which a backdoor exists will end up being equivalent to no security at all.
Flatley also complained that Apple keeps ratcheting up iOS security, recently raising the number of password-hashing iterations from 10,000 to 10 million. This meant:

Password attempts speed went from 45 passwords a second to one every 18 seconds. […] At what point is it just trying to one up things and at what point is it to thwart law enforcement?

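The arithmetic behind that complaint is simple: raising the iteration count of the key-derivation function a thousandfold makes every password guess roughly a thousand times more expensive, which is broadly in line with the figures Flatley quoted. As a rough, hypothetical illustration only – a generic PBKDF2 sketch in Python, not Apple's actual passcode-derivation design, with a placeholder guess and salt – this shows how the iteration count drives the cost of each guess:

# A generic, hypothetical sketch: how a key-derivation function's iteration
# count changes the cost of each password guess. PBKDF2 is used here purely
# for illustration; it is not Apple's actual passcode-derivation scheme.
import hashlib
import time

def time_one_guess(iterations):
    # Derive a key from a placeholder passcode guess and salt, and time it.
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"0000", b"per-device-salt", iterations)
    return time.perf_counter() - start

for iterations in (10_000, 10_000_000):
    cost = time_one_guess(iterations)
    print(f"{iterations:>10,} iterations: {cost:.3f} s per guess")

The absolute times depend entirely on the hardware doing the guessing, but the thousandfold ratio between the two counts is what turns dozens of attempts per second into roughly one every 20 seconds.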
Not coincidentally, Flatley’s boss, FBI director Christopher Wray, used the same event last week to argue that encryption backdoors would not compromise wider security – a viewpoint many in the security industry have vigorously disputed for years.
Wray said encryption prevented the FBI from accessing 7,775 mobile devices in 2017, though he didn’t say how many of those were Apple’s.
It’s the type of statistic that will probably be bandied around more often. Clearly, the FBI thinks – probably correctly – that it plays well with public opinion to keep repeating the argument that unbreakable encryption stymies serious crime investigations.
But it could also be that the simmering conflict between the world’s largest technology company and America’s biggest law enforcement bureau over encryption is becoming increasingly redundant.
Encryption is spreading and improving, regardless of what Apple does. Apple knows this, just as all technology firms do, and wants to be on the right side of technological history should interest in privacy spread, as some believe it will.
Internet users might then face the dilemma of life in two competing universes – one in which they are watched by governments and hemmed in by controls and the other in which private companies offer them a patchwork of partial freedoms out of economic self-interest.

20 Comments

There are two edges to the encryption sword. If an alleged criminal uses an Apple device because of its strong, no-backdoor encryption, that’s good. But if the user forgets the password, there is no way to recover it short of brute force…
An alleged criminal could just as well make a hard copy of an encrypted document that might be of interest to law enforcement. While they might have a warrant that justifies collecting the document, they still don’t have a backdoor for decrypting it. In either case, the document or the phone could be physically destroyed.

Reply

There are no more brute-force attacks on iPhones: after 10 wrong password attempts, the phone automatically wipes itself back to factory settings. If the owner is diligent about backing up their phone, it’s not a problem.
In Britain the authorities had no idea a phone could be remotely reset to factory settings. They made an arrest, the suspect’s phone was confiscated, and by the time it was sent to forensics for analysis it had been erased – it took them eight months to figure out what had happened.

Reply

Is “wipe after 10” the default behaviour? I am not sure that it is – IIRC I had to turn it on myself; it was off by default.

Reply

First, Apple is in business to make money. If the government can access information, then who would buy that equipment? I wouldn’t (and I don’t), and many others wouldn’t either. It would make our technology ‘unwanted’ in the IT world. We knew by the ’70s that backdoors were a bad thing. LE will have to find better ways to deter crime or ‘detect’ the possibility of criminal activity in some other way. Like when a phone is pitched over the side of a boat into three-mile-deep ocean – it’s gone.

Reply

Correct answer below :)
“…and the other in which private companies offer them a patchwork of partial freedoms out of economic self-interest.”

Reply

They really need to ban anyone with access to classified material from using devices with known back doors, because once those back doors exist you can count on foreign intelligence – and likely criminal and terrorist groups – having access within months.

Reply

I agree.
Given how often classified material ends up getting left on trains, they also need to ban backdoored devices on trains, just to be on the safe side.

Reply

The way I see it, the ramping up of Apple security is a direct result of the unlawful warrantless surveillance the governments of the world are doing. If the US government in particular were playing fair and seeking a judge’s permission, then maybe the backdoor would have been left open. I’m behind Apple 100% as far as security goes – though I still prefer Android for my own use.

Reply

Apple wants business, and relying on outdated encryption is not conducive to better sales. As the industry develops, encryption becomes a necessity, not an option. Governments never play fair. Most judges do not have enough knowledge to make good judgment calls when computers and/or networks are involved, which generally leads to a much wider scope of ‘surveillance’ than required. Don’t be under the illusion that a back door can be kept secret – it cannot. In my experience with computers, a back door was known to be where the riffraff sneaked in.
No back doors!!!

Reply

So what does Sophos Mobile encryption do in this case? Will you protect my data if the spooks try to get into my phone?

Reply

AFAIK, we don’t have the keys. You (and usually your employer, or whoever manages the product for you in your business) do.

Reply

So, Apple has millions of phones made in China, and sells millions of them in China, but none of them can be cracked? Somehow I doubt it.

Reply
