Naked Security

Apple CEO Tim Cook sticks to his guns: “No encryption backdoors”

Tim Cook of Apple told the CBS show 60 Minutes that "if you put a backdoor in, that backdoor's for everybody." So, no, Apple won't do it.

Apple CEO Tim Cook appeared on CBS’s 60 Minutes TV show last night.

As you can probably imagine, the topic of encryption came up, in particular the issue of backdoors.

Bluntly put, a backdoor is a deliberate security hole – for example, an undocumented master decryption key – that is knowingly added to a software product.
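To make that concrete, here is a minimal, purely illustrative sketch in Python (our own example, not code from Apple or from any real product): a password check that quietly accepts a hardcoded "master" credential alongside the user's own password. Every name and value in it is invented.

    import hmac

    MASTER_KEY = "support-override-2015"  # the undocumented backdoor (invented value)

    def check_password(supplied: str, users_real_password: str) -> bool:
        # The legitimate "front door": compare against the user's own password,
        # using a constant-time comparison.
        if hmac.compare_digest(supplied, users_real_password):
            return True
        # The backdoor: a second, hidden way in that works for anyone who
        # learns, leaks or guesses the master key.
        return hmac.compare_digest(supplied, MASTER_KEY)

Anyone who discovers MASTER_KEY, whether by reverse-engineering the binary, reading leaked source code or hearing it from a former support technician, gets in exactly as easily as the intended users do.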

Some backdoors are there as a temporary convenience, for example to speed things up during development, a bit like wedging your real-life back door open while you’re shuttling the garbage out into the yard.

But temporary software backdoors have a way of getting forgotten, and ending up in production builds, which is a strong argument for avoiding backdoors in the first place, convenience notwithstanding.

Some backdoors are there as a “feature”, for example so that the support desk can help you more quickly if you are on the road and forget your password, without needing to read you a lengthy, one-off recovery code that you have to type in within a limited time.

But backdoors like this soon end up widely known, and widely misused, which is a strong argument for avoiding backdoors in the first place, convenience notwithstanding.

Lastly, some backdoors are requested by law enforcement or a country’s regulators, supposedly as an aid in fighting crime.

The claim is that strong encryption that can’t be cracked gives criminals and terrorists an unfair advantage, because it means they can communicate without fear of their conversations being eavesdropped or investigated.

Unbreakable encryption, say its detractors, is as good as contempt of court, because crooks can laugh at search warrants that they know can’t be carried out.

But Tim Cook told 60 Minutes that he doesn’t agree:

Here’s the situation…on your smartphone today, on your iPhone. There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in. There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys.

Which is a strong argument for avoiding backdoors in the first place, convenience notwithstanding.

Indeed, mandatory cryptographic backdoors will leave all of us at increased risk of data compromise, possibly on a massive scale, by crooks and terrorists…

…whose illegal activities we will be able to eavesdrop on and investigate only if they too comply with the law by using backdoored encryption software themselves.

In other words, Tim Cook is right: if you put in cryptographic backdoors, the good guys lose for sure, while the bad guys only lose if they are careless.

We know this because we have tried enforcing mandatory backdoors before, and it did not end well.

In the 1990s, for example, the US had laws requiring American software companies to use deliberately weakened encryption algorithms in software for export.

The US legislators intended that these export-grade ciphers would make it safe to sell cryptographic software even to potential enemies because their traffic would always be crackable.

But the regulations ended up affecting Americans in a double-whammy:

  • International customers simply bought non-US products instead, hurting US encryption vendors.
  • EXPORT_GRADE ciphers lived on long after they were no longer legally required, leaving behind weaknesses, later exploited by the FREAK and LOGJAM attacks, that potentially put all of us at risk (see the sketch just below).
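By way of a sketch of the fix (our own illustration, not anything prescribed by the regulations themselves), a TLS client can simply refuse export-grade and other weak cipher suites, which is the kind of configuration that blunts downgrade attacks such as FREAK and LOGJAM. This uses Python's standard ssl module, with example.com standing in for any TLS server:

    import socket
    import ssl

    # Start from sensible defaults, then explicitly exclude export-grade,
    # anonymous, null and other weak suites (OpenSSL cipher-string syntax).
    context = ssl.create_default_context()
    context.set_ciphers("HIGH:!EXPORT:!aNULL:!eNULL:!DES:!3DES:!RC4:!MD5")

    # Illustrative connection only.
    with socket.create_connection(("example.com", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="example.com") as tls:
            print("Negotiated cipher suite:", tls.cipher())

If the server can only offer export-grade suites, the handshake fails outright, which is exactly what you want.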

Those who cannot remember the past are condemned to repeat it.

💡 LEARN MORE – To encrypt or not to encrypt? We explore the issues ►

💡 LEARN MORE – The FREAK bug, a side-effect of weakened encryption ►

💡 LEARN MORE – The LOGJAM bug, another side-effect of weakened encryption ►

Image of open doors courtesy of Shutterstock.

11 Comments

If back doors are for everyone, are front doors also for everyone?


Loosely speaking, and by analogy, a “front door” in a cryptographic system is the regular one where you provide your own key material to lock or unlock.

In other words, an expected entry with consistent access control.
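By way of a small illustration (our own sketch, assuming the third-party Python cryptography package): with a front door, the only key that opens the data is the one the owner supplies; there is no second, built-in key.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # your own key material: the "front door"
    box = Fernet(key)

    token = box.encrypt(b"health records, business secrets")
    plain = box.decrypt(token)    # only this key opens it; no master key exists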


What about for convenience? ;-)


Convenience is king! I always leave my front door wide open; I removed the locks months ago.

My walls have interesting paint patterns (and I thought I recalled a sofa and television here), but I never have to dig in my pockets for keys while carrying groceries into the house.


Ah, I know what you’re doing wrong. You really do need to lock the door properly with the strongest locks available to keep the crooks out. Then, to get the convenience you need, put the keys under the second plantpot to the left.

(That’s the trick, you see. Everyone thinks it will be under the *first* plantpot, so as long as I don’t mention the secret of the second plantpot on any public websites, and we keep it to ourselves, you’re golden!)


I chuckled out loud at this, and then shuddered at how analogs to the “Second Pot Stratagem” are taken as serious security by more than (just you and me…shhh)

but I’m okay now; it’s funny again.
:-)


From my perspective here in the US, there’s hyper fear-mongering due to 2016 presidential politicking exploiting terrorism. IMHO, security and privacy advocates need a better message for the average American to explain the problem with backdoors for encryption, or we’re going to wind up doing something sillier than banning the export of cryptography.


That’s the thing: the US didn’t *ban* the export of cryptography back in the 1980s and 1990s. It just set official standards for export that resulted in the widespread deployment of cryptographic tools that the US could crack…and so could everyone else. In short: you can’t improve cryptographic security by deliberately weakening it.

