Naked Security

How is Apple doing in its fight for #nobackdoors?

The court of public opinion doesn't seem to have decided who to support yet. How would you argue the case for or against?

In the battle between Apple and the US government over security backdoors, it’s hard to say who is winning and who is losing, not least because the fight is far from over.

Apple prevailed in a court case brought by the US government earlier this month, when a judge in New York rejected the government’s arguments for unlocking a convicted drug dealer’s iPhone and ruled in Apple’s favor.

And just this week, the government dropped its case against Apple in California, saying that the FBI had found a way to access data on a locked iPhone connected with a dead terrorist, and therefore no longer needed Apple’s help.

This will no doubt be a relief to privacy advocates, civil liberties groups, and numerous technology companies and organizations that have sided with Apple.

The court of public opinion, however, still seems divided.

The Pew Research Center conducted a survey of 1,002 Americans in late February, shortly after Apple’s refusal to comply with a court order requiring it to break the security on an iPhone seized after the San Bernardino mass shootings.

In that survey, 51% said Apple should unlock the iPhone, while 38% said Apple should not unlock the iPhone and 11% were unsure.

On the other hand, another survey, conducted by Reuters, went the other way – 46% said they supported Apple, 35% sided with the FBI, and the rest were unsure.

If the comments that followed when we explained Sophos’s own stance against backdoors are anything to go by, Naked Security readers are strongly in favor of #nobackdoors

…but there still seem to be plenty of people out there who think that you can strengthen security by weakening it.

How would you argue the case?

Note. You may comment anonymously simply by leaving the name and email fields blank in the comment form.


Image of sliced apple courtesy of Shutterstock.com.

26 Comments

I would argue that there are too many ways that the means to access these back doors will land into the hands of criminals and terrorists.

They will be shared with the UK government and left on a train. Don’t laugh http://news.bbc.co.uk/1/hi/uk/7449927.stm

Apple employees and/or bent cops will take bribes, or worse yet will sympathise with ISIS.

Hackers the world over will attack iPhones looking for the breach, now that they know it’s there.

They will be much more useful to the bad guys than to the good guys, because the bad guys, unlike governments, don’t have huge resources of their own. ISIS would love the chance to search Washington for iPhones that have visited gay rights websites.

The government doesn’t have the right to force anyone to give anything to them. The federal government needs to be put back in its place. I hope they don’t push the public to violence in order to see freedom return to “we the people”.

I suspect that it is the protection of the freedom of “we the people” that is at stake here, not the federal government’s (or any government of a free people’s) “place”. And as such, and as “our” government(s), we should expect them to use “all necessary means” to obtain potentially life-threatening information from suspects, dead or alive, in order to have a chance of saving lives and our freedoms.

I think you are missing the fact that the government is here to serve us, and NOT to violate the Constitution or its Amendments. All of us would lose, along with the country and the government.

It won’t be long until we have the option of hardware encryption, and there won’t be any back doors. They will be SOL (Sh*t Out of Luck) in any attempt to crack it. This maneuver is temporary: as hardware advances, without the passphrase there is no decode, at least in this century. What would they do with my hard drive? It’s encrypted with public (GNU) software, and if I don’t use it enough, I forget the passphrase… a couple of days in jail and I’d forget all my passphrases. The real security here is that failing the code erases the data: simple and secure.
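The “failing the code erases the data” policy the commenter describes can be sketched as a simple attempt counter. This is a hypothetical illustration only, not Apple’s actual implementation (iOS enforces the equivalent in firmware and hardware), and the names here are invented for the sketch:

```python
# Toy sketch of a "wipe after N failed attempts" policy.
# Purely illustrative; not how any real device implements it.

MAX_ATTEMPTS = 10  # assumed limit, mirroring iOS's optional 10-try wipe

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False          # nothing left to unlock
        if guess == self._passcode:
            self._failed = 0      # a correct guess resets the counter
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True     # "erase the data" (here: just a flag)
        return False

d = Device("4351")
for g in ["0000"] * 10:
    d.try_unlock(g)
print(d.wiped)  # True: ten wrong guesses trigger the wipe
```

The point of the design is that brute force becomes self-defeating: guessing at the lock destroys the very data the attacker is after.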

US and UK politicians and agencies always seem to use the terrorism stick to scare citizens and feed paranoia but it’s not always about using stolen smartphones for evil or terrorism and it’s not always ISIS that we need to be scared of. To be honest I think that most of us simply don’t want back doors in our mobile computing devices because we don’t want thieves accessing our finances or our (or our kid’s) personal photographs.
I think that if there are existing vulnerabilities that Apple aren’t aware of, or techniques that can be used to access data that should be protected by a passcode, then Apple should be told, so that they can patch them.

Well, the phone got unlocked. This means there is some unknown third party out there that has some unknown procedure for unlocking it. Kinda gives you a warm fuzzy feeling, Not!

It didn’t necessarily get unlocked, and the FBI is carefully not saying that it was. Or wasn’t.

https://nakedsecurity.sophos.com/2016/03/29/fbi-cracks-that-iphone/

It’s kind of weird that currently the iPhone 5 and iPad 2 can easily be unlocked via Siri, and nobody talks about it…

I firmly believe that there should be back doors controlled by Apple in this case or the particular party offering the software, or hardware in general. Like wire-tapping phones, use of the back door should require a warrant from a judge or court on a case-by-case basis. No, I don’t want to protect the suspect’s rights, and I believe that Apple and other designers can create back doors that are unavailable to the public, perhaps via proprietary hardware.

Let’s examine the above statements:

Wire tapping: wire tapping intercepts data in transit via “man in the middle”. This is a completely different beast than enabling third party access to data at rest. Wire tapping is transient, and will only capture the data that is currently being sent. In order to wire tap, you have to have physical access to the connection at the time the conversation is in progress, and it targets a specific recipient. Wire tapping by a third party is possible, but not very beneficial. Pen registers collect more information, but once again are targeted. Back doors have to be baked in, and are always present, not just when a court order shows up.

Suspect’s Rights: Anyone can be a suspect. This doesn’t mean the suspect did anything wrong, just that someone suspects they might have done something wrong. Waiving rights because someone is suspicious is waiving liberty to gain a feeling of security.

Tools unavailable to the public: What we’ve seen since the dawn of computers is that tools unavailable to the public inevitably end up in the hands of the powerful, both those using them for justice and those using them for personal gain. With over seventy years of examples where flaws and back doors in information security cause more problems than solutions (you could argue that allied forces breaking German encryption during WWII was a good thing, but the rest of the world wasn’t using the same system), and not one example of a back door that remains unavailable to the public for any period of time, the burden of proof for this one lies with anyone making the claim that it’s possible, not with the field of experts who think even attempting it is a really bad idea.

+1.

A backdoor is like setting up a permanent wire-tap (a TI, or telephone intercept, in British English) on everyone, recording every conversation ever, and storing the transcripts in a giant cupboard under the stairs.

Then you hope that the cupboard will only ever be accessed by authorised parties, who will only ever access individual data items that they’ve been given a warrant for, who will only ever use that data for the express purposes listed in the warrant, and who will never let anyone else see that data, whether by accident or by design.

And, of course, you hope that no crook will ever figure out what that giant cupboard is for.

Is this a safe way to do things?

Chelsea (née Bradley) Manning and Edward Snowden suggest otherwise, along with hundreds of other “we’ll look after your data, honest!” data breaches in recent times.

I hate to pull the old man card on you, but those who ignore history are doomed to repeat it. Everything from the ’60s onward has shown that no “backdoor” has ever kept its secret for long. You can’t prevent disgruntled employees or others from selling or using this access. We have an Amendment that guarantees our persons against unreasonable search and seizure. Don’t worry, it won’t be long before nobody can decode it without the passphrase, court order or not…

Yes, there should be. However, for this to happen, Apple would need to employ only perfectly trustworthy unicorns. Since those don’t exist, Apple won’t be able to guarantee retaining control of the back doors.

I hope there will be just as much pressure put on the government to responsibly disclose to Apple how they accessed the data on the iPhone 5C so that Apple can inform its customers if there is a security vulnerability that affects modern iPhones.

The rule of law can’t be superseded by a newfound right to digital privacy.
If the current security models (for banking, etc) are at risk if a court orders that a suspect’s data be made available to the court, then I suggest that a new security model is needed.

The right to privacy, at least in the U.S., is not a “newfound” right. Digital, analog, doesn’t matter, it’s still a basic right that should not be abridged without a really good reason (search warrant, based on reasonable evidence of wrongdoing).

I have one simple opinion on whether Apple should backdoor their stuff or not: No. The conversation could go on all day, but NO is my final answer, period.

NoBackDoor – A backdoor means
1. you must be able to trust the holder of the key not to give your stuff away – not just today in a democracy but always
2. you must trust that the holder of the key will never lose it …
3. once the door is there, it will eventually be opened; no need to add extra entry points

From this and other tech sites, it looks like the FBI most likely cracked that iPhone by taking it apart, copying the flash storage chip, and then repeatedly trying passcodes, using a fresh copy of the storage chip every time they ran out of guesses.

If that is the case, then that is the kind of back door that I am fairly relaxed about. Technically the FBI were able to breach Apple’s security, but in practice the technique is not remotely exploitable; it is expensive, time consuming, and cannot be done without the suspect’s knowledge. The cost and time involved mean that it can only be used against serious criminal suspects, and can’t be used for mass surveillance. The fact that it destroys the phone in the process means that it can’t be covertly used against political opponents without alerting them.

In my view, this sort of backdoor actually represents a reasonable compromise between the needs of law enforcement and civil liberties. If Apple were to design a deliberate feature into future iPhones so that they could be cracked in that way, I would be happy, and would not be deterred from buying one.

Does that work? Or is the ever-decreasing count of remaining guesses saved in the “secure enclave” along with the encrypted AES key, where it can only be reset at the same time as the device is reflashed?
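The mirroring trick speculated about above can be sketched as a toy simulation: keep a backup copy of the (simulated) flash, and whenever the guess counter hits the device’s limit, swap a fresh copy back in. This is purely illustrative Python with invented names; it assumes the attempt counter lives on the mirrored storage, which is exactly the point questioned in the reply above, and real NAND mirroring is a hardware procedure, not software:

```python
import copy

ATTEMPT_LIMIT = 10  # assumed guesses allowed before the device would wipe

class Flash:
    """Toy stand-in for the phone's NAND storage, holding the attempt counter."""
    def __init__(self):
        self.failed_attempts = 0

def brute_force(passcode, candidates):
    flash = Flash()
    backup = copy.deepcopy(flash)  # "mirror" the chip before guessing starts
    for guess in candidates:
        if flash.failed_attempts >= ATTEMPT_LIMIT:
            # Restore a fresh copy: the counter is reset along with the data.
            flash = copy.deepcopy(backup)
        if guess == passcode:
            return guess
        flash.failed_attempts += 1
    return None

# All 10,000 four-digit codes, far beyond the 10-guess limit:
found = brute_force("7294", (f"{i:04d}" for i in range(10000)))
print(found)  # "7294"
```

If the counter were instead held in a separate secure element that survives the restore, as the reply suggests, the `flash = copy.deepcopy(backup)` step would accomplish nothing and the attack would fail.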

What are the costs? Pure and simple… ask ourselves how much Google, Yahoo, Amazon and a whole host of others know about you. Why are some so shocked and outraged when a law enforcement agency applies, in a very public way, for a lawful search warrant concerning murder, terror and the like, for the phone of a killer now deceased (thanks to the police), whose phone was owned by a second party (the County), NOT the personal property of the killer? Apple’s argument was a “what if”; it was ALL SPECULATION: “If we give in, they will be able to do this to everyone.”

Is there a nuance missed by some? It is what Apple is NOT saying. By denying assistance to law enforcement, in effect Apple is taking the side of the terrorist, are they not? Even a killer like the one in the instant case, and/or ANY POTENTIAL killer, deserves to keep ALL OF THEIR TERROR PLANNING SECRET. But they couch it all for YOU, the poor everyday citizen who is being spied upon by the mean old FBI every day, everywhere. Trust Apple: we’re a corporation who makes $ from you. We believe in profits, and therefore we should be trusted with ALL of YOUR INFORMATION… that you provide us, because we really care about you.

Somewhat ironic that Apple (prior to this case) was completely open to receiving subpoenas and cooperating with law enforcement court orders for YOUR INFORMATION, and they gave it ALL UP WILLINGLY. Correct me if I’m wrong, but I didn’t see Apple cite one instance, from all the times before this that they GAVE AWAY ALL YOUR INFORMATION to the FBI, where the FBI did such bad things with it. Only AFTER creating this NEW device did they all of a sudden have this epiphany, which they also “marketed” to all our “POOR CONSUMERS” whom we respect more than those bad old FBI agents and Federal Judges. That’s why we have ALL OF OUR PHONES MADE IN CHINA: we trust them! The Chinese government would NEVER think of hacking into ANY DATA that Apple already keeps on our dear consumer friends… nor would they ever devise a plan to plant a hacking device into phones during production… no, never. After all, we at APPLE just sold thousands of our iPhones to the US Government (check DHS’s purchase of new iPhones for ICE); we at Apple stand on integrity; we will NEVER give those mean old government people ANY ASSISTANCE TO STOP TERROR AND KILLING… but we’ll be MORE THAN HAPPY TO SELL THEM AN iPHONE. Rest safely, dear customers…

The 51% who think “Apple should unlock the iPhone” are the 51% who think Apple has a “Magic Key” to open any iDevice they want – the people who don’t quite understand that this is a tool that would need to be invented and then handed over.

Someone did it already for them; that’s why the FBI “bought” it. Again, they (Apple) and others – perhaps yourself, I don’t know – want it ONLY ONE WAY. Why does Apple “trump” public safety? They built it “intentionally” without a means to comply for anyone? I can think of a whole host of products and companies that don’t do that. Where is the fairness/balance in that? One would think the entire world would be irreparably harmed by having an “unlocking” device.

First and foremost, show me where ALL of this harm is right now. Not some “maybe”, not some “it could be”, not some “well, we don’t trust them”. Just articulate for me (this includes APPLE TOO), right now, this very minute, the “harm” that has been caused by ALL of the previously manufactured APPLE brand products that THEY THEMSELVES have turned over information on, based upon a lawful warrant, where it went rogue and worldwide harm was caused “based upon the information supplied”. Someone just articulate the “FACTS” explaining how Apple turning over information (as it has routinely done in the past on ALL of its other products) led APPLE to all of a sudden intentionally build a device that has no means to do that.

I just find it strange that Apple seems to think the government is asking them to travel down a “dangerous” path when someone apparently had it already and sold it to the FBI.
So in light of that event, even Apple doesn’t know what’s out there, do they? Furthermore, the entity that (supposedly) sold it to the FBI has been suggested, though by no means confirmed, to be the Israelis… so if the FBI bought it from someone else, then again, Apple,
where is the “harm” that you speak about? It was already out there and you didn’t even know about it. The FBI could have just bought it, never mentioned anything about it, said publicly “we give up”, and you would never have known.
