Naked Security

Campaigner who refused to hand over passwords found guilty

Muhammad Rabbani was found guilty of obstructing justice after refusing to unlock his laptop and smartphone

Last November, Muhammad Rabbani was detained and questioned by border police at Heathrow Airport in London, where they demanded he hand over the password and PIN that would unlock his laptop and smartphone.

He refused, saying the devices contained confidential data connected to the case of a man he’d just met in Qatar, who alleged he’d been tortured while in US custody.

According to Rabbani, who works as international director for campaign group CAGE, he’d been stopped from entering the UK before but had never been asked to give up his PINs. Charged under the Terrorism Act 2000, he later offered this explanation at a hearing in May 2017:

There were around 30,000 (documents) which I was especially uncomfortable handling and I felt an enormous responsibility to try and discharge the trust that was given to me.

Earlier this week, Rabbani was found guilty of obstructing justice, given a conditional discharge and ordered to pay £620 ($830) in costs.

Normally, that would be that, but this case is different, starting with the fact that Rabbani’s organisation, CAGE, is controversial for reasons too involved to explore here.

From a privacy and security perspective this incident, and the subsequent court case, may have implications for the thorny issue of when you can be legally compelled to reveal encryption keys in the UK, and perhaps beyond.

In the UK, people can be charged with not providing encryption keys or passwords under one of two pieces of legislation.

General provisions are set out in the Regulation of Investigatory Powers Act 2000 (RIPA), amended to this effect in 2007, while terrorism suspects are covered by Schedule 7 of the Terrorism Act 2000, the law used in this case.

Schedule 7 has been deployed before, notably in the 2013 case of David Miranda, who was forced to hand over passwords for devices containing data connected to the Edward Snowden leaks.

Meanwhile, RIPA was used in 2014 to add four months to the sentence of a convicted terrorist who refused to hand over the password for an encrypted USB key.

The USA, by contrast, lacks specific key disclosure laws, although judges can still order individuals to disclose them. One former policeman accused of storing child pornography, for example, has been jailed indefinitely until he lets authorities into his hard drive.

In a similar vein was the 2014 case of Lavabit, which went out of business rather than hand over private SSL keys that would have exposed the communications of one of its users, reportedly Edward Snowden.

So, what if anything, does Muhammad Rabbani’s fine for withholding his encryption password mean for the average person, say, travelling to or from the USA or UK?

In reality, at a time when the average Android and iPhone ships with strong encryption turned on by default, things remain as they have been for a while now: anyone going to or from the UK, USA, and a growing list of other countries, can be asked for a device password, whether they are suspected of an offence or not.

People entering the USA are now routinely asked to supply passwords to devices and even social media accounts in order to meet visa requirements or, in some cases, gain admission. Presumably, most people quietly comply for the sake of convenience.

The more secure devices become, the greater their storage, and the deeper the conviction that they might hold data police should be looking at, the more universal these demands become. There is no escaping it. I expect the whole world will be like this soon.

The only way to avoid being asked for an encryption key is not to travel with a device on which such a thing can be used. It’s unsatisfactory but that’s how it is. For people concerned about privacy, this is a depressing choice.


Maybe it’s easier to keep your private data heavily encrypted in the cloud? That way, people can snoop your device to their heart’s content.

Or travel with two hard disks, one in your laptop and one in your hold luggage. Swap them when you get past customs.

Or have two phones. Your real phone, in your hold luggage and a cheap burner phone in your pocket.

I do not advocate crime in any way whatsoever, and I support any just and democratic government in protecting its citizens. But I’m pointing out that there are lots and lots of ways around a guy at customs demanding the credentials to your laptop or phone, and if you did have something to hide, I’d bet anybody with a few brain cells would find a way to do it.


I was reading the story of a journalist who simply wipes his phone to defaults before every trip, then restores from cloud backup once he lands and is through customs.


That sounds even more appealing than a burner. Nice.

Okay, bad guys… don’t read the previous post.


The problem with this is that instead of data being where you can personally manage it, you’re putting it on someone else’s servers, where anyone can access it. You have to ensure that the encryption key protecting your cloud account is protected.
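One concrete way to keep that cloud key under your own control is to derive it from a passphrase you memorise, rather than storing it anywhere. A minimal Python sketch using the standard library’s PBKDF2 (the function names and passphrase here are my own illustrative choices, not anything from the article):

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 32-byte encryption key from a memorised passphrase
    using PBKDF2-HMAC-SHA256; the slow iteration count makes
    brute-forcing the passphrase expensive."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

# The salt is stored alongside the ciphertext; it is not secret.
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32
```

Because the key exists only while you can recall the passphrase, the cloud provider holds nothing but ciphertext and a salt, and there is no key file on the device to be seized at the border.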

Imagine this: border guard asks for your phone and PIN. Upon opening the phone, they discover that it’s been wiped. They then ask you for your cloud ID and password.

Do you refuse, give them an “empty” cloud ID and password that doesn’t contain much, or give them the real thing?

Meanwhile, when some government agency serves your cloud provider with a gag notice and demands they hand over your account contents, even if they can’t decrypt the contents, they’ve still got the logs showing where you were accessing the data and when.

To complicate things further, if I travel with work data (which is global), some countries’ privacy laws require me not to reveal corporate data. So if an official from one country demands access to that data, what do I do? I violate one nation’s laws if I comply, and another nation’s laws if I don’t.

The only solution I can think of to avoid running afoul of the privacy/security dichotomy is to have plausible deniability baked into all the information systems you use. Of course, this may cause further issues, especially if you use it to hide activity an official legally needs to know about.


Very good points. I’m lucky enough (subjective?) that I’ve not needed to travel for work in a LONG time, so traveling with work data was “off my mind” when I wrote that.

I’m personally not a fan of anything in the cloud (we dinosaurs can resist with futility), but a few docs on Google Drive and my Gmail…well it’s already there anyway. The convenience is palpable with a wiped phone, tied to a barely-used Google account. Lesser of multiple evils?

I think it’s time to invoke H. L. Mencken…
“Explanations exist; they have existed for all time; there is always a well-known solution to every human problem — neat, plausible, and wrong.”


We need a system that can have multiple passwords, where the system opens a different environment based on which password is entered.


That sounds like a variation on the old plausible-deniability technique used by TrueCrypt and others. If a volume could be made invisible, then the user couldn’t be asked for a password to open it.
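As a toy illustration of that multiple-password idea, here is a minimal Python sketch of a container whose slots can only be located by knowing a password, padded with random decoys so the number of real environments is hidden. All names are my own, and the payloads are left unencrypted for brevity; a real design (like TrueCrypt’s hidden volumes) would also encrypt each slot with a key derived from its password:

```python
import hashlib
import os

def slot_id(password: str, salt: bytes) -> bytes:
    # Each password maps to an opaque 32-byte slot identifier; without
    # the password you cannot tell which (or how many) slots are real.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def make_container(environments: dict) -> tuple:
    """Build a container mapping password-derived IDs to payloads,
    plus random decoy slots indistinguishable from real ones."""
    salt = os.urandom(16)
    slots = {slot_id(pw, salt): data for pw, data in environments.items()}
    for _ in range(4):  # decoys: random ID, random payload
        slots[os.urandom(32)] = os.urandom(64)
    return salt, slots

def open_container(salt: bytes, slots: dict, password: str):
    # Returns the payload for this password, or None if nothing matches.
    return slots.get(slot_id(password, salt))
```

Handing over one password reveals one environment and nothing about the others; an examiner cannot prove any further slot exists.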


You can’t put anything with a battery in it in hold luggage, and it will be X-rayed, so it WILL be found. So that’s a no to phones or laptops in the hold, for example.

You might be able to justifiably say that you don’t know your SocMed passwords because they are stored on, and accessed from, your home PC (but make sure the accounts are OFF your phone AND laptop, because you’d need to unlock them to “prove” this).

The best solution seems to be the advice above: keep a backup in the cloud and wipe the phone to a basic config, possibly with a secondary iCloud account to log in with, and maybe a secondary SocMed profile. All of which makes you seem like you have something to hide, of course, but these people don’t care about the difference between something you want to keep private and something illegal that you have to hide.

Or just don’t go to these places – and let their consulates and embassies know that you won’t go. Especially if you are a business traveller or, better yet, a decision maker who schedules travel.


Conflicting national laws, as mentioned in an earlier comment, are an interesting notion (you’re damned if you do, damned if you don’t). I’d just add that in most cases, one should ask whether they are legally allowed to “export” this data in the first place. For EU residents (and personal data), this takes on a new dimension under the GDPR.


I think there are two practical steps, good for most people.

1. Leave your normal phone at home and travel through airports with a simple old-fashioned mobile.

2. Have a large VeraCrypt volume on your laptop with a hidden volume. Keep most of your stuff in the outer volume, the password to which you will give up under pressure; and key documents in the hidden volume.

