Security researchers often dump on users for their cruddy password practices. But what about the developers who write the code that’s supposed to keep our passwords safe?
…as in, what’s up with the developers who fail to properly encrypt/salt/hash, who use outdated password storage methods, who copy-and-paste code they found online (vulnerabilities and all), who leave passwords sitting around in plain text, or who don’t understand the difference between encryption and hashing?
There have only been a few studies looking at how developers handle end-user password storage, even though developers’ choices are central to the security of those passwords. After all, a user reusing a password can have dire results for that individual, but a developer failing to hash and salt a password database can lead to a far more widespread problem.
One such study, from 2017-2018, used computer science students as lab rats to examine how developers deal with secure password storage.
Saving face
The results: they didn’t. Without explicit prompting, none of the students implemented secure password storage. When asked why they didn’t, many of them said they would have, if they’d been creating real code, for a real company, for a real project that would actually see the light of day, as opposed to writing for an academic study.
As it was, the students were told that the task was to create a university social networking website, but they knew no real data would be under threat if they made a mistake.
More recently, researchers from the University of Bonn decided to redo the experiment. This time, though, they’d use “real” developers, conceal the fact that the work was a study, pretend instead that the code was for a real startup, and pay them around €100-€200 (USD $112-$225).
The results: no difference. Students and “paid” developers recruited from Freelancer.com alike seldom implement secure password storage unless prompted, and even then they harbor misconceptions about how to do it and reach for outdated methods.
From the study:
Our sample shows that freelancers who believe they are creating code for a real company also seldom store passwords securely without prompting.
In addition, we found a significant effect in the freelancers’ acceptance rate between the €100 and €200 conditions for the prompted task and examined the effect of different payment levels on secure coding behavior. We saw more secure solutions in the €200 conditions, although the difference was not statistically significant. However, this result might be due to the small sample size and we believe this is worth following up in future work.
The not-real real-life project
For the recent study, the researchers changed the described task from a university social networking platform to a sports photo-sharing social network. To make it more believable, they created a web presence for the company, and they posed as company employees when they hired the freelance developers. They told the freelancers that they’d just lost their developer and needed help to finish the registration code.
They posted the project on Freelancer.com, stipulated that they needed Java skills, and offered €30-€100 (USD $34-$112), with an expected working time of 1-15 days. In the final study, they jacked that up to €100-€200.
Of the 260 developers the researchers had narrowed the pool down to, only 43 took up the job, which involved using technologies such as Java, JSF, Hibernate, and PostgreSQL. The researchers paid half of them €100 and the other half €200, in an effort to figure out whether paying more would buy more password security.
Then, the researchers created a playbook to make sure their interactions with the freelancers were consistent. For example, if a developer asked if he or she should store passwords securely, or if a certain method was acceptable, the researchers answered “Yes, please!” and “Whatever you would recommend/use.”
If a participant delivered a solution where passwords were stored in plain text in the database, the researchers replied, “I saw that the password is stored in clear text. Could you also store it securely?” Those participants were marked as having received a security prompt.
We deliberately set the bar for this extra request low, to emulate what a security-unaware requester could do; i.e., if it looked like something hashed, we accepted it.
The researchers said that the freelance developers took three days to submit their work, and that they had to ask 18 of the 43 to resubmit their code to include a password security system after they’d first submitted a project that stored passwords in plaintext.
Most of the developers who were asked to resubmit their code – 15 out of 18 – hadn’t been explicitly told that the user passwords should be stored securely. One developer in that non-prompted group did ask whether he should store them securely… but then handed in a plaintext project within three hours, before the researchers had replied.
Misconceptions
Both students and freelancers suffered from some misconceptions, the researchers said, but not necessarily the same ones. While the students confused password storage security with data transmission security, some of the developers treated encoding as if it were a synonym for encryption.
Eight of the freelancers stored user passwords in the database using the binary-to-text encoding scheme Base64 – basically, a way of re-encoding data so it can pass through text-only systems, not a way of keeping information secret from prying eyes. One of them nonetheless argued that “the clear password is encrypted” and that “It is very tough to decrypt.” Other developers were confused about MD5, which is a fast hashing function long considered unsuitable for password storage – and not a form of encryption at all.
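Base64’s weakness is easy to demonstrate: it’s reversible by design, so anyone who can read the database can recover every password with a single decode call. A minimal Python sketch (the password value here is made up for illustration):

```python
import base64

password = "hunter2"  # hypothetical user password

# "Protecting" the password with Base64, as eight freelancers did
stored = base64.b64encode(password.encode()).decode()
print(stored)  # aHVudGVyMg==

# Anyone who sees the stored value can reverse it instantly -- no key required
recovered = base64.b64decode(stored).decode()
assert recovered == password
```

There is no secret anywhere in this scheme, which is why encoding is not encryption.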
In fact, of the password storage methods the developers did implement, only two – PBKDF2 and Bcrypt – are considered secure.
Only 15 of the 43 developers used salting, which makes hashed passwords harder to crack by mixing random data into each one before it’s hashed. The study also found 16 examples of “obviously copied” code: code that appeared to have been copy-and-pasted from online sources rather than developed from scratch, and which could be outdated or riddled with bugs.
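Why salting matters is easy to show: without it, two users who pick the same password end up with identical stored hashes, so a single precomputed (rainbow-table) lookup cracks them all. A small illustration (plain SHA-256 is used here only to show the effect; a deliberately slow password hash such as PBKDF2 or Bcrypt is what you’d actually want):

```python
import hashlib
import os

password = "letmein"  # two users happen to pick the same weak password

# Unsalted: identical passwords yield identical hashes,
# so cracking one entry cracks every matching account.
unsalted_a = hashlib.sha256(password.encode()).hexdigest()
unsalted_b = hashlib.sha256(password.encode()).hexdigest()
assert unsalted_a == unsalted_b

# Salted: a fresh random value per user makes the stored hashes differ,
# even though the underlying password is the same.
salt_a, salt_b = os.urandom(16), os.urandom(16)
hash_a = hashlib.sha256(salt_a + password.encode()).hexdigest()
hash_b = hashlib.sha256(salt_b + password.encode()).hexdigest()
assert hash_a != hash_b
```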
Don’t expect developers to know you need security
The lesson that the researchers came away with: keep your security expectations low.
Even for a task which – for security experts – is obviously security-critical, like storing passwords, one should not expect developers to know this or be willing to spend time on it without explicit prompting: ‘If you want, I can store the encrypted password.’
…but then again, there might be other takeaways from a study like this…
You get what you pay for
We can understand why this study may lead some developers to rage-throw their laptops out the window. How many programmers would interpret the lowball €100 project offer as a signal that the work would just be a placeholder, destined to be rewritten before it went live as part of the purported photo-sharing social network?
On the plus side, the “how many developers” question can be re-framed like this: only 43 of the 260 developers whom the researchers approached took up the job. That’s just 16.5%, and Naked Security’s Mark Stockley thinks that’s a good thing:
It’s reassuring that so few developers were prepared to take this on. Perhaps, instead of criticising the small number of developers prepared to work down to a price instead of up to a standard, we should applaud the silent majority that seem to have rejected an undeliverable brief out of hand.
That said, the research should act as a reminder to buyers that security rarely happens by accident: you have to make it important in your projects. It should also serve as a reminder to developers that clients often don’t know what to ask for, so if they don’t raise the issue of security with you, you need to speak up.
It’s as reasonable to assume that the 83.5% of coders who ignored the job realized that it was a security disaster and therefore that most coders have both knowledge and scruples, as it is to draw inferences from the desperate 16.5% who were prepared to do days of coding for $112.
For what it’s worth, the researchers wound up paying all the participants €200, in order to be fair.