Anonymous Coward, commenting on a report from The Register about vulnerabilities that expose people’s browsing histories, pithily sums up the potential repercussions like so:
Sweetheart, whats this ‘saucyferrets.com’ site I found in your browsing history?
If you value your privacy and your ferret predilections, be advised that in August, security researchers from Stanford University and UC San Diego presented, during the 2018 USENIX Workshop on Offensive Technologies (WOOT), four new, privacy-demolishing attack methods to get at people’s browsing histories.
The novel attacks fit into two classic categories – visited-link attacks and cache-based attacks – and exploit modern browser features such as the CSS Paint application programming interface (API) and the JavaScript bytecode cache: two examples of evolving web code that don’t take privacy into account when handling cross-origin URL data, the researchers say.
So-called history sniffing vulnerabilities are as old as dirt, and browser code has addressed them in the past. Here’s a paper written on the issue back in 2000, and here’s a Firefox bug reported that same year about how CSS page disclosure could let others see what pages you’ve visited.
Old or not, common web browsers – Google Chrome, Mozilla Firefox, Microsoft Edge and Internet Explorer, and Brave – are all affected, to a greater or lesser degree, by the new sniffing methods, the researchers say.
Even most of the security-focused browsers they evaluated – they looked at ChromeZero, Brave, FuzzyFox and DeterFox – coughed up browsing histories in the face of two of their attacks. The Tor Browser alone stood fast against all four attacks: not surprising, since it doesn’t actually store users’ browser histories.
These attacks weren’t just “effective”; at least one of them was sizzling. One of the visited-link attacks – CVE-2018-6137, a bug in Chrome 67 that Google fixed in June – peeled off user browsing history at a rate of 3,000 URLs per second. The vulnerability allowed an attacker to figure out whether a link had been visited by using the CSS Paint API to check whether a “paint” method – used to change the color of visited links – had been invoked.
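For the curious, here is a minimal, hypothetical sketch of the ingredients that attack combined. The registerPaint worklet, CSS.paintWorklet.addModule and the paint() CSS value are the standard CSS Paint API pieces; the probe URL and file names are placeholders, and the exact way the attacker observed that the callback had run is precisely the channel Google’s fix closed, so treat this as illustrative only, not the researchers’ actual exploit code.

```javascript
// Hypothetical sketch of the CSS Paint API setup behind CVE-2018-6137.
// Idea: register a paint worklet, tie it to visited-link styling, then
// probe candidate URLs with <a> elements. Whether the custom paint()
// callback fired for a given link was the one-bit "visited or not" signal.

// --- visited-probe.js, loaded as a paint worklet ---
registerPaint('visited-probe', class {
  paint(ctx, size) {
    // Runs only for elements whose computed style actually uses
    // paint(visited-probe), i.e. links the browser styles as :visited.
    ctx.fillRect(0, 0, size.width, size.height);
  }
});

// --- attacker-controlled page ---
CSS.paintWorklet.addModule('visited-probe.js');

const style = document.createElement('style');
style.textContent = 'a:visited { background-image: paint(visited-probe); }';
document.head.appendChild(style);

// Probe one candidate URL; repeating this across thousands of <a> elements
// is what made a rate of thousands of URLs per second possible.
const probe = document.createElement('a');
probe.href = 'https://saucyferrets.com/';  // candidate URL to test
probe.textContent = 'probe';
document.body.appendChild(probe);
```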
Google fixed that one, but that leaves three of the vulnerabilities, as Deian Stefan, assistant professor in the UCSD computer science and engineering department, told The Register. Stefan said the remaining flaws are timing side-channel attacks, which makes them “considerably less severe” than the CSS Paint API attack.
Browser makers should take protecting against these attacks far more to heart than they do, the researchers say, given how much browsing history can reveal about somebody: age, gender, location, political leanings, preferred adult sites, and even who they are in the real world.
[O]ne user’s browsing history can spill other users’ secrets, thanks to social networking websites like Facebook and LinkedIn. Anyone who touches a search bar should care about safeguarding this sensitive data.
It should be pretty straightforward, the researchers said: “After all, the web platform provides no direct means for JavaScript to read out a user’s history.” It’s anything but, though, given that in practice, “things get more complicated.”
Browsers still allow web developers to perform a restricted (and occasionally dangerous) set of computations on history data. For example, using the CSS :visited and :link selectors, developers can conditionally style a link based on whether its destination URL appears in the user’s browsing history. And what a developer can do, an attacker can do too – so browsers must account for all kinds of abuse, like exploiting CSS selectors as side channels to “sniff” a URL for visited status.
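As a concrete, deliberately harmless illustration of that pattern, the snippet below styles links with :link and :visited and then tries to read the result back with getComputedStyle. The URL and colors are just placeholders; note that modern browsers deliberately report the unvisited style here, which is exactly why the newer attacks fall back to side channels rather than reading styles directly.

```javascript
// Conditional styling on visited status: the legitimate developer use case.
const style = document.createElement('style');
style.textContent = `
  a:link    { color: blue;   }  /* destination not in the browsing history */
  a:visited { color: purple; }  /* destination is in the browsing history  */
`;
document.head.appendChild(style);

const link = document.createElement('a');
link.href = 'https://example.com/';   // placeholder candidate URL
link.textContent = 'example';
document.body.appendChild(link);

// The classic sniffing trick was to read the computed color back.
// Browsers now report the :link color here whether or not the URL was
// visited, so attackers turned to side channels instead.
console.log(getComputedStyle(link).color);
```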
The four new attacks show that modern browsers are failing to safeguard browsing history data from web attackers, the researchers say.
Various browser features allow attackers to leak this data, in some cases at alarming rates.
As a fix, they propose a “same-origin-style” policy for history data, modeled on the browser’s same-origin policy (SOP): the web security model in which a browser only lets scripts in one web page access data belonging to a second web page if both pages have the same origin.
In other words: a web page should be able to show you which links you’ve visited on the same site, but not on other sites.
It would carry minor performance hits for browsers and would require a small change to how visited links are styled, but the researchers think the costs are worth the benefit to user privacy.
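To make the proposal concrete, here is a minimal sketch of the decision it implies, using a hypothetical mayStyleAsVisited helper (not the researchers’ implementation): visited styling is only applied when the link points back into the embedding page’s own origin.

```javascript
// Hypothetical helper illustrating the same-origin idea: a page may only
// see "visited" styling for links into its own origin.
function mayStyleAsVisited(pageUrl, linkUrl, visitedUrls) {
  const page = new URL(pageUrl);
  const link = new URL(linkUrl);
  return page.origin === link.origin && visitedUrls.has(link.href);
}

const visitedUrls = new Set(['https://saucyferrets.com/gallery']);

// A page on the same site can still highlight pages you've already read...
console.log(mayStyleAsVisited(
  'https://saucyferrets.com/', 'https://saucyferrets.com/gallery', visitedUrls)); // true

// ...but a third-party page gets no visited styling for that URL,
// and so learns nothing about your history on other sites.
console.log(mayStyleAsVisited(
  'https://nosy.example/', 'https://saucyferrets.com/gallery', visitedUrls)); // false
```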
Stefan said developers involved with the World Wide Web Consortium (W3C) are discussing the three remaining browser history attacks, which involve exploiting CSS 3D transforms, SVG fill-coloring, and the JavaScript bytecode cache. The researchers are also talking with Firefox and Chrome developers about measuring what kind of impact a fix would have on existing sites.
kruug
In the RSS feed, the description for this article states “Only one browser stood fast against a set of new browser history attacks.” The article never divulges which browser this was, though.
Mark Stockley
From the article: “The Tor Browser alone stood fast against all four attacks”.
Marc
So, nothing about the Safari web browser? Nothing about private browsing mode? Seems a half baked story.
Brian T. Nakamoto
Why no Safari?
Anonymous
I have several browsers plus Tor, which I only very occasionally use, and none of the ones I use were mentioned. So I wonder if they checked the browser that I use. What ones do I use? SeaMonkey, SlimBrowser, and Pale Moon. I don’t understand why people take the big guys instead of the little guys.
gumtreeblossom
A quick check of the whitepaper tells me that they never tested more than nine browsers: Chrome, Firefox, Edge, Internet Explorer, ChromeZero, Brave, FuzzyFox, DeterFox and Tor Browser.
Perhaps they just didn’t have the time to test others? Usually research studies the most common factors (most used browsers, for this one).