The Dark Web is reflecting a little more light these days.
On Monday I wrote about Memex, DARPA’s Deep Web search engine. Memex is a sophisticated tool set that has been in the hands of a few select law enforcement agencies for a year now, but it isn’t available to regular users like you and me.
There is another search engine that is, though.
Just a few days before I wrote that article, on 11 February, Virgil Griffith took to the Tor-talk mailing list and announced Onion City, a Dark Web search engine for the rest of us.
The search engine delves into the anonymous Tor network, finds .onion sites and makes them available to regular users on the ordinary World Wide Web.
Up to now the best way to search for .onion sites has been to get on the Tor network using something like the Tor browser, but Onion City effectively does that bit for you so you can search from the comfort of your favourite, insecure web browser.
The site can do this because it’s a Tor2web proxy – a bit of software that acts as a go-between for the regular web and the Tor network.
It acts as a Tor client inside the Tor network and presents the sites it finds as regular web pages using subdomains of .onion.city.
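The address mapping behind that scheme is simple enough to sketch. Here's a hypothetical Python helper (the function name and the example address are my own, purely for illustration; they're not part of any Onion City API) showing how a .onion hostname becomes a subdomain of onion.city:

```python
def tor2web_url(onion_host: str) -> str:
    """Rewrite a Tor hidden-service hostname as an Onion City URL.

    A Tor2web proxy exposes each .onion site as a subdomain of its
    own domain, so 'example.onion' is served at 'example.onion.city'.
    Plain http:// because, as noted below, the site doesn't offer
    HTTPS yet at the time of writing.
    """
    if not onion_host.endswith(".onion"):
        raise ValueError("not a .onion address: " + onion_host)
    return "http://" + onion_host + ".city/"

print(tor2web_url("example.onion"))  # http://example.onion.city/
```

The neat side effect of this design is that every hidden service gets an ordinary web URL, which is exactly what lets Google's crawler see it.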
One of the consequences of working this way is that Onion City search results are just regular web pages like any other, which makes them visible to you, me, the Onion City search engine and also, for the first time, Google.
In fact, Onion City’s search functionality is a Google Custom Search, so if you can find something on Onion City you should be able to find it on Google too.
At the time of writing, there are about 650,000 Dark Web pages that have found their way into the regular Google index via Onion City.
Of course – as any small business owner can tell you – just because Google knows a website exists doesn’t mean the site’s pages will rank well. But those pages are at least in the mix now, enjoying their first rays of sunshine.
I took a quick look around and, because this is the Dark Web, I searched for amphetamines, 9mm ammunition and hackers for hire, and yes, it’s all in there.
If you’re tempted to do the same, I encourage you to read the rest of the article before you start.
Onion City isn’t doing anything wrong and it’s not ‘outing’ anyone on the Dark Web; it’s just providing a means for regular web users to search for things they would otherwise have to work a little harder to find.
It reduces the barriers between people who want to find or consume something and people who want to provide it in a way that’s untraceable.
Any sites that don’t want to be indexed by Onion City can exclude themselves from the index in the same way that regular sites exclude themselves from regular search engines, using a humble robots.txt file (although that does mean they appear in a public list of sites that don’t want to be indexed).
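For a hidden service operator, opting out looks exactly like it does on the regular web: serve a robots.txt file from the site's root telling crawlers to stay away. A minimal version blocks everything for every crawler:

```
User-agent: *
Disallow: /
```

Of course, as noted above, a robots.txt is itself public, so asking not to be indexed is also a way of announcing that you'd rather not be found.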
Onion City users would be foolish to use it for anything illegal, though, because they get no protection whatsoever.
If you want to browse the Dark Web without leaving a trail, Onion City can’t help – you still need to be on the Tor network using a Tor browser.
This is a Tor2web proxy, so the Tor part where the .onion sites reside is as secure as Tor, and the web part where you and I reside is as insecure as the web (it isn’t even available over HTTPS yet).
What we users get is convenience, nothing more.
If you’re wondering why that would be useful, just think about sites like Wikileaks.
Wikileaks is a clearing house for materials provided by whistleblowers. Up to now it has had to choose between being vulnerable and public or safe and difficult to find (a situation that’s resulted in it taking out some bizarre protection).
With Onion City, Wikileaks would have the option to retreat into the safety of the Tor network without sacrificing visibility.
Like all frontiers the Dark Web is lawless, exciting, disorganised and potentially dangerous. Frontiers don’t last though; people move in, the law moves in and basic utilities are established.
Thanks to Onion City, the cyber-frontier now has its first set of street lights.