
Concerns ignored as Home Office pushes ahead with facial recognition

Picking faces out of a crowd using software is unreliable and fraught with problems, whichever side of the pond you’re on

Sure, automatic facial recognition (AFR) has its problems.

Those problems don’t seem to be troubling the UK government, though. As The Register reports, the Home Office is rushing full-speed ahead with the controversial, inaccurate, largely unregulated technology, having issued a call for bids on a £4.6m ($5.9m) contract for facial recognition software.

The Home Office is seeking a company that can set it up with “a combination of biometric algorithm software and associated components that provide the specialized capability to match a biometric facial image to a known identity held as an encoded facial image”. The main point is to integrate its Biometric Matcher Platform Service (BMPS) into a centralized biometric Matching Engine Software (MES) and to standardize what is now a fractured landscape of FR legacy systems.

All this in spite of an AFR pilot having proved completely useless when London police used it last year to pre-emptively spot “persons of interest” at the Notting Hill Carnival, which draws some 2m people to the west London district on the last weekend of August every year. Of the 454 people arrested last year, the technology didn’t flag a single one as a prior troublemaker.

Failure be damned, and likewise for protests over the technology’s use: London’s Metropolitan Police plan to use AFR again to scan the faces of people partying at Carnival this year, in spite of the civil rights group Liberty having called the practice racist.

The carnival is rooted in the capital’s African-Caribbean community, and AFR adds insult to injury: the community that police plan to subject to face scanning is still reeling from the horrific June 14 fire at Grenfell Tower, the blackened shell of which looms over the area where Carnival takes place. Many of the at least 80 dead or missing were from that community.

It’s probably safe to say that no group likes to be treated like a bunch of criminals by law enforcement grabbing their mugshots via AFR.

But those with dark complexions have even more reason to begrudge the surveillance treatment from a technological point of view.

Studies have found that black faces are disproportionately targeted by facial recognition. They’re over-represented in face databases to begin with: according to a study from Georgetown University’s Center for Privacy and Technology, in certain states black Americans are arrested at up to three times their share of the population, and mugshot databases reflect that skew. When a demographic is over-represented in a database, whatever error rate the technology has will fall on that demographic disproportionately often.
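To make that multiplication concrete, here’s a minimal Python sketch. Every number in it is a made-up assumption for illustration (the search volume, the error rate, and the population and database shares are not figures from the Georgetown study); the only point is the arithmetic.

    # Illustrative sketch: how over-representation in a face database
    # multiplies expected false matches for a demographic. All numbers
    # are hypothetical, chosen only to make the arithmetic visible.

    def expected_false_matches(searches, false_positive_rate, database_share):
        # Assumes false positives land on database entries roughly in
        # proportion to each group's share of the database.
        return searches * false_positive_rate * database_share

    searches = 100_000        # hypothetical searches run per year
    fp_rate = 0.01            # hypothetical per-search false-positive rate
    population_share = 0.13   # group's share of the general population
    database_share = 0.39     # group's share of the database (3x over-represented)

    proportional = expected_false_matches(searches, fp_rate, population_share)
    skewed = expected_false_matches(searches, fp_rate, database_share)

    print(f"False matches at proportional representation: {proportional:,.0f}")
    print(f"False matches at 3x over-representation:      {skewed:,.0f}")

Same algorithm, same error rate: the over-represented group simply absorbs three times as many wrong matches.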

Beyond that over-representation, facial recognition algorithms themselves have been found to be less accurate at identifying black faces.

During a recent, scathing US House oversight committee hearing on the FBI’s use of the technology, it emerged that 80% of the people in the FBI database don’t have any sort of arrest record. Yet the system’s recognition algorithm inaccurately identifies them during criminal searches 15% of the time, with black women most often being misidentified.

That’s a lot of people wrongly identified as persons of interest to law enforcement. According to a Government Accountability Office (GAO) report from August 2016, the FBI’s massive face recognition database has 30m likenesses.
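For a rough sense of scale, run the cited figures through a back-of-the-envelope calculation. The database size, the no-arrest share and the error rate come from the GAO report and the hearing as described above; the annual search volume is a made-up assumption.

    # Back-of-the-envelope using the figures cited in this article;
    # annual_searches is a hypothetical assumption for illustration.
    database_size = 30_000_000   # likenesses, per the GAO report
    no_arrest_share = 0.80       # share with no arrest record, per the hearing
    error_rate = 0.15            # misidentification rate in criminal searches

    print(f"Entries with no arrest record: {database_size * no_arrest_share:,.0f}")
    # -> 24,000,000 people searchable despite never having been arrested

    annual_searches = 50_000     # hypothetical
    print(f"Expected misidentifications per year: {annual_searches * error_rate:,.0f}")
    # -> 7,500 wrong matches at that volume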

The problems with American law enforcement’s use of AFR are replicated across the pond. The Home Office’s database of 19m mugshots contains hundreds of thousands of facial images that belong to individuals who’ve never been charged with, let alone convicted of, an offense.

Another commonality: in the US, one of the things the House committee focused on in its review was the FBI’s retention policy for facial images. In the UK, controversy has likewise arisen over police retention of images. According to biometrics commissioner Paul Wiles, the UK’s Police National Database holds 19m images, a figure that doesn’t even cover all police forces: most notably, it excludes those of the largest force, the Metropolitan Police. A Home Office review was bereft of statistics on how those databases are being used, or to what effect, Wiles said.

How did we get to this state of pervasive facial recognition? It certainly hasn’t been taking place with voter approval. In fact, campaigners in the US state of Vermont in May demanded a halt to the state’s use of FR.

The American Civil Liberties Union (ACLU) pointed to records showing that the Vermont Department of Motor Vehicles (DMV) has conducted searches on people merely alleged to be involved in “suspicious circumstances”, including minor offenses such as trespassing or disorderly conduct. Some records reference no criminal conduct whatsoever.

UK police have been on a similarly non-sanctioned spree. The retention of millions of people’s facial images was declared illegal by the High Court back in 2012. At the time, Lord Justice Richards told police to revise their policies, giving them a period of “months, not years” to do so.

“Months”, eh? Let’s hope nobody was holding their breath, given that it took five years. The Home Office only came up with a new set of policies in February of this year.

The upshot of the new policies: police have to delete the photos, but only if the people in them complain, and only if keeping the photo doesn’t serve some vague, undefined “policing purpose”.

In other words, police will delete the photos if they feel like it.
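The decision logic the policy describes boils down to something like this tongue-in-cheek sketch, with “policing purpose” as undefined here as it is in the policy itself:

    # A sketch of the retention logic the new policy appears to describe.
    # Note the defaults: no complaint means no deletion, and an undefined
    # "policing purpose" overrides a complaint anyway.
    def must_delete(subject_complained: bool, serves_policing_purpose: bool) -> bool:
        return subject_complained and not serves_policing_purpose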

As The Register notes, there’s simply no official biometrics strategy in the UK, despite the government having promised to produce one back in 2013.

That sounds familiar to American ears. The FBI has likewise been flying its FR technology untethered by the rules that do exist: it’s required by law to publish a privacy impact assessment before it uses FR, yet for years it did no such thing, as became clear when the FBI’s Kimberly Del Greco, deputy assistant director of the bureau’s Criminal Justice Information Services Division, was grilled by that House committee in March.

The omission of a privacy impact assessment means that we don’t know the answer to questions such as: what happens if the system misidentifies a suspect and an innocent person is arrested?

Nobody knows, apparently. States have no rules or regulations governing the use of real-time or static facial data, nor any that say whether this data can be accessed for less serious crimes that don’t require a warrant.

It’s almost as if law enforcement in both countries have discovered a new tool to make their job easier but want to use it on the quiet, with as little fuss as possible, and hopefully without all these messy, inconvenient civil rights questions and all those tiresome protests.


Comments

I wonder who is getting kickbacks for buying equipment that doesn’t do its job. Maybe it’s time to look at who owns interests in these AFR companies.

I have been afraid of this for a long time. Already, certain estate agents take your photo when you sell a property (not ones I will use again!) and certain banks take your photo when you open a bank account, all without notice or asking your permission. I have also been photographed by the police just for standing on the side of the road talking to someone taking part in a demonstration, during a break when they were letting the built-up traffic through. It was a perfectly well-ordered demo and one I agreed with: they were demonstrating against the casualisation of labour after a student doing a holiday job was decapitated by an aggregate grabber on his second day on the job. He hadn’t been given proper instructions and the safety chain had been left off. Most sane people would think that was worth demonstrating against, so why photograph it?

