Naked Security

AI-generated ‘skeleton keys’ fool fingerprint scanners

Artificial intelligence can be used to 'grow' fake fingerprints that pack in common features, fooling scanners.

We’ve had fake videos, fake faces, and now, researchers have developed a method for AI systems to create their own fingerprints.
Not only that, but the machines have worked out how to create prints that fool fingerprint readers more than one time in five. The research could present problems for fingerprint-based biometric systems that rely on unique patterns to grant user access.
The research team, working at New York University Tandon and Michigan State University, used the fact that fingerprint readers don’t scan a whole finger at once. Instead, they scan parts of fingerprints and match those against what’s in the database. Previous research found that some of these partial prints contain features common to many other partial prints. This gives them the potential to act as a kind of skeleton key for fingerprint readers. They are called MasterPrints.
The researchers set out to train a neural network to create its own MasterPrints that could be used to fool fingerprint readers into granting access. They succeeded, with a system that they call Latent Variable Evolution (LVE), and published the results in a paper.
They used a common AI tool for creating realistic data, called a Generative Adversarial Network (GAN). They trained one network, the discriminator, by feeding it lots of real images along with artificially generated ones, so that it learned to tell the two apart. A second network, the generator, used the discriminator's feedback to produce ever more realistic images, and the two repeated the process, each getting better at its job.
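To make the adversarial loop concrete, here is a deliberately tiny sketch in numpy: a one-dimensional "generator" and a logistic "discriminator" trained against each other. The real fingerprint data is stood in for by samples from a normal distribution; all names and parameters here are illustrative, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def d_prob(x, w):
    """Discriminator: logistic probability that sample x is 'real'."""
    return 1.0 / (1.0 + np.exp(-(w[0] * x + w[1])))

def g_sample(z, theta):
    """Generator: map latent noise z to a synthetic sample."""
    return theta[0] * z + theta[1]

# Stand-in for real fingerprint features: samples from N(4, 1)
real = rng.normal(4.0, 1.0, size=256)

w = np.zeros(2)                 # discriminator parameters
theta = np.array([1.0, 0.0])    # generator parameters
lr = 0.05

for _ in range(300):
    z = rng.normal(size=256)
    fake = g_sample(z, theta)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    p_real, p_fake = d_prob(real, w), d_prob(fake, w)
    grad_w0 = np.mean((p_real - 1.0) * real) + np.mean(p_fake * fake)
    grad_w1 = np.mean(p_real - 1.0) + np.mean(p_fake)
    w -= lr * np.array([grad_w0, grad_w1])

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator
    p_fake = d_prob(g_sample(z, theta), w)
    coeff = (p_fake - 1.0) * w[0]   # gradient of -log D(fake) w.r.t. fake
    theta -= lr * np.array([np.mean(coeff * z), np.mean(coeff)])
```

In the real system both networks are deep convolutional models and the samples are fingerprint images, but the push-pull structure of the training loop is the same.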


The researchers took these generated images and tested them against fingerprint matching algorithms to see which got the best results. They then used an evolutionary algorithm to evolve the fingerprints, making those results even better.
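The evolve-and-test idea can be sketched as a simple mutate-and-keep-best search. This is a bare-bones stand-in for the paper's Latent Variable Evolution, with a made-up scoring function: a real implementation would feed each candidate latent vector through the trained generator and score the resulting image with an actual fingerprint matcher.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a matcher's score: how closely a candidate
# feature vector resembles features common to many enrolled partial prints.
common_features = rng.normal(size=16)

def match_score(candidate):
    return -np.linalg.norm(candidate - common_features)

# Elitist hill climbing over the latent vector: mutate, keep the best.
latent = rng.normal(size=16)
best = match_score(latent)
start = best

for _ in range(2000):
    child = latent + rng.normal(scale=0.1, size=16)
    score = match_score(child)
    if score > best:
        latent, best = child, score
```

Because only improvements are ever kept, the match score can never get worse from one generation to the next, which is what lets the search steadily "grow" prints that matchers like.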
In effect, the AI system is using mathematical algorithms to grow human fingerprints that can outsmart biometric scanners.
The team used two datasets to train its fingerprint generator: a set of traditional rolled ink fingerprints, and a set of fingerprints captured by capacitive readers like those found in smartphones. The capacitive fingerprints produced better results.
Biometric systems like fingerprint readers can be set to different security levels by adjusting their false match rate: the percentage of non-matching fingerprints that the system wrongly approves. The research team tested fingerprint matching algorithms at a 0.1% false match rate, which should mistakenly approve a wrong fingerprint one time in every thousand. The matchers accepted the generated MasterPrints, which the team calls DeepMasterPrints, 22.5% of the time.
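Those rates compound quickly when an attacker gets several tries. A short illustration of the arithmetic (five attempts is an arbitrary choice here, roughly mirroring how many tries a phone typically allows before locking out):

```python
# Probability that at least one of n independent attempts is (wrongly)
# accepted, given a per-attempt acceptance rate p.
def p_any_accept(p, n):
    return 1.0 - (1.0 - p) ** n

# An ordinary non-matching finger at a 0.1% false match rate:
ordinary = p_any_accept(0.001, 5)   # about 0.5%

# A DeepMasterPrint at the 22.5% rate reported above:
master = p_any_accept(0.225, 5)     # about 72%
```

In other words, at the same security setting, five tries with DeepMasterPrints succeed almost three times out of four, versus roughly one in two hundred for a random wrong finger.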
The researchers said that the LVE method seemed to be producing partial fingerprint images that contain enough common characteristics to fool fingerprint readers at rates far higher than average. They added that these artificial prints could be used to launch a practical attack on fingerprint readers.

Experiments with three different fingerprint matchers and two different datasets show that the method is robust and not dependent on the artifacts of any particular fingerprint matcher or dataset.

This is all a little worrying: if someone is able to spoof your fingerprints, they don't need to steal them (and if they do, you can't upgrade or change your fingerprints). If someone developed this into a working exploit, perhaps by printing the images with capacitive ink, it could present problems for many fingerprint recognition systems.

4 Comments

It seems like common sense that biometrics are the ultimate failure for security, for the reasons you mention: “if someone is able to spoof your ” After all, it’s just data to the systems. Just feed a system the data it expects, even if you have to inject it into the wire, bypassing the scanning device.

There is another flaw – you can’t change your biometrics. The next time Facebook has a breach, do they send an email saying “please replace your fingers”?

This is a very clever hack that shows the weakness of fingerprint scanners: attackers no longer have to steal your actual fingerprint, as with the old super glue and gelatin trick, but can simply make up a set of generic prints and test them until one works.
Aside from that, in the US there is established case law that the police can use your finger to unlock your phone, but cannot force you to give up a password to unlock a device so it behooves you to make them go before a judge before they can unlock your phone.

This is why MFA is important, preferably including different TYPES of factors, not just multiple of the same type of factor. Securing systems where high security is important with MFA should include at least 2 (if not 3) of the following:
• Something you know (eg a password)
• Something you have (eg an access card, a phone, an RSA token etc)
• Something you are (eg your fingerprint, an iris scan etc)
