Naked Security

Nobody boogies quite like you

Our unique dancing style can be used by a machine-learning model to ID us, regardless of musical genre. Unless it's Metal. We all headbang.

That spasmodic jerking around that some of us refer to as “dancing”?
It’s the latest biometric: we can be identified by our twerking, our salsa, our rumba or our House moves with an impressive 94% accuracy rate, according to scientists at Finland’s University of Jyväskylä.

To be specific, the researchers asked 73 volunteers to dance to eight music styles: Blues, Country, Dance/Electronica, Jazz, Metal, Pop, Reggae and Rap. The dancers weren’t taught any steps; rather, they were simply told to “move any way that felt natural.”
Their study, described in a paper titled Dance to your own drum, was published in the Journal of New Music Research last week.
Identifying people by their dance moves is not what the researchers were after. They had set out to determine how music styles affect how we move:

Surely one does not move the same way in response to a song by Rage Against the Machine as to one by Bob Dylan – and research has indeed shown that audio features extracted from the acoustic signal of music influence the quality of dancers’ movements.

The original question: could they determine the style of music just by watching how people dance? Previous research had indicated that you can: low-frequency sound generated by kick drum and bass guitar relates to how fast you bop your head around, while high-frequency sound and beat clarity have been associated with a wider variety of movement features, including hand distance, hand speed, shoulder wiggle and hip wiggle. Dancers also increase their movements as a bass drum gets louder, and Jazz is associated with lower head speed.
It could all have to do with music’s audio features, but then again, cultural norms tell us how we’re supposed to move. Jazz? Let’s swing dance! Metal? HEADBANG!
In short, testing the idea that different music will elicit different movement patterns from listeners is complicated.
There’s already a fairly large body of work using machine learning to differentiate between musical genres. Work has also been done regarding how humans identify individuals based on their distinctive bodily movements.
Building on that previous work, University of Jyväskylä researchers set out to similarly use machine learning to explore the degree to which genre can be distinguished from volunteer dancers’ bodily movements.

They designed a 12-camera optical motion-capture system to collect free dance movement data from participants moving to commercially available music from eight different genres. They also employed a machine learning model to do two things: identify participants and music genre.
The upshot: how we move our head and limbs are our dance fingerprints, or what the paper refers to as our “motoric” fingerprints – we move in mathematically similar ways regardless of what kind of music we’re bopping to.
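The paper doesn't publish its code, but the core idea of a “motoric fingerprint” can be sketched in a few lines: compute a feature vector per dance trial, label each vector with the dancer's identity, and train a standard classifier. Everything below is an illustrative mock-up, not the authors' pipeline: the data is simulated, the feature count is arbitrary, and a linear SVM stands in for whatever model the researchers actually used.

```python
# Hypothetical sketch: identifying dancers from movement features.
# Data is simulated; in the study, features would come from
# 12-camera motion capture (head speed, hand distance, etc.).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_dancers, n_genres, n_features = 73, 8, 12  # 73 dancers x 8 genres

# Simulate a per-dancer "motoric fingerprint": each dancer's features
# cluster around their own mean, with per-genre trial noise on top.
fingerprints = rng.normal(size=(n_dancers, n_features))
X = np.repeat(fingerprints, n_genres, axis=0) + 0.3 * rng.normal(
    size=(n_dancers * n_genres, n_features))
y = np.repeat(np.arange(n_dancers), n_genres)  # label = dancer identity

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=4)
print(f"mean identification accuracy: {scores.mean():.2f}")
```

If the simulated fingerprints are well separated relative to the trial noise, the classifier recovers identity far above the 1-in-73 chance level, which is the same qualitative result the researchers report.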

In theory, different individuals’ movements may covary differently between any markers in any dimensions, but as it is highly unlikely that participants were consciously controlling these aspects of their movements, the fact that these movement features could be used to accurately classify individuals across various musical stimuli suggests that we each have our own ‘motoric fingerprint’ which is evidenced in our free dance movements, regardless of what music is playing.

But while we can be identified by how we dance to any type of music, how we dance doesn’t tell anybody what kind of music we’re listening to. Despite researchers’ expectations, their machine-learning model did a lousy job at analyzing somebody’s movements to figure out what music they were listening to. Specifically, at its best, their model’s accuracy rate was less than 25%. That’s “well below” accuracy rates for most models that classify genre from acoustic signals, according to the paper.
Once the researchers’ model had established which person danced in which way, it could subsequently identify them, based on only their dance moves, with 94% accuracy. That rate varied based on genre, though: for example, the model had a tougher time identifying individuals who were dancing to Metal. That could be because most people won’t choose to do their own, individualistic moves, the researchers suggested. Instead, they’ll adopt the stereotypical moves – like headbanging – that the Metal culture has widely adopted.
Worried that the FBI is going to start asking you to cut a rug at the airport? That’s not what we were after, according to Dr. Emily Carlson, first author of the team’s paper. What she told New Atlas:

We’re less interested in applications like surveillance than in what these results tell us about human musicality.



I was worried it was going to be the “new” biometric. Dance to unlock your phone, your car, to get into your office PC. Got a bad foot or aches? No driving, no working, and if your CC is set up for it, you dance or you don’t eat!
Ahh, but we’re safe. Nothing beats a passcode that you keep in your head and can change.
But watch out for someone stealing your dance moves, just in case, lol.


Interesting research. But any research in the “human” sciences depends on philosophical, biological, social, and cultural factors; it could be “accurate” today but not necessarily 10 or 50 years later. Just think: if a man from an isolated Amazon rainforest tribe heard Metal music, I wonder whether he would headbang. The same goes for you: if you heard Stravinsky’s The Rite of Spring, how would you “dance”?

