Naked Security

Facial recognition reunites missing boy with his dad after four years

Tech to pick faces out of the crowd can be used for good, but it's also increasingly pervasive

On a hot morning in June, Yesong, a 14-year-old with Down’s syndrome, wandered away from his father, Junxiu Wang.

It would take nearly four years, an agonized parent, his unrelenting search, and facial recognition to bring Yesong home.

As Microsoft tells it, the technology in question – Photo Missing Children (PhotoMC) – was the result of a project that came out of Microsoft’s annual worldwide employee Hackathon. The idea first occurred to Eric Zhou, a Shanghai-based senior business intelligence manager for the Digital Crimes Unit in Microsoft’s Corporate, External and Legal Affairs group in China, two years ago.

His group uses data analytics to fight cybercrime and protect vulnerable populations, including children. During the hackathon, a friend of his, Kevin Liu from the Microsoft support engineering group, created the app to help find China’s missing children, of which there are tens of thousands.

It’s easy to see why searching through photos manually is exhausting for caseworkers: as of May 2017, there were 64,000 cases on the website of Baobei Huijia (Baby Come Home), a nonprofit organization dedicated to finding missing children.

Parents of missing children, along with people who come across children who might have gone missing or been abducted, send in photos, which Baby Come Home posts. Take a look, and you’ll be able to spot at least one of the problems with making a match: some of the photos are old and faded. Many of the children will have aged considerably while they’ve been missing, as years, or even decades, go by.

In fact, founder of Baby Come Home, Baoyan Zhang, was initially skeptical when approached by Microsoft’s team:

Those children may have gone missing at the age of three or four, but they may be 20- or 30-something when we look for them. So after a few contacts, we felt it was a waste of time, and we declined some other cooperation requests after that.

Of course, we’ve known for years about Microsoft’s ongoing work on matching face images while factoring in subjects’ ages. In 2015, its age-guessing, face-recognizing tool went on a metadata-slurping, viral spree that delighted the team behind the machine-learning technology.

PhotoMC relies on Microsoft’s face recognition API to find matches for missing children. It’s a cloud-based service that scans images of faces for identifying features and determines the likelihood that two faces belong to the same person. The API analyzes 27 different facial characteristics and can identify a person across multiple photos, even at different angles (which Facebook, for one, can also do) and with varying facial expressions.

The technology is part of Microsoft Cognitive Services, a collection of tools that allow developers to add features such as emotion detection (yes, it can tell if we’re happy, angry or sad), vision and speech recognition, and language understanding to applications across devices and platforms.
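If you’re curious what such a face-verification call looks like in practice, here’s a rough sketch in Python of the detect-then-verify pattern the Face API exposes. To be clear, this isn’t PhotoMC’s code, which Microsoft hasn’t published; the SUBSCRIPTION_KEY and ENDPOINT values below are placeholders for your own Cognitive Services Face resource, and the whole thing is just an illustration of the general workflow.

```python
# Illustrative sketch only: PhotoMC's internals aren't public. This shows the
# general detect-then-verify workflow offered by the Cognitive Services Face API.
import requests

SUBSCRIPTION_KEY = "your-face-api-key"                           # placeholder
ENDPOINT = "https://your-resource.cognitiveservices.azure.com"   # placeholder


def detect_face_id(image_path):
    """Upload a local photo and return the ID of the first face detected."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/face/v1.0/detect",
            headers={
                "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()
    faces = resp.json()
    return faces[0]["faceId"] if faces else None


def verify_same_person(image_a, image_b):
    """Ask the service how likely it is that two photos show the same person."""
    face_a = detect_face_id(image_a)
    face_b = detect_face_id(image_b)
    if not (face_a and face_b):
        return None
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/verify",
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"faceId1": face_a, "faceId2": face_b},
    )
    resp.raise_for_status()
    # The response contains isIdentical (bool) and confidence (0.0 to 1.0).
    return resp.json()


if __name__ == "__main__":
    # Hypothetical file names, standing in for an old photo of a missing child
    # and a recent photo of a candidate match.
    print(verify_same_person("old_photo.jpg", "recent_photo.jpg"))
```

In a PhotoMC-style scenario, a caseworker would run the old photo of a missing child against each candidate image in a small, dedicated database and review the highest-confidence matches by hand, with something like a DNA test providing the final confirmation.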

Microsoft isn’t the only one using facial recognition when it comes to missing children: other tools include FaceFirst in the US.

In fact, Microsoft’s app isn’t even the first facial recognition technology that Baby Come Home has used. In 2013, it jointly developed a facial recognition app with advertising agency JWT China for the same purpose. According to JWT, two families were reunited in the first week following the app’s launch.

Forgive the JWT advertorial, but I’ll include it here because it puts the use of the technology into context with the emotional horror of losing a child. It also cites the reasons why a parent such as Yesong’s father, Junxiu Wang, would be frantic to find his missing child: children being sold into slave labor, for example, or turned into street beggars.

There was a happy ending, eventually, with Yesong.

Baby Come Home ran the photo of Yesong that his father provided against 13,000 images on a new government site. Within seconds, PhotoMC came up with 20 possible matches. One was a boy living in a shelter some 24 miles from where Yesong went missing. A DNA sample would confirm his father’s hopes: Yesong had been found.

That was in January 2016. Since then, Microsoft’s PhotoMC has helped to reunite four other missing boys with their families. Other possible matches are being followed up on.

Obviously, there are compelling reasons to use facial recognition in emergency situations. The last thing we’d want to do is to detract from the joy of a parent being reunited with their child through the use of this technology.

But besides its beneficial applications, facial recognition is a surveillance technology, and like any surveillance technology, it can be misused. Neema Singh Guliani, a legislative counsel with the ACLU, noted that Microsoft, to its credit, is apparently using PhotoMC in a circumspect manner: it seems to be running searches against a small category of photos, that is, photos of children in orphanages or in government databases dedicated to missing children. It’s not, apparently, running the searches against broad public databases, a practice that can invite overreach by law enforcement.

In the US, that overreach has included amassing face images of nearly half of all Americans – the vast majority of whom are innocent of any crime – in databases accessible to the FBI. It has also entailed inviting law enforcement to search Department of Motor Vehicles image databases, even in states whose laws explicitly prohibit the practice. (Singh Guliani notes that following the ACLU’s demand last week that Vermont stop, the state has instituted a moratorium on the practice.)

So: good for Microsoft for limiting the facial image searches, at least as far as we can tell. There are other issues around facial recognition worth exploring, Singh Guliani noted, such as how long image databases are retained. However, Microsoft wasn’t able to provide answers to those questions.

Yesong’s story had a happy ending. But it’s worth noting that facial recognition technology is being rolled out before privacy protections are put in place.

That’s a backwards way of doing it, says Singh Guliani.

While you can find the technology used for overwhelmingly positive purposes such as finding missing children, too often, it’s being misused for overly broad criminal searches.

It’s time to pause, take a breath, and get the privacy protections in place now.

