There’s a new contestant in the competition to see what kind of Hell-spawn, technologically enhanced doll can freak us out the most.
As New Scientist reports, this one’s fitted with a camera and an artificial intelligence (AI) chip that can interpret children’s emotions. Eight of them, to be precise, including surprise and happiness, which it gleans from images captured by the camera in the doll’s head.
The doll is just one of a host of devices equipped with computer vision that are described in a paper titled Eyes of Things from a team at the University of Castilla-La Mancha, in Ciudad Real, Spain.
New Scientist quotes project leader Oscar Deniz:
In the near future, we will see a myriad of eyes everywhere that will not just be watching us, but trying to help us. [As AI chips get cheaper], we will have wearable devices, toys, drones, small robots, and things we can’t even imagine yet that will all have basic artificial intelligence.
The paper describes a new computer vision platform called Eyes of Things that could enable new applications and technologies such as deep learning, drones, home robotics, intelligent surveillance, wearable cameras, and yes, intelligent toys.
The emotion-reading doll described in the paper differs from its progenitor Hell-spawns in that it doesn’t need to send data off for processing in the cloud, where children’s privacy comes into play and where breaches threaten to expose things like their data and voices.
We saw that happen with CloudPets teddy toys around Christmas, with all user accounts and potentially up to 2.2m voice messages compromised by hackers who found the data, unprotected, using nothing more complicated than the Shodan IoT search engine.
A doll called Cayla suffered from noxious cloud syndrome, too: for one thing, it had a software vulnerability that allowed Cayla to be programmed to say anything – from Hannibal Lecter quotes to lines from 50 Shades Of Grey. In addition, according to security researcher Ken Munro, any device could connect with the doll via Bluetooth and therefore communicate with your child.
Good times, good times. Germany’s Bundesnetzagentur, the telecoms watchdog, called Cayla an “illegal espionage apparatus” that parents should destroy.
Other dolls that have raised privacy concerns include the internet-enabled, speech recognizing, joke-telling Barbie.
“Hello Barbie,” it was dubbed. That was followed by “Hell No Barbie”: the social media campaign that called Hello Barbie an “eavesdropping doll” that raised privacy concerns because recordings of children’s conversations are stored by the company – ToyTalk – that makes the voice recognition technology.
The emotion-reading doll would focus on data from cameras, rather than microphones. But as New Scientist notes, many of the issues around data privacy would be the same.
Just because emotions aren’t being analyzed in the cloud doesn’t mean that the relevant data couldn’t wind up being intercepted. The team’s paper doesn’t go into detail about how such a doll would work – they came up with a chip and put it into a doll, but that doll apparently hasn’t been turned into a retail toy yet – but it does mention low-power wireless technologies such as Bluetooth.
Bluetooth, as in, the way that you could pwn Cayla and get her to talk about a nice Chianti and fava beans.
The paper also mentions a series of API-based activities, including examples such as:
IF Angry face from EoT-1 email to my_address@my_company.com.
Oh? Meaning that the app/doll/platform/whatever device equipped with Eyes of Things computer vision chips can be set to recognize an angry face and trigger an email about it? Interesting!
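That kind of trigger rule is easy to picture in code. Here’s a minimal sketch of how an “IF angry face from device X, then alert” rule might be wired up; the `Rule` class, `dispatch` function, and the stand-in email action are all illustrative assumptions, not the actual Eyes of Things API described in the paper.

```python
# Hypothetical sketch of an Eyes-of-Things-style trigger rule, modeled on the
# paper's "IF Angry face from EoT-1 email ..." example. Nothing here is the
# real EoT API; it's an illustration of the pattern.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    device_id: str                  # which camera/doll the event must come from
    emotion: str                    # emotion label that fires the rule
    action: Callable[[str], None]   # what to do when it fires

def dispatch(rules: List[Rule], device_id: str, emotion: str) -> int:
    """Run every rule matching this (device, emotion) event; return how many fired."""
    fired = 0
    for rule in rules:
        if rule.device_id == device_id and rule.emotion == emotion:
            rule.action(f"{rule.emotion} face detected on {rule.device_id}")
            fired += 1
    return fired

# A print() stands in for actually sending an email to an address.
rules = [Rule("EoT-1", "angry", lambda msg: print(f"email alert: {msg}"))]

dispatch(rules, "EoT-1", "angry")   # matching rule fires
dispatch(rules, "EoT-1", "happy")   # no rule for this emotion; nothing fires
```

The point of the pattern is that the emotion classification happens on the device’s AI chip, and only the resulting event (not the camera feed) would need to leave the doll.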
The emotion-recognizing doll isn’t on shelves yet. It’s just a glint in the eye of its AI-programming parents at this point. Let’s hope that if and when it gets turned into a Christmas best-seller, all potential privacy and security issues have been ironed out beforehand.
We do not need another Seed of Chucky or Twin Sister of Cayla!