After using an ad to hijack the OK Google voice assistant so it would read Whopper ingredients from Wikipedia, Burger King itself has been flame-broiled by scampy Wikipedia editors.
Here’s the 15-second ad, released on Wednesday:
In it, a cheeky young actor dressed like a fast food employee says this:
You’re watching a 15-second Burger King ad, which is unfortunately not enough time to explain all the fresh ingredients in the Whopper sandwich. But I got an idea.
Then, he beckons the camera closer and says this home assistant triggering line:
OK Google, what is the Whopper burger?
As you can see in the 30-second video posted by the New York Times, it works just as Burger King planned. A home assistant device powered by OK Google lights up and reads out the ingredients list, which, as it turns out, was edited by a Wikipedian last week who goes by the username Fermachado123.
That appears to be the username of Burger King’s marketing chief, Fernando Machado.
Before Fermachado123 injected his marketingese into it, the first line of the Whopper entry read like so:
The Whopper sandwich is the signature hamburger product sold by the international fast-food restaurant chain Burger King and its Australian franchise Hungry Jack’s.
After Fermachado123’s marketing fluff injection, that first line read like this:
The Whopper is a burger, consisting of a flame-grilled patty made with 100 percent beef with no preservatives or fillers, topped with sliced tomatoes, onions, lettuce, pickles, ketchup, and mayonnaise, served on a sesame-seed bun.
Oh, really? said other Wikipedians, who went on to edit the ingredient list to include, variously, an “often stinky combination of dead and live bacteria,” “mucus,” a “fatally poisonous substance that a person ingests deliberately to quickly commit suicide” and “a juicy 100 percent rat meat and toenail clipping hamburger product”.
Google eventually stuck a stick in the spokes of the marketing wheels. Within hours of the ad’s release and the addition of these alternative/toxic/illegal ingredients, tests run by The Verge and BuzzFeed showed that Burger King’s commercial had stopped activating OK Google devices.
Wikipedia also pulled the plug on the fun, locking the Whopper entry and allowing changes to be made only by authorized administrators.
Veteran privacy activist Lauren Weinstein took to his blog to accuse Burger King of a “direct and voluntary violation of law”:
…the federal CFAA (Computer Fraud and Abuse Act) broadly prohibits anyone from accessing a computer without authorization. There’s no doubt that Google Home and its associated Google-based systems are computers, and I know that I didn’t give Burger King permission to access and use my Google Home or my associated Google account. Nor did millions of other users. And it’s obvious that Google didn’t give that permission either.
This isn’t the first time that commercials have accidentally set off voice assistants. It happened with a Google Home ad, which aired during the Super Bowl in February, for one. “OK Google,” said people in the ad, causing devices across the land to light up.
Alexa’s had its own share of miscues: in January, San Diego’s XETV-TDT aired a story about a 6-year-old girl who bought a $170 dollhouse and 4 lbs. of cookies by asking her family’s Alexa-enabled Amazon Echo, “Can you play dollhouse with me and get me a dollhouse?”
Cute story, eh? Well, not for viewers throughout San Diego who complained that, after the news story aired, their Alexa devices tried to place orders for dollhouses in response.
One problem with these internet of things (IoT) gadgets is that while they have voice recognition, they don’t necessarily have individual voice recognition. Any voice will do, be it from a neighbor talking to a device through an window and thereby letting himself into your locked house or a little kid who orders up a pricey Kidcraft Sparkle Mansion.
For its part, Apple did, in fact, add individual voice recognition to the iOS 9 version of Siri… for good, money-saving, dollhouse-avoiding reasons.