Hot Topics

Burger King Ad Creates Whopper of a Mess for Google Home

Burger King on Wednesday in essence hijacked the voice-activated Google Home speakers in some consumers’ homes.

In a 15-second television ad, the camera zooms in on a young man wearing the company uniform who says, “OK Google, what is the Whopper burger?”

The “OK Google” trigger phrase for Google’s artificial intelligence Assistant activated Google Home speakers situated nearby, prompting them to read Wikipedia’s description of the Whopper.

Burger King apparently intended to prompt the Assistant to deliver the glowing description posted on the page. However, the Internet quickly caught on to the gimmick, and Wikipedia’s Whopper page was deluged with newly edited versions, many of them decidedly uncomplimentary. Wikipedia soon blocked the editing functionality.

Within three hours or so, Google reportedly issued a server-side update to Google Home to stop it from responding to the ad. The ad would still wake up a Google Home device, which would wait for its query to hit Google’s servers, but Home no longer would respond to it. However, Google Home would respond to a real person making the same query.

The ad apparently was created by David The Agency.

Wikipedia’s Whopper Page Gone Wild

Someone with the username “Fermachado123” last week changed the Wikipedia entry for the Whopper to list its ingredients, according to reports. The owner of the handle may be Fernando Machado, Burger King’s senior vice president for global brand management, although Burger King apparently hasn’t confirmed or denied his involvement.

“Editing an article on behalf of one’s employer or company can create a conflict of interest and violate Wikipedia policies,” Wikimedia spokesperson Samantha Lien told the E-Commerce Times.

Wikipedia content and entries are determined by a community of volunteer editors.

Internet trolls struck minutes after the ad debuted at 12:00 p.m. ET, editing the Wikipedia entry to describe the burger variously as “cancer-causing” or “a chocolate candy”, and altering the ingredients list to include such items as “toenail clippings,” “medium-sized child,” and “rat.”

Google “could, and likely should, require people to customize the command phrase,” suggested Rob Enderle, principal analyst at the Enderle Group.

“The idea that a TV ad could generate a mass purchase should scare them more than it does,” he told the E-Commerce Times.

Following in Alexa’s Footsteps

“I’m kind of surprised they used Google Home rather than the far more prevalent Amazon Echo,” Enderle said.

A 6-year-old Dallas girl earlier this year asked the Amazon Echo Dot, which is powered by Alexa, if it could get her a dollhouse. Alexa was happy to oblige, and the child confirmed the order. She apparently also ordered cookies. A US$160 dollhouse and four pounds of cookies showed up at her home days later. Her mom laughed off the mishap and treated it as a reminder to set up parental controls.

However, Jim Patton, a news anchor at San Diego TV station CW6, several days later said, “I love the little girl saying ‘Alexa order me a dollhouse,'” during a newscast. His remark reportedly triggered numerous Echo devices in viewers’ homes to attempt to order dollhouses.

“We need more variance and better vocal security and recognition, particularly when we begin looping in security systems and locks, or they’ll unintentionally allow bad folks into our homes,” Enderle warned. “That could lead to a massive potential liability exposure for the related products, services, or companies that supply them.”

Consumers should “be careful what applications they use and what’s active when they use them,” cautioned Michael Jude, a program manager at Stratecast/Frost & Sullivan.

Smart Technology Risks

Technologies like Google Home and Alexa “have no innate judgment,” Jude told the E-Commerce Times. “You shouldn’t trust them to use judgment on which commands to respond to or what activities to launch.”

Google and others will need to focus on the applications behind the voice recognition systems, he suggested.

Natural language processing “doesn’t imply any real intelligence behind the interface,” Jude explained. “As the applications’ NLP system front ends become more intelligent, the opportunities for compromise decrease.”

In the Internet of Things environment, where you can have “an ecosystem or ecosystems of ecosystems interconnected, the attack vector universe is potentially limitless,” noted Laura DiDio, research director for IoT at 451 Research.

The risks are “everywhere, and what you can do is mitigate risk to an acceptable level,” she told the E-Commerce Times — but that requires vendors to make secure products.

Richard Adhikari

Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including Information Week and Computerworld. He is the author of two books on client/server technology. Email Richard.

Leave a Comment

Please sign in to post or reply to a comment. New users create a free account.

Related Stories
More by Richard Adhikari
More in Hot Topics

E-Commerce Times Channels