The New York Times has a story about the potential - and, to be fair, actual - vulnerability of various smart speaker systems.

An excerpt:

“In the past two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, researchers have been able to secretly activate the artificial intelligence (AI) systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio.”

The story goes on:

“This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list.”
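For readers curious about the underlying idea, here is a minimal, purely illustrative Python sketch of hiding one audio signal far beneath a much louder one. This is not the Berkeley researchers' actual technique - their paper crafts adversarial perturbations optimized against the speech-recognition model itself - and every number and name below is invented for illustration.

```python
# Conceptual toy only: bury a low-amplitude signal inside louder audio.
# The published attacks go much further, shaping the perturbation so the
# assistant's speech model reliably transcribes it as a specific command.
import numpy as np

SAMPLE_RATE = 16_000          # samples per second, typical for speech audio
DURATION = 2.0                # seconds of audio
t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Stand-in for "music the listener hears": a couple of audible tones.
music = 0.6 * np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 660 * t)

# Stand-in for a hidden "command" waveform (here just another tone).
command = np.sin(2 * np.pi * 1200 * t)

# Embed the command at a tiny fraction of the music's amplitude.
EMBED_LEVEL = 0.005
doctored = music + EMBED_LEVEL * command

# How far below the music the hidden signal sits, in decibels.
snr_db = 10 * np.log10(np.mean(music**2) / np.mean((EMBED_LEVEL * command)**2))
print(f"Hidden signal sits roughly {snr_db:.1f} dB below the music.")
```

The point of the toy is simply that audio can carry content a human listener would never consciously notice; the research described in the story goes further by tailoring that content so a voice assistant hears it as an instruction.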

The implications are enormous:

“Smartphones and smart speakers that use digital assistants such as Amazon’s Alexa or Apple’s Siri are set to outnumber people by 2021, according to the research firm Ovum. And more than half of all U.S. households will have at least one smart speaker by then, according to Juniper Research.”

Amazon, Google and Apple all say that security is a high and continuing priority.

You can read the entire story here.
KC's View:
I have to admit that I find this story a little alarming. Not so alarming, at least not at the moment, that I’m going to turn off all my various Alexa-powered devices; and I’m not willing to suggest that these issues are going to cause this technology to crash and burn.

I do think that whatever Apple, Amazon and Google are doing about security, they clearly have to do more.