Amazon Gives Convoluted Explanation For Alexa Recording Then Forwarding Private Conversation

Earlier this week we talked a bit about an issue with an Amazon Echo device that recorded an audio clip of a woman's conversation and then sent the clip to a random contact on her contacts list. The woman was rightfully fearful of the Alexa-powered device after that incident, vowing to never use it again and demanding a refund. Amazon confirmed early on, according to reports, that it had gone through her logs and verified that the incident did, in fact, occur just as she claimed.

Amazon has now reached out to HotHardware and other tech publications with an update on the situation and a rather convoluted explanation for how the incident took place. Amazon said, "Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."
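To make that chain of misfires a bit more concrete, here's a minimal sketch in Python of the four-stage dialog flow Amazon describes. It is emphatically not Amazon's actual code; every function, threshold, and contact name below is hypothetical. The point it illustrates is that each stage only needs to mishear one word for the whole flow to complete.

```python
# A hypothetical sketch of the dialog flow Amazon describes. The string
# similarity check stands in for acoustic confidence; each stage accepts
# loose matches, so one misheard word per stage is enough to send a message.
from difflib import SequenceMatcher

CONTACTS = ["Shane", "Sharon", "Alex"]  # hypothetical contact list


def similarity(heard: str, target: str) -> float:
    """Crude string similarity standing in for acoustic confidence."""
    return SequenceMatcher(None, heard.lower(), target.lower()).ratio()


def run_dialog(background_audio):
    """Walk the four stages Amazon describes against overheard speech."""
    audio = iter(background_audio)

    # Stage 1: wake word -- anything close enough to "Alexa" wakes the device.
    if similarity(next(audio), "Alexa") < 0.6:
        return

    # Stage 2: intent -- a phrase resembling "send message" starts the flow.
    if similarity(next(audio), "send message") < 0.6:
        return
    print('Alexa: "To whom?"')

    # Stage 3: slot fill -- the next utterance is matched against contacts.
    heard_name = next(audio)
    contact = max(CONTACTS, key=lambda c: similarity(heard_name, c))
    print(f'Alexa: "{contact}, right?"')

    # Stage 4: confirmation -- anything resembling "right" confirms the send.
    if similarity(next(audio), "right") >= 0.6:
        print(f"--> Message recorded and sent to {contact}")


# Background chatter that happens to clear every stage:
run_dialog(["Alexsa", "send a message", "Shane", "right"])
```

Run as-is, the stray words clear every stage and the "message" goes out, which is essentially the failure mode Amazon is describing.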

Amazon's Echo devices, with Alexa inside, certainly aren't the only voice-activated AI assistants on the market that can mishear things said around them and activate themselves with bizarre results. A perfect example is my personal iPhone SE. I have Siri enabled with the "Hey, Siri" wake-up command. I had the phone on the table in a work meeting last week when someone across the table loudly said "Hey, Shane," to which Siri promptly dinged and replied, "I'm here."

Siri and Shane don't sound much alike after the first hard "S" sound, yet the AI assistant was ready to do whatever it heard next. Siri has also been activated by things said on the radio, or while I'm having conversations on my work phone in the car with my personal phone in a pocket or lying in the console. Siri sometimes tries to search for random bits of those conversations, and the first indication that something strange is going on is hearing Siri tell me she can't find any info on that. It's not clear how to stop the false activations and errant commands that AI voice assistants are prone to, short of making a physical button press the activation method. Doing that, however, would make these personal assistants much less convenient, as hands-free operation is very much a part of the experience.
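One possible middle ground, and purely a hypothetical sketch rather than anything Amazon or Apple has announced, is to keep hands-free wake-up but require an exact, deliberate confirmation phrase before anything destructive like sending a recording actually happens, instead of fuzzy-matching background speech:

```python
# A hypothetical mitigation sketch: only an exact match on a deliberate
# phrase confirms a send, so a stray "right" overheard in background
# chatter no longer counts as consent.
REQUIRED_CONFIRMATION = "yes, send it"  # hypothetical confirmation phrase


def confirm_send(heard: str) -> bool:
    """Return True only for an exact, deliberate confirmation."""
    return heard.strip().lower() == REQUIRED_CONFIRMATION


print(confirm_send("right"))         # False -- overheard word is rejected
print(confirm_send("Yes, send it"))  # True  -- explicit confirmation
```

That trades a little convenience on rare, high-stakes actions for a lot of safety, while leaving everyday queries completely hands-free.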