Siri And Alexa AI Vulnerable To Ultrasonic ‘Dolphin Attack’ Hijacking Commands

It turns out there is a very real downside to the many digital assistants that were designed to make our day-to-day lives easier. Researchers at Zhejiang University discovered it is possible to seize control of Siri, Alexa, and others using a technique called DolphinAttack. By translating voice commands into ultrasonic frequencies that are inaudible to the human ear, a hacker could instruct someone's smartphone or other device with a digital assistant to perform any number of tasks, such as visiting a malicious website.
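The article doesn't detail how that translation works; in the research paper, the voice command is amplitude-modulated onto an ultrasonic carrier, and nonlinearities in the target device's microphone demodulate it back into the audible band. Below is a minimal Python sketch of that modulation step, with the sample rate, carrier frequency, and synthetic stand-in "voice" signal all chosen for illustration rather than taken from the paper.

```python
import numpy as np
from scipy.io import wavfile

# Illustrative parameters -- the paper's exact modulation settings are not given here.
FS = 192_000          # sample rate high enough to represent an ultrasonic carrier
CARRIER_HZ = 25_000   # carrier above the ~20 kHz limit of human hearing
DURATION_S = 2.0

t = np.arange(int(FS * DURATION_S)) / FS

# Stand-in for a recorded voice command: a couple of tones in the speech band.
voice = 0.4 * np.sin(2 * np.pi * 300 * t) + 0.3 * np.sin(2 * np.pi * 900 * t)

# Amplitude-modulate the "voice" onto the ultrasonic carrier. A microphone with a
# nonlinear response can demodulate this back into the audible band, which is the
# effect the researchers exploit.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
modulated = (1.0 + voice) * carrier
modulated /= np.max(np.abs(modulated))  # normalize to avoid clipping

wavfile.write("ultrasonic_command.wav", FS, (modulated * 32767).astype(np.int16))
```

Played through an ultrasonic transducer, the resulting signal would be silent to anyone standing nearby, which is what makes the attack so hard to notice.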

Apple and Amazon are not the only companies affected by this. The vulnerability that makes this possible also affects voice assistants from Google (Google Now), Microsoft (Cortana), Samsung (S Voice), and Huawei (HiVoice). Since Samsung's Bixby assistant is based on S Voice, we assume it is affected as well, though the researchers did not make that clear.

Siri
Image Source: Flickr (Kārlis Dambrāns)

Using this technique, attackers can not only activate basic commands, but also instruct an iPhone to FaceTime a contact or phone number, or a Windows PC to visit a compromised website that delivers malware in the background, to name just a couple of examples. Anything these increasingly capable digital assistants can be instructed to do can also be triggered by a third party. An attacker could even redirect the navigation system in an Audi Q3 using DolphinAttack, the research paper (PDF) states.

This is a cheap hack, too. The researchers tested an attack scenario using a Samsung Galaxy S6 Edge with less than $3 worth of parts, including an ultrasonic transducer, low-cost amplifier, and battery. With gear in hand, the researchers were successful in compromising digital assistants on a wide range of phones and gadgets, including several iPhone models, an Apple Watch, LG's Nexus 5X, an Asus Nexus 7, a MacBook, Lenovo's ThinkPad T440p, and more.

Depending on the specific gadget and situation, some attacks were only successful when performed within a few inches of the device, as was the case with the Apple Watch. In other cases, however, the attack could be carried out from several feet away.

Amazon Echo

We should also point out that not all of the potential threats are feasible. For example, using the DolphinAttack technique, an attacker could instruct an Echo device that is part of a smart home system to "open the back door." However, the attacker would already need to be in the house and near the Echo speaker to do this, at least as tested.

This is all made possible through a combination of hardware and software vulnerabilities. Both the microphones and the software that power most digital assistants can pick up inaudible frequencies, such as those above the roughly 20 kHz limit of human hearing. Redesigning the hardware to be immune to high frequencies poses challenges, but in theory, Apple and others could update their digital assistants to ignore commands above a certain frequency.
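One way to approximate that suggested fix in software would be to inspect captured audio before it ever reaches the speech recognizer. The sketch below is purely illustrative: the sample rate, cutoff, and energy threshold are assumptions, and it does not reflect how Apple, Amazon, or Google actually process microphone input.

```python
import numpy as np
from scipy.signal import butter, sosfilt

# Hypothetical guard against ultrasonic commands. All values are illustrative
# assumptions, not figures from the research paper.
FS = 192_000           # assumes the capture pipeline samples fast enough to see ultrasound
CUTOFF_HZ = 20_000     # approximate upper limit of human hearing

def looks_inaudible(samples: np.ndarray, fs: int = FS) -> bool:
    """Return True if most of the signal energy sits above ~20 kHz."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    ultrasonic = spectrum[freqs >= CUTOFF_HZ].sum()
    total = spectrum.sum() + 1e-12
    return ultrasonic / total > 0.5

def lowpass_voice_band(samples: np.ndarray, fs: int = FS) -> np.ndarray:
    """Drop content above the audible band before handing audio to the recognizer."""
    sos = butter(8, CUTOFF_HZ, btype="low", fs=fs, output="sos")
    return sosfilt(sos, samples)
```

Because the demodulation happens in the microphone hardware itself, a check like this would likely be only a partial mitigation, which is why the hardware side of the problem matters too.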

Thumbnail Image Source: Flickr (iphonedigital)