Birdsong may be exploited to attack AI voice assistants

According to researchers at Ruhr-Universitaet Bochum in Germany, audio waves manipulated from birdsong can be used to launch attacks against voice assistants. The manipulated audio files are designed to evade detection by the human ear while confusing the deep neural networks that allow AI assistants to function. There are currently few ways to defend against this potential threat, but there is little cause for alarm, as the attack is very difficult to pull off; users can also enable security settings to prevent outside access to sensitive information.

The sound of birds chirping can be used to hack voice assistants like Alexa

According to the researchers, the manipulated audio files are part of what is called an “adversarial attack,” which is designed to confuse the deep neural networks that help artificial intelligence-powered assistants like Apple’s Siri, Google’s Assistant, and Amazon’s Alexa function.

Using the sound of birds chirping — or edited versions of songs or human speech — manipulated in a way that only the microphone on your smart speaker or smartphone can pick up, the attack bypasses detection from human ears and begins meddling with the A.I. assistant. What sounds like a bird’s song could actually be one of these attacks with hidden commands being delivered to your voice assistant of choice.
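To make the idea concrete, here is a minimal Python sketch of the core trick behind psychoacoustic hiding: shaping a payload signal so its energy stays below a crude masking threshold derived from the louder carrier audio. The threshold model, the `margin_db` parameter, and the synthetic birdsong are illustrative assumptions; the researchers' actual attack builds on a far more detailed model of human hearing.

```python
import numpy as np
from scipy.signal import stft, istft

def psychoacoustic_embed(carrier, perturbation, fs=16000, margin_db=12.0):
    """Shape `perturbation` so its spectrum stays below a crude masking
    threshold derived from `carrier`, then mix the two signals.

    Toy illustration only: real psychoacoustic models are far more elaborate.
    """
    # Short-time spectra of both signals (same STFT parameters).
    _, _, C = stft(carrier, fs=fs, nperseg=512)
    _, _, P = stft(perturbation, fs=fs, nperseg=512)

    # Crude masking threshold: carrier magnitude minus a safety margin.
    # Loud time/frequency cells in the carrier can "hide" more energy.
    threshold = np.abs(C) * 10 ** (-margin_db / 20)

    # Scale down each perturbation cell that exceeds the threshold.
    mag_P = np.abs(P)
    scale = np.minimum(1.0, threshold / np.maximum(mag_P, 1e-12))
    P_shaped = P * scale

    # Back to the time domain, then mix with the carrier.
    _, p_shaped = istft(P_shaped, fs=fs, nperseg=512)
    n = min(len(carrier), len(p_shaped))
    return carrier[:n] + p_shaped[:n]

# Toy usage: hide noise (a stand-in for an adversarial signal) in "birdsong".
fs = 16000
t = np.arange(fs) / fs
birdsong = 0.5 * np.sin(2 * np.pi * 4000 * t) * (1 + 0.5 * np.sin(2 * np.pi * 6 * t))
payload = 0.3 * np.random.randn(fs)
mixed = psychoacoustic_embed(birdsong, payload, fs=fs)
```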

The researchers suggest the attacks, which use psychoacoustic hiding to mask their true intentions, could be played via an app or hidden in another type of broadcast. For instance, the sound could be hidden in a commercial that plays on TV or the radio to hit thousands of targets at once.

“[In] a worst-case scenario, an attacker may be able to take over the entire smart home system, including security cameras or alarm systems,” the researchers wrote, per Fast Company. They also put together a demonstration to show how such an attack could be used to deactivate a security camera.

There is a catch to this theoretical attack: the researchers have not launched it through a broadcast yet. Instead, they have fed the doctored files that contain the hidden audio command directly into the assistants so they hear the message clearly. However, the scientists are confident that the attack could be carried out through another means. “In general, it is possible to hide any transcription in any audio file with a success rate of nearly 100 percent,” they concluded in their paper.
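That near-100-percent claim rests on gradient-based optimization: the attacker iteratively nudges the audio until the recognizer's output matches a chosen transcription. The sketch below shows the general shape of such an optimization using a deliberately tiny stand-in model and CTC loss; the model, vocabulary, target indices, and the plain L2 quietness penalty are all hypothetical simplifications, not the researchers' pipeline, which differentiates through a real speech-recognition system and constrains the perturbation with a psychoacoustic masking threshold.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for an ASR acoustic model (NOT the real target system).
# It maps 256-sample audio frames to per-frame character logits; a real
# attack would differentiate through an actual recognition pipeline.
VOCAB = 28                     # blank + 26 letters + space (hypothetical)
FRAME = 256
model = nn.Linear(FRAME, VOCAB)
for p in model.parameters():
    p.requires_grad_(False)    # the attack optimizes the audio, not the model

def frame_logits(wave):
    """Chop the waveform into frames and score each with the toy model."""
    frames = wave[: (wave.numel() // FRAME) * FRAME].reshape(-1, FRAME)
    return model(frames)       # shape: (num_frames, VOCAB)

# Carrier audio (stand-in for a birdsong clip) and the hidden target text.
carrier = 0.5 * torch.sin(torch.linspace(0, 800 * 3.14, 16000))
target = torch.tensor([[4, 15, 5, 14]])   # hypothetical class indices

# Adversarial perturbation, refined by gradient descent.
delta = torch.zeros_like(carrier, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)
ctc = nn.CTCLoss(blank=0)

for step in range(300):
    logits = frame_logits(carrier + delta)
    log_probs = logits.log_softmax(dim=-1).unsqueeze(1)   # (T, N=1, VOCAB)
    loss_target = ctc(
        log_probs,
        target,
        input_lengths=torch.tensor([log_probs.shape[0]]),
        target_lengths=torch.tensor([target.shape[1]]),
    )
    # A plain L2 penalty keeps the perturbation quiet; the paper instead
    # bounds it with a psychoacoustic masking threshold.
    loss = loss_target + 10.0 * delta.pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

adversarial_audio = (carrier + delta).detach()
```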

There aren’t many ways to defend against such an attack, which, to be clear, would take a fair amount of effort to execute and is unlikely to happen even though it is possible. In the meantime, use the security settings on your voice assistant of choice to prevent access to sensitive information. On Alexa, for example, you can require a PIN before completing a purchase.

Original article: https://www.digitaltrends.com/home/voice-assistants-hacked-adverserial-attack-birds-chirping-alexa/

