
Malicious commands for Siri and other assistants can be hidden in audio clips

We constantly talk here about threats to privacy, sophisticated cyberattacks, and high-tech equipment that employs outlandish techniques to unlock other people's iPhones. But what if I told you there are far stranger ways into your digital life?

A report published today by The New York Times shows that malicious agents can send "secret" commands to digital assistants like Siri or Alexa, hiding them in music, voice clips, or noise so that they are imperceptible to the human ear. What?!

iPhone X | Placeit

The discovery is the result of research carried out by Chinese and American scientists in the fields of acoustics and artificial intelligence. It all started about two years ago, when researchers from the University of California, Berkeley and Georgetown University, both in the USA, discovered that it was possible to mask a voice command for digital assistants inside a clip of white noise (that hiss of random waves, like an analog TV with no signal).

Since then, other researchers have advanced these studies, proving that it is possible to embed such commands not only in noise clips, but also in excerpts of speech or music. The whole process is based on the different ways that the human ear and a device's microphone "hear" and recognize speech and ambient sound: by making small changes to the audio files being tampered with, the researchers are able to replace what the machines hear with something entirely different, even though, to our ears, everything sounds absolutely normal.
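To make the trick a bit more concrete, here is a minimal toy sketch in Python (assuming NumPy) of the general principle behind this kind of "adversarial" audio: a per-sample change far too small to hear can be enough to flip a model's decision. The linear "detector" below is a hypothetical stand-in for a real speech recognizer; every name and number is illustrative, not the researchers' actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

SR = 16_000                           # sample rate in Hz
x = rng.standard_normal(SR)           # one second of "white noise" audio
x /= np.abs(x).max()                  # normalize samples to [-1, 1]

w = rng.standard_normal(SR)           # toy detector: score = w.x + b
b = -(w @ x) - 1.0                    # bias chosen so the clean clip scores -1
                                      # (i.e., "no command heard")

def hears_command(signal: np.ndarray) -> bool:
    return (w @ signal + b) > 0       # positive score = "command detected"

eps = 0.002                           # max change per sample (inaudibly small)
delta = eps * np.sign(w)              # FGSM-style step: nudge every sample in
x_adv = np.clip(x + delta, -1.0, 1.0) # the direction that raises the score

print(hears_command(x))               # False: to the model this is just noise
print(hears_command(x_adv))           # True: the model now "hears" a command
print(np.abs(x_adv - x).max())        # 0.002: each sample barely changed
```

Real attacks apply the same idea against a full speech-recognition model, searching for a perturbation that makes it transcribe an arbitrary command while the clip still sounds unchanged to us.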

A team from Zhejiang University in China demonstrated the attack on video:

And what is the danger in that? Well, because the threat is totally imperceptible, it can strike whenever your device is exposed or unlocked and execute commands of all kinds, from the humorous (like making your HomePod play the infamous "gemidão do Zap" prank audio) to the outright dangerous (like telling your iPhone/iPad to access a malicious website).

Researchers have no information as to whether this type of attack has already been used in the real world for malicious purposes, but one of them, Nicholas Carlini of UC Berkeley, believes it is only a matter of time before the method starts being applied, if it isn't already.

As I always reiterate at the end of this type of article, there is no reason for us mere mortals to run for the hills. Besides the fact that the specificity of this attack makes it essentially reserved for far more valuable targets than us, basic caution makes us nearly immune to it: just don't go around playing audio clips sent out of the blue by strange contacts. Sounds easy, doesn't it?

via Cult of Mac
