Recently, researchers at AVG investigated how easily Siri could be fooled by voices other than the device owner’s. In addition to using human voices, they also used synthesized voice commands, and found that Siri and the virtual assistants available for Android phones do not discriminate between commands from their owner and commands issued by others. They said:
However, these voice recognition technologies – that are so necessary on smart devices – are perhaps not as secure as we give them credit for. After all, they are not configured to our individual voices. Anyone can ask your Google Now to make a call or send a text message and it will dutifully oblige – even if it’s not your voice asking.
What if your device is vulnerable to voice commands from someone else? What if it could call a premium number, send a text message abroad, or write an email from your account without your knowledge? Over-the-air attacks on voice recognition technologies are real, and they are not limited to smartphones. Voice activation technologies are also coming to smart connected devices at home, like your smart TV.
It would be a good idea if the companies introducing these voice command technologies took the security of this feature into account, and gave us a way to lock voice commands to the sound of our own voice, not just any voice.
For more information, or a video demonstrating these issues, please click through to the AVG blog.