Christmas is coming and many of us will no doubt be looking forward with excitement to the array of new tech our stockings are set to be filled with. According to numerous media reports, smart speakers are reckoned to be the hottest Christmas gift of 2017. The likes of Amazon Echo and Google Home, and the forthcoming Apple HomePod, certainly seem to be the next must-have device.
Working via a simple voice command and springing to life on hearing their personal ‘wake word’, they can do everything from shuffling a music playlist and providing the latest weather forecast through to ordering pretty much all you could wish for and keeping you on track with updates from your personal calendar. Synced with other devices, they will also control your home heating and security and switch your lights on and off.
All good. However, if you’ve got one of these voice assistants on your Christmas wish list, or indeed already have one, then there’s something you need to know.
A report in the Independent has revealed that popular voice assistants including Alexa and Siri are easy to hack due to huge design flaws.
The report reveals that researchers were able to take over seven different voice recognition systems on various gadgets including iPhones, Samsung Galaxy handsets and Windows 10 computers. The research also revealed that voice assistants can be triggered by voice commands that are inaudible to humans.
A total of 16 different devices were found to be vulnerable; however, the researchers stated that their list was “by far not comprehensive”.
Whilst an attacker would need to be close to the target device, the researchers proved that it is possible to take over a voice assistant without ever touching it.
An ultrasonic transducer (a device that sends and receives ultrasonic sound through the air) together with an amplifier was used to convert regular voice commands into ultrasound: something that cannot be heard by humans. In doing so, the researchers were not only able to activate the voice assistants, they were also able to give them commands.
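To illustrate the principle, the modulation step can be sketched in a few lines of Python. This is a simplified, hypothetical reconstruction, not the researchers' actual code: it amplitude-modulates a signal onto an assumed 25 kHz carrier, above the ~20 kHz limit of human hearing. (In the real attack, the nonlinearity of the device's microphone demodulates the envelope back into the audible band, so the assistant "hears" the command while a person nearby hears nothing.)

```python
import numpy as np

FS = 192_000         # sample rate high enough to represent an ultrasonic carrier
CARRIER_HZ = 25_000  # assumed carrier frequency, above human hearing (~20 kHz)

def to_ultrasound(voice: np.ndarray, fs: int = FS, carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier."""
    t = np.arange(len(voice)) / fs
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # Standard AM: offset the envelope so it stays non-negative, then scale.
    return (1.0 + voice) * carrier / 2.0

# Toy stand-in for a spoken command: a 400 Hz tone, one second long.
t = np.arange(FS) / FS
voice = 0.5 * np.sin(2 * np.pi * 400 * t)
modulated = to_ultrasound(voice)

# A quick FFT check confirms the energy now sits near the carrier,
# well outside the audible band.
spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(len(modulated), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # dominant frequency sits at the 25 kHz carrier
```

The key point the sketch makes is that the modulated signal contains no energy a human ear can detect, yet still carries the original command as its envelope.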
“By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to the airplane mode, and even manipulating the navigation system in an Audi automobile,” the researchers said.
The seriousness of these findings cannot be over-emphasised. Access of this nature could allow an attacker to open a malicious website; launch a phone or video call for spying purposes; create and spread spam emails, social posts, events and text messages; and disconnect wireless communications.
Furthermore, with devices like the Amazon Echo, which can be connected into a smart home set-up, there is even the risk that attackers could open a victim’s door to let intruders inside. This is not completely straightforward, however, as such actions require a PIN and the command must come from no more than 165cm away from the device.
There are ways to protect yourself from voice assistant security risks. If you are using Siri or the Google Assistant, all you need to do is switch off the always-on setting. For the Amazon Echo, just hit the mute button. However, you will of course find that waking your voice assistant is no longer just a case of using its wake word.
It is advisable to switch voice assistant microphones off when you are not at home, and most definitely when you are away for extended periods. Better still, unplug the device and secure it in a safe or locked cabinet whenever you leave your home unoccupied for longer than your working day.
If you are in any way concerned about the security of your modern home technology and smart devices, talk to the experts at IQ in IT. We provide specialist assistance to businesses and individuals seeking to protect their data and safeguard what matters to them.