Over 75% of US households will be at risk of being hacked via voice assistants by 2025

Edward G. | March 09, 2020

Recent Atlas VPN findings show that voice assistants can be hacked using ultrasonic waves that imitate voice commands. Ultrasonic waves are inaudible to humans, which means a device can be hacked without alerting its owner.

In addition, scientists can imitate voice commands using lasers. A laser triggers movements in the microphone’s diaphragm, and smart speakers interpret those movements as voice commands. Market projections also suggest that 75% of US households will have a voice-assisted speaker by 2025.

People enjoy convenience in their daily lives, which is why smart assistants are trending. Currently, you can find a smart speaker in 4 out of 10 US homes. Voice-assisted speakers took the market by storm: in 2014, only 0.5% of US households had one, compared to 40% today.


It is estimated that the smart speaker market will almost double in the next 5 years. If the popularity of smart speakers continues to grow, 3 out of 4 US households will have one by 2025.
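As a back-of-the-envelope check on that projection, the sketch below computes the annual adoption growth rate implied by moving from 40% of households today to 75% by 2025. The 40%, 75%, and 5-year figures come from the article; the compound-growth framing is an illustrative assumption, not part of the original research.

```python
# Implied annual growth rate for smart-speaker adoption to rise
# from 40% of US households to 75% within 5 years.
start_share = 0.40   # share of households with a smart speaker today
target_share = 0.75  # projected share by 2025
years = 5

# Solve start_share * (1 + g)^years = target_share for g.
annual_growth = (target_share / start_share) ** (1 / years) - 1
print(f"Implied annual adoption growth: {annual_growth:.1%}")  # ~13.4%
```

In other words, the projection holds only if adoption keeps compounding at roughly 13% per year.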

In February 2020, a team of researchers hacked voice assistants on 15 of 17 popular smartphones.

Main research findings

Firstly, ultrasound can travel through solid surfaces to activate voice assistants. The waves are able to move through metal, wood, silicone rubber, and glass. In the experiment, the actual device transmitting ultrasonic waves was under the table, completely undetectable.

Secondly, it is possible to hear the assistant’s answers and continue the interaction. To do so, the first command sent to the device turns the volume down to a minimum, so the assistant’s responses are drowned out by the noise of a busy street or a cafe.

Yet, sensitive microphones can intercept and amplify the response sound. Hence, hackers could carry out the attack while the phone owner is completely unaware. Using this approach, researchers were able to read messages, take photos, and make calls on the victim’s device.

This raises security concerns, as some banks use text-message authentication for account access. Overall, these findings open up many new attack opportunities.

The team tested both Apple (Siri) and Android (Google Assistant) devices. Here is a list of the hacked devices:


Not all devices use the same type of microphone; as a result, different frequencies are needed to activate their voice assistants. There was also no meaningful difference in vulnerability between iOS and Android devices. iOS devices are generally regarded as safer, but this method completely sidesteps those software safeguards. Researchers believe that, with some adjustments, most smart devices on the market could be hacked using this method.

However, it is possible to stop the ultrasonic waves from issuing voice commands. Something as simple as a non-woven tablecloth distorts the waves and renders them useless.

Similarly, researchers at the University of Michigan were able to hack smart devices with lasers. Light triggers movement in the microphone’s diaphragm and smart assistants interpret such movements as voice commands.

The setup does require specific and rather expensive equipment: a telescope to locate the smart speaker and aim the laser directly at it, and a laser source to issue the voice command. People who have a smart home hub should pay attention, since scientists were able to unlock a victim’s smart-lock-secured front door using this approach.


Additionally, researchers were able to use lasers to unlock and start Tesla and Ford cars. Most modern vehicles have dedicated apps for smartphones that control the car remotely. This software usually comes with voice controls.

By directing the laser at the phone and imitating voice commands, researchers unlocked and started both cars. Any vehicle with a dedicated application and voice controls is likely susceptible to the same attack.

Hey Siri, stop listening to me

The biggest issue with voice assistants is that they are listening day and night. It is true that Siri or Google Assistant only answer when you say their names. Yet, to detect when their names are mentioned, they have to listen 24/7.

Currently, hackers use less complicated and cheaper methods. But, if cybersecurity technologies advance and leave hackers empty-handed, cybercriminals will have to explore new horizons.

Edward G.

Cybersecurity Researcher and Publisher at Atlas VPN. My mission is to scan the ever-evolving cybercrime landscape to inform the public about the latest threats.



© 2024 Atlas VPN. All rights reserved.