Threats from voice assistants


The time has come when the proverb "walls have ears" may no longer be a metaphor.

"The telescreen served simultaneously as a receiver and transmitter, sensitive enough to pick up any sound louder than a low whisper … Of course, there was no way of knowing whether you were being watched at any given moment" – this is how George Orwell described Big Brother's surveillance devices in his novel 1984.

What if not only Big Brother had access to the telescreen? What if anyone with the right skills could eavesdrop? And what if the screen were used not only for political propaganda but also for personalized advertising? Say you mention a headache, and at that very moment an ad for a headache remedy appears… This is no longer a dystopian trope; there is a good chance it will soon be reality.

After all, we have surrounded ourselves with prototypes of such telescreens, and their newer functions – such as voice assistants – undoubtedly pose a variety of risks.

Virtual assistants like Apple's Siri are found in smartphones, tablets and laptops, as well as in stationary devices such as the Amazon Echo and Google Home smart speakers. People use them to turn speakers on and off, check the weather forecast, change the room temperature, shop online, and so on.

Can these ever-listening microphones hurt us? Undoubtedly. The first thing that comes to mind is a leak of personal or business information. In practice, however, cybercriminals find it easier to get money another way: by luring victims into typing credit card numbers and one-time passwords into forms on phishing pages.

Smart speakers can pick up your voice even in noisy surroundings; even background music is no obstacle. Nor do you need to speak unnaturally clearly to be understood: in my experience, the Google assistant on a popular Android tablet sometimes understands three-year-olds better than their parents do.

Here are a few stories that may sound funny but should make you think. All of them involve voice assistants and smart gadgets. Science fiction writers long dreamed of devices we could talk to, but even they could not have imagined what that reality would look like.

The loudspeaker revolt

In January 2017, the San Diego, California TV channel CW6 reported on vulnerabilities in Amazon Echo speakers (which include the virtual assistant Alexa).

As the guest of the show explained, the system cannot distinguish people by voice, which means that Alexa follows the commands of everyone in the vicinity. As a result, a child can buy something online, seeing no difference between asking their parents for a snack and asking Alexa for a toy.

One of the guests said on the air: "I liked it when the girl said, 'Alexa, order me a dollhouse.'" A wave of complaints promptly followed: San Diego residents reported spontaneous dollhouse purchases made by their voice assistants. It turned out that Alexa had heard the words spoken on television, treated them as an order, and quickly carried it out.

Amazon assured victims of the "artificial intelligence rebellion" that they could cancel the orders and would not have to pay for them.

The gadgets will testify under oath

Gadgets that can listen are valuable to law enforcement, because they can often repeat what they have heard. Below is a story that took place in Arkansas in 2015.

Four men got together for a party. They watched football, drank, relaxed in the jacuzzi – nothing fancy. The next morning, the owner of the house discovered the body of one of the guests in the hot tub. He quickly became the number one suspect; the other guests said everything had been fine when they left the party.

Investigators noticed many smart devices in the house: lighting and security systems, a weather station – and an Amazon Echo. The police decided to take a chance: hoping that voice recordings would reveal what had happened on the fateful night, they asked Amazon to hand over the data, but the company reportedly refused.

According to Amazon's developers, the Echo does not record any sound until the user says the wake word, which is "Alexa" by default. The command is then stored on the company's servers for a set period of time. Amazon says it stores commands only to improve its service, and users can manually delete all recordings in their account settings.
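The wake-word gating described above can be illustrated with a minimal sketch. Everything here is a hypothetical simplification: audio is simulated as a stream of already-transcribed words, and a `<silence>` token marks the end of an utterance; a real device runs keyword spotting on raw audio, and only speech after the wake word ever leaves the device.

```python
# Toy sketch of wake-word gating: the device discards everything it
# hears until the wake word occurs, then buffers the command that
# follows until the utterance ends.

WAKE_WORD = "alexa"

def capture_commands(word_stream):
    """Return the commands captured after each occurrence of the wake word."""
    commands = []
    buffering = False
    current = []
    for word in word_stream:
        if word.lower() == WAKE_WORD:
            if buffering and current:      # wake word heard mid-command
                commands.append(" ".join(current))
            buffering = True
            current = []
        elif buffering:
            if word == "<silence>":        # end of utterance
                commands.append(" ".join(current))
                buffering = False
                current = []
            else:
                current.append(word)
        # words heard before the wake word are never stored
    if buffering and current:
        commands.append(" ".join(current))
    return commands

stream = ["private", "chat", "alexa", "order", "a", "dollhouse",
          "<silence>", "more", "chat"]
print(capture_commands(stream))  # → ['order a dollhouse']
```

Note that the gate cares only about the word, not the speaker – which is exactly why a voice from the television, or a child, can place an order.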

Detectives found other devices that could provide clues. Among the items they entered into evidence was… a smart water meter. In the early morning hours after the victim died, an unusual amount of water had been used. The owner of the house claimed he was asleep at the time. Investigators suspected the water had been used to wash away blood.

It is worth noting that the meter readings seemed inaccurate. Apart from the very high consumption in the middle of the night, the meter indicated that water use never exceeded 40 liters per hour on the day of the party – yet you cannot fill a hot tub with that amount. The accused owner gave an interview to a website run by opponents of smart meters, in which he suggested the meter's clock was set to the wrong time.

The case went to court this year.

Virtual assistants in movies
(Spoiler alert!)
Contemporary mass culture also treats virtual assistants with suspicion. In the movie Passengers, for example, the android bartender Arthur reveals Jim Preston's secret, ruining his standing with his companion Aurora. And in Why Him?, the voice assistant Justine overhears the protagonist Ned Fleming's phone calls and reveals his secret.

Car as a wiretap

Forbes has also reported some interesting cases in which electronic devices were used against their owners.

In 2001, the FBI obtained approval from a Nevada court to seek ATX Technologies' assistance in intercepting a conversation taking place in a private car. ATX Technologies develops and operates in-car systems that let owners call for help in the event of a road incident.

The company complied with the request. Unfortunately, no technical details were released beyond the FBI's requirement that the whole operation have a "minimal impact" on the quality of service provided to the suspect. Quite possibly the wiretap was carried out over the emergency link, with the car's microphone switched on remotely.

A similar story took place in Louisiana in 2007. The driver or a passenger accidentally pressed the emergency button and called the OnStar response center. The operator answered the call but, hearing no response, notified the police. When he later tried again to reach the possible victims, he overheard a conversation that appeared to be about a drug deal. The operator relayed the conversation to a police officer, along with the car's location. As a result, the police stopped the car and found marijuana inside.

The driver's lawyer tried to have the evidence thrown out, arguing that the police had no warrant. The court rejected the argument, noting that it was not the police who had initiated the wiretap. The suspect had bought the car from its previous owner a few months before the incident and probably did not know about the feature. He was ultimately found guilty.

How to stay off the air

In January, at CES 2017 in Las Vegas, almost every smart item on display – from cars to refrigerators – featured a virtual assistant. This trend is sure to bring new threats to our privacy, the security of our data, and even our physical safety.

Every developer should make user security a priority. In the meantime, here are a few tips to help users protect themselves from ears that hear everything.

  1. On the Amazon Echo and Google Home speakers, disable the microphone with the physical button. It is not the most convenient way to protect your privacy – you have to remember to mute the assistant every time – but it is something.
  2. Use your Amazon Echo account settings to block voice purchases or protect them with a password.
  3. Install antivirus protection on your computers, tablets and smartphones to reduce the risk of data leaks and attacks by cybercriminals.
  4. If someone in your home has a name that sounds like "Alexa," change the Amazon Echo's wake word. Otherwise, every conversation near the device can become a real nuisance.

This is not a one-way street

You have taped over your laptop camera, put your smartphone under a pillow and thrown out your Echo speakers. You feel free of electronic eavesdropping, but it is just an illusion: researchers at Ben-Gurion University in Israel have found that even ordinary headphones can become eavesdropping devices.

  1. Headphones and passive speakers are, in essence, microphones working in reverse: the same driver that turns an electrical signal into sound can turn sound back into an electrical signal. That means any headset connected to a computer can pick up sound.
  2. Some audio chipsets can change the function of an audio port at the software level (jack retasking). This is no secret – it is documented in motherboard specifications.
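The reversibility in point 1 comes from the physics of dynamic drivers: the same coil-and-magnet transducer converts voltage to pressure and pressure back to voltage. A toy linear model makes the point; the coefficient K below is an arbitrary made-up constant, not a real headphone spec, and real drivers are frequency-dependent and nonlinear.

```python
# Toy model of a reciprocal electroacoustic transducer: one
# coefficient links both directions of conversion. This is only an
# illustration of the principle, not an acoustic simulation.

K = 0.02  # volts per pascal (made-up illustrative value)

def playback(voltage):
    """Driver used as a speaker: voltage in, sound pressure out."""
    return voltage / K

def pickup(pressure):
    """Same driver used as a microphone: pressure in, voltage out."""
    return K * pressure

# Sound pressure reaching the driver induces a voltage proportional
# to it - which is why a headphone plugged into a jack that has been
# retasked as an input works as a crude microphone.
recovered = pickup(playback(1.0))
assert abs(recovered - 1.0) < 1e-9
```

The second point is the software half of the attack: once the output jack is re-declared as an input, the induced voltage is simply captured like any microphone signal.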

As a result, cybercriminals can turn your headphones into an eavesdropping device that secretly records audio and transmits it to their servers over the internet. The research showed that a conversation can be recorded this way from a distance of several meters, at acceptable quality – and remember that people often keep their headphones much closer, around the neck or on the desk.

To protect yourself against such an attack, use active speakers instead of headphones or passive speakers. Active speakers have a built-in amplifier between the input and the driver, which prevents a signal from traveling back to the input.