
How to ensure the privacy of your smart speaker if you can't get rid of it

Almost every modern technology has vulnerabilities, but that doesn't mean you can't use it at all; it just means you have to make it secure.

Published: August 2, 2021 By Hamna Imran


With the Internet of Things came its by-products: smart cities made up of smart homes, which in turn are full of smart devices. Smart speakers are one of them.

Recent surveys show that 91 percent of smart speaker users are concerned about eavesdropping by their speakers, and 90 percent of owners are concerned about data being collected without their knowledge. Even so, 80 percent of consumers have used voice command technology on some kind of device (smartphones, tablets, computers, remote controls, smart speakers, and in-car systems).

These always-listening devices come with risks. If you are a user who does not want to trade privacy for the convenience of a smart device, keep reading.

 

Internet-enabled devices: Smart speakers

Voice assistants are software designed for a hands-free experience; they rely on analyzing recorded speech. A wake word has to be spoken to activate them. Once it is recognized, the voice assistant records and processes subsequent commands. Voice assistants can be integrated into multi-functional devices such as smartphones or smart speakers.

These include Google Assistant, Amazon's Alexa, Microsoft's Cortana, and Samsung's Bixby.

These smart speakers make a variety of tasks easier, from controlling other smart appliances to ordering food online to providing information. But as IoT devices, they ship with minimal built-in security. Although they are more secure than before, new vulnerabilities continue to emerge, and many security features are still optional. Without strict regulatory requirements, protecting these devices has been left mainly to the user.


Major security concern: Hacking

A smart speaker network may be a great source of convenience, but it comes with considerable vulnerabilities. The more devices connected to the speaker, the greater the chance of one of them being hacked. Consequently, smart speaker owners face the threat that an unknown third party could access their smart home as a result of a breach.

Although the probability of malicious actors gaining access to a smart speaker is low, there have been reports of private recordings accidentally being shared with strangers. As smart speakers become more tightly integrated with other smart home devices (including doorbells and locks), the potential impact of an intrusion will only grow.

One intrusion method researchers have demonstrated is a hack via laser beam.

In 2019, researchers demonstrated that hackers could control these devices from 360 feet (110 meters) away with a laser pointer. That matters all the more now that so many people work from home because of COVID-19.

Hacking an Amazon Echo with a laser through a window 110 meters away

 

Image source - sciencealert.com

The team hacked devices through windows up to 110 meters away using an inexpensive off-the-shelf kit. The technique exploits the micro-electro-mechanical systems (MEMS) microphones built into smart speakers: their diaphragms register modulated light as sound, which means they can be driven by something as simple as a laser pointer. Aiming a precisely modulated laser at the speaker can simulate the wake word and inject spoken commands.
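Conceptually, the injection step is ordinary amplitude modulation: the command's audio waveform is turned into a laser-intensity envelope, which the MEMS diaphragm registers as sound pressure. The toy sketch below (with arbitrary placeholder values for the bias, depth, and test tone; not a working exploit) illustrates the signal-processing idea:

```python
import math

SAMPLE_RATE = 44_100  # audio sample rate in Hz (placeholder value)

def audio_to_laser_intensity(audio, bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] to a laser-intensity envelope in (0, 1).

    A laser cannot emit "negative light", so the audio rides on a DC bias --
    plain amplitude modulation, which the MEMS diaphragm picks up as sound.
    The bias and depth here are arbitrary illustration values.
    """
    peak = max(abs(s) for s in audio) or 1.0      # avoid division by zero
    return [bias + depth * (s / peak) for s in audio]

# A 440 Hz test tone standing in for a spoken command (hypothetical stand-in)
tone = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]
intensity = audio_to_laser_intensity(tone)
```

The key point is only that the light intensity tracks the audio waveform; the real attack additionally needs precise aiming optics and a laser driver.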

When asked for comment, Amazon and Google said they were investigating. By all accounts, this is not yet a practical real-world attack, but it is worth keeping in mind.


Laser powers needed to hack different smart speakers

  Device             Voice Recognition System    Minimum Laser Power at 30 cm [mW]
  Google Home        Google Assistant            0.5
  Google Home Mini   Google Assistant            16
  Echo               Amazon Alexa                25
  iPhone XR          Siri                        21

Another attack proven in practice is the surfing attack.

A surfing attack targets voice assistants using ultrasonic guided waves. It was developed by a team of researchers from Michigan State University, the University of Nebraska-Lincoln, Washington University in St. Louis, and the Chinese Academy of Sciences.

They showed how attackers can use inaudible voice commands to unlock a victim's front door protected by a smart lock, and even to locate, unlock, and start several vehicles. The researchers found that these waves can propagate through many solid surfaces to activate voice recognition systems, and that the attacker can also hear the phone's responses.

A surfing attack leverages an ultrasonic guided wave in the table, generated by an ultrasonic transducer concealed beneath it

Image source - Research paper
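The modulation idea behind the attack can be sketched in the same toy fashion: the command audio is amplitude-modulated onto an ultrasonic carrier, so the injected signal is inaudible to humans while the microphone hardware still demodulates it. The carrier frequency, sample rate, and modulation depth below are assumed placeholder values, not the paper's actual parameters:

```python
import math

SAMPLE_RATE = 96_000   # high enough to represent the carrier (placeholder)
CARRIER_HZ = 25_000    # assumed ultrasonic carrier, above human hearing

def modulate_ultrasonic(audio, depth=0.8):
    """Amplitude-modulate a baseband audio signal (values in [-1, 1]) onto an
    ultrasonic carrier: the audible command becomes an inaudible envelope."""
    return [
        (1 + depth * s) * math.sin(2 * math.pi * CARRIER_HZ * n / SAMPLE_RATE)
        for n, s in enumerate(audio)
    ]

# A 300 Hz tone standing in for a spoken command (hypothetical stand-in)
command = [math.sin(2 * math.pi * 300 * n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]
ultrasonic = modulate_ultrasonic(command)
```

The real attack also has to couple this signal into the tabletop as a guided wave; the sketch covers only the "make it inaudible" half of the trick.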


Alexa, Google, Apple: which is safe?

When it comes to determining whether smart speakers and voice assistants are safe, the answer is more complicated than yes or no. It really depends on how well you know the product and how safe you make it. In early 2020, a bug caused Google Home devices to record users constantly without warning. It was an unintended error that Google later corrected, but it unsettled smart speaker users. And even if your smart speaker only records when you want it to, you cannot be sure where that data ends up.

In 2019, a video doorbell company owned by Amazon allowed law enforcement agencies to access users' footage, a troubling precedent for what Amazon might do with what you say to an Echo. The change that makes recordings easier to delete came in response to criticism of Amazon and other smart speaker manufacturers, criticism that intensified over data collected from children under 13 who use the Echo Dot Kids Edition. Some lawmakers are acting as well.

Many smart speaker owners have no idea that these companies keep copies of what their devices record after hearing the wake word. Assistants like Apple's Siri, Google Assistant, and Amazon Alexa save recordings by default to help train their AI.


Smart speaker privacy concerns

A research paper on the privacy attitudes of smart speaker users, published by Malkin, Deatrick, and colleagues in 2019, asked all participants: In the past, have you had any privacy concerns about your device? Most participants (71.7%) said they had no concerns about their smart speaker. As with many privacy-focused surveys, a common refrain was, "I am not a person that really ever has privacy concerns. I have nothing to hide and nothing worth stealing."

Among the 28.3% of participants who said they had experienced privacy concerns, these were frequently caused by accidental activation: "There were times when the speaker would activate without me saying the wake word. This was a bit odd and it did leave me a bit uneasy."

Recording storage

Users' main concerns relate to uncertainty about how data is collected, where it is stored, and how it is processed. The primary goal of this data collection is obviously to execute users' commands. Beyond that, the collected audio samples can be analyzed to identify situations where the smart speaker failed to meet a user's needs and to improve the service.

In a survey of 116 owners of Amazon and Google smart speakers, nearly half were unaware that their recordings were being saved, only about a quarter said they reviewed their interactions, and only a few had ever deleted any.

Human reviewing

There is always someone pulling a lever behind the curtain, and sometimes a human will even listen to your request for a weather report, or try to figure out what you meant when you asked to add mangosteen to your shopping list.

Amazon employs thousands of workers, including some outside the United States, to transcribe and annotate Alexa requests in order to improve the voice assistant's performance. The company claims this information is anonymized and encrypted, but a Bloomberg article stated that the recordings analyzed by Amazon workers may include names, device serial numbers, and bank account numbers.

It is unclear whether all this information is linked in a way that lets workers identify users or family members. Amazon Alexa does give users the option to prevent human reviewers from listening to their recordings: go to the Privacy section of the settings menu and opt out of having your input used to develop Alexa. And Amazon is not the only company doing this.

According to Belgian news reports, Google's subcontractors are also listening: transcripts of Google Assistant voice recordings were leaked, many of them captured accidentally and containing sensitive personal information. Google clarified that it has "language experts" analyze about 0.2% of recordings, and that these recordings are not linked to users and are essentially anonymous. Like Amazon and Google, Apple has hired workers to make Siri work better, and recordings can be kept for up to two years.

However, the company stated that all transmitted data is de-identified and therefore "cannot be traced back to you or your Apple ID." Apple recently said it was "suspending" its use of human "testers" while it reviews the process.

Having someone listen to you may feel intrusive, but you likely agreed to it in the app's terms of use (which nobody reads) when you first set up your device.

Data hoarding

The hidden benefit smart speaker companies gain from recording users' commands is insight into users' lives that can be used for both advertising and profiling. These insights include users' current search interests, grocery lists, shopping activities, and routines, and the recorded samples reveal further details about users' context.

Indeed, a few users had the impression that online advertisements they saw were based on private conversations they had at home, suggesting that providers use the recordings to deliver more personalized ads. What is actually done with the recordings remains uncertain to many users.


Smart speaker tips

Checking Privacy and Security settings

Existing Alexa and Google Home devices include some privacy controls. The same survey examined how people try to protect their privacy and whether they take advantage of those controls, asking: In the past, did you take any steps to protect your privacy when using your device?

Only 18.6% of respondents described taking any steps to limit what their devices collect.

Check the security and privacy settings of your speakers and explore the app's features; Google, Amazon, and Apple devices each have their own security settings.

Delete commands

Delete your command history so commands are not stored locally or in the cloud. Recordings help the speaker recognize your voice better, but privacy matters more than the occasional "misunderstanding". Most services let you delete individual commands, commands from a selected period of time, or the entire history at once.

Deleting commands on Amazon Alexa

Amazon launched a feature that lets users delete the day's commands by saying, "Alexa, delete everything I said today." But as Gizmodo points out, the feature is not as simple as it sounds: firstly, you can only delete recordings from the current day; secondly, for anything else you will have to dig into the settings.

Step 1: Delete your voice history. This is necessary to keep your past conversations private.

  • Open the Alexa app on your phone.
  • Go to the menu on the top left of your screen.
  • Select Settings, then Alexa Account, and then Alexa Privacy.
  • Log into your account if you’re not already logged in.
  • Select Review Voice History.
  • Select one of the options under Date Range.
  • Mark the recorded interactions you want to delete.
  • Select Delete All Recordings.

Step 2: Prevent Amazon from using your data to improve Alexa. This is essential to prevent future eavesdropping: news reports revealed that Amazon hires many people to listen to a random sample of conversations recorded by its smart speakers for research and product development. You can opt out by following these steps.

  • Click Menu in top left.
  • On the Alexa Privacy options page, select Manage How Your Data Improves Alexa.
  • Toggle off Help Develop New Features.
  • Toggle off all names under Use Messages to Improve Transcriptions.

Deleting commands on Google Assistant

Step 1: Prohibit Google from accessing your Voice and Audio activity

  • Visit myaccount.google.com
  • Select Data & personalization on the left sidebar.
  • In the Activity controls section, for Voice & Audio Activity, select Paused.

Step 2: Delete audio activity

  • Scroll down and in the Activity and timeline section, select My Activity.
  • On the left sidebar, select Delete activity by. (On mobile, find it in the top-left corner.)
  • Click on the first pull-down arrow under Delete by date.
  • Select All Time.
  • Click Delete.
  • Select Delete when Google asks if you’re sure you want to delete all data.

Deleting commands on Apple

Here’s how you can delete Siri voice recordings on an iPhone — though you’ll need to repeat a similar process on every Apple device you own. This effectively deletes all of the information Apple gets from Siri, including your voice recordings. But you don’t do that from the Privacy section of your settings. Instead, do this.

  • Go to “Settings”, “Siri & Search”
  • Turn off all the ways there are to activate Siri. There are two: “Listen for ‘Hey Siri’” and “Press Side Button for Siri”.
  • When you turn off the last way to activate Siri, that effectively turns Siri off. You’ll get a warning that there is one more step you need to take to delete your data from Apple’s servers.
  • Go to “Settings” > “General” > “Keyboard.” Scroll down to “Enable Dictation.” When you turn it off, you’ll get a warning that if you ever want to use dictation again, you’ll have to go through the upload process once more.

Mute the microphone

Mute Amazon Echo:

The simplest way to control what your smart speaker hears is to mute it when you’re not using it.

Press the microphone on/off button on the device. When this button turns red (or, on newer models, the light ring at the base glows red), the mic is off.

If a third-party smart speaker uses the Alexa digital assistant, consult your manufacturer’s instructions to find out how to mute your unit.

Mute Google Home:

The Google Home has a mute button, while the Hub Max and Hub Mini have a toggle switch on the side.

Turn off purchasing

Disable personalized features that collect sensitive information, such as voice purchasing. They make it easy to access your calendar, contacts, and other apps, not just for you but for anyone who asks. It is safest to disable every feature that uses personal information.

Know what’s connected to your smart speakers

Be careful when connecting security devices to the voice assistant. Each new connection is a potential weak link in your home's security system. Once smart locks, security cameras, and alarms are connected to the Internet, it becomes much easier to break into them.

Criminals have started to use IoT search engines to locate devices on home networks that still use default usernames and passwords. That includes home security gadgets, smart speakers, Wi-Fi routers, and even your refrigerator, all of which can be points of weakness.

Lock down your login

Use two-factor authentication to add another layer of protection against unauthorized device access.
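Those one-time codes from an authenticator app come from the standard TOTP algorithm (RFC 6238, built on RFC 4226's HOTP). As a rough sketch of what happens behind the scenes, the whole scheme fits in a few lines of Python (the secret below is the RFC's published test value, not a real credential):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time password from a shared secret and counter (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """Time-based variant (RFC 6238): the counter is the current 30 s window."""
    return hotp(secret, int(time.time()) // period, digits)

# RFC 4226 test secret; in practice the secret comes from the QR code your
# account shows when you enable two-factor authentication.
print(hotp(b"12345678901234567890", 0))   # → 755224
```

Because the code depends on a secret that never leaves your phone, a stolen password alone is not enough to log in to your smart speaker account.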

Use a VPN

A virtual private network encrypts your traffic and hides your IP address, mitigating the risk of getting hacked. One NordVPN account can help you protect up to six different devices. If you enable it on your router, you protect the whole household.

Check our suggestions on VPNs

Minding your network

Create a separate network for your IoT devices; they don't need to share the home Wi-Fi network your personal, non-IoT gadgets use. If perpetrators manage to hack your home Wi-Fi, they could intercept communications with your smart speaker. We recommend setting up a network dedicated solely to your IoT devices and smart home system.

Don’t discuss private matters

It’s best not to talk about your passwords, credit card details, and personal information when your speaker is turned on. Third-party contractors, criminals, or even the geeky neighbor next door could be listening to your conversations.


Conclusion

Improved and more intuitive privacy controls are in demand, laying the ground for more and more research in this area. Being in its early days, this research field needs additional effort, not only to better understand users' relationships with existing smart speakers (for example, across different cultures) but also to find better ways of helping them protect their privacy.

As things stand, existing privacy controls are under-utilized, and lack of awareness appears to be one major reason. Future research should investigate why this is the case and whether other controls would be more useful.

So to sum it all up, smart speakers may not suit those who worry about where their data is going. But as long as you're aware of the risks and adjust your settings accordingly, they could be a viable option for you and many others, especially as time goes by.

👉 FAQs

  • Can you use a smart speaker without a phone?

Yes and no. You need a smartphone to set up an Alexa, Google, or Siri speaker for the first time, using the add-new-device option in the companion app; after setup, most functions work by voice alone.

  • What's the difference between a smart speaker and a Bluetooth speaker?

Regular Bluetooth speakers connect to your phone or computer just to play audio and don't do much else, while smart speakers have built-in internet-connected assistants like Alexa, Siri, and Google Assistant.

Author
Hamna Imran
Cyber Security student and keen learner, writing articles for several other websites.
