At least one in five homes in the UK now contains a smart speaker, according to a government report. But do they have a dark side? We dug deeper to find out.
Many people worry that voice assistants are used for surveillance; another major concern is that Amazon and Google will use them for marketing purposes.
Targeted advertising widely takes place as a result of the things we search for online. But smart speakers can now listen to the everyday conversations in the heart of our homes - giving companies the potential to take targeted advertising to a whole new level. The thought of what these companies could find out about you by listening in is, naturally, a scary one.
We put both leading voice assistants, and the companies behind them, to the test. We wanted to see what's actually being recorded, how the recordings are being used, and how easy it is to control your privacy.
We spoke to Dr Hamed Haddadi at Imperial College London to get his take on voice assistants.
We gave six volunteers an Alexa or Google Assistant smart speaker to take home, and some scripted questions to ask. After that, they could use the devices more naturally, however they liked. We then asked them to find out what information was stored about them, including contacting Amazon and Google with a 'subject access request' (SAR), which is one of your rights under data protection law to find out what personal data companies hold about you.
Each volunteer used one of the following smart speakers or smart displays:
You can see some of the scripted questions we gave the volunteers to ask Alexa and Google Assistant in the graphics. Each question was designed to test an aspect of how the voice assistants work. For example, can Alexa reliably tell if someone says 'Alex' or 'Alexa'? Do they keep recording if you talk to your kids straight after using them? How careful do you need to be with saying sensitive personal information such as bank details and passwords?
In our snapshot test, both voice assistants recorded more conversations than they should have done. Voice assistants are only supposed to listen to you after you've said a 'wake word' - this is 'Alexa' on Alexa, and 'OK Google' or 'Hey Google' on Google Assistant. However, all three of our Google Assistant volunteers found that phrases were transcribed without a 'wake word' being said.
One of our Alexa volunteers noted that his voice recordings, when played back in the Alexa app, sometimes included words stated immediately before the wake word that weren't intended for the voice assistant. Google said that this is the case for its voice assistant, too.
What's more, Alexa and Google Assistant can keep recording after the initial command, even if you start talking to someone else; in some instances, a brief pause was not enough to stop the speaker listening.
We also tested for accidental recordings - for example, if you change your mind. When volunteers asked 'Alexa? No don't worry', the Alexa devices kept recording, and Google's did too. Amazon and Google say they aim to minimise unintended recordings. But our snapshot test suggests that devices could be recording things you might not want them to, accidentally or otherwise, including information you'd want to keep private.
For those worried about smart speakers being used as a means of surveillance, the reality is more nuanced. Google says around 0.2% of audio recordings are reviewed by humans, which amounts to 2,000 reviewed recordings per million captured. Amazon says 'a small fraction of one percent' of Alexa recordings are reviewed by humans. Both say this is done to help improve their voice assistants.
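To put those figures in context, here's a quick back-of-the-envelope sketch. The 0.2% rate is Google's stated figure, but the recording volumes are purely illustrative assumptions:

```python
# Illustrative sketch: Google's stated 0.2% human-review rate applied
# to hypothetical volumes of captured recordings (the volumes are assumptions).
REVIEW_RATE = 0.002  # 0.2%, as stated by Google

def reviewed(recordings_captured: int) -> int:
    """Expected number of recordings reviewed by humans."""
    return round(recordings_captured * REVIEW_RATE)

print(reviewed(1_000_000))  # prints 2000 - i.e. 2,000 per million captured
```

Small as the percentage is, the absolute numbers grow quickly with the sheer volume of voice commands these services handle.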
Humans reviewing and annotating voice data is standard industry practice that helps to improve the artificial intelligence behind voice assistants, so they can provide more helpful responses to queries and fix errors.
Amazon explained to us: 'Access to data annotation tools is only granted to a limited number of employees who require them to improve the service, and our annotation process does not associate voice recordings with any customer-identifiable information. Customers can opt out of having their voice recordings included in the fraction of one percent of voice recordings that get reviewed.'
Google said: 'As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language. These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant.'
The other fear is voice assistants being used for personalised advertising. As our snapshot test found that voice assistants can pick up personal information not intended for them, this data could potentially be fed into targeted advertising.
But think about what Amazon and Google already know about you. Whether that's what you've bought on Amazon, searched for on Google, or common journeys you've made that are tracked by Google Maps, the technology giants have likely already built up a rich personal profile of you.
By combining voice-assistant data with data they have from these services, the tech giants have the potential to take personal profiling of individuals for advertising purposes deeper than they ever have before.
More and more products, from TVs to cars, have voice assistants built in, so it's likely you'll end up with products that include these features, if you don't already.
Amazon said: 'We do not use customers' voice recordings for advertisements.' But Google said it treats Google Assistant 'similarly to searching on Google, and may use these interactions to deliver more useful ads'.
Fortunately, you have the power to protect your privacy here. Firstly, you can opt out of voice recordings being stored and potentially reviewed by humans. Secondly, you can opt out of Google ad personalisation completely in your Google account settings. So while Amazon and Google might be listening in, you can control what they do with it.
Our investigation also raises concerns about transparency. Amazon and Google are open about the information they provide, and our Google Assistant volunteers were easily able to find the transcripts of their voice commands in their Google accounts online. But when they tried to contact Google to find out whether it stores any additional data, all of them struggled to find a suitable way to do this. Google has since told us that its data request form can be found via a 'contact us' link on its website.
We still think Google could provide this key information in a more upfront way. In our snapshot test, none of our volunteers found this form. Our volunteers found Amazon easier to contact, but also ran into issues. Two who called were put through to staff who did not appear to know how to respond to a subject access request (SAR) - one of your rights under data protection law.
After first trying to phone Amazon, one of these volunteers managed to track down its SAR form by emailing Amazon Customer Services, who provided a link to it. This form gave her a complete download of all personal information to do with Alexa that Amazon had stored. This included full transcripts, voice recordings, settings, interactions with smart devices she'd connected Alexa to, the names she'd given each of these connected devices (for example, 'Adam's iPhone'), and apps she'd used. Both Amazon and Google should make it clearer upfront how to find these forms.
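To give a feel for how you might sift through such a download once you have it, here is a minimal sketch. The file layout below is entirely hypothetical - Amazon's real export format will differ - so treat it as an illustration of the idea, not a working parser:

```python
from collections import Counter

# Hypothetical records from a subject access request download.
# The 'type' values and structure are assumptions for illustration only.
sample_export = [
    {"type": "transcript", "text": "alexa, what's the weather?"},
    {"type": "voice_recording", "file": "rec_001.wav"},
    {"type": "device_interaction", "device": "Adam's iPhone"},
    {"type": "transcript", "text": "alexa, play some music"},
]

def summarise(records):
    """Tally how many records of each kind the export contains."""
    return Counter(r["type"] for r in records)

print(summarise(sample_export))
```

Counting records by category like this is a quick way to see at a glance how much of each kind of data - transcripts, audio, device interactions - a company holds about you.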
Amazon said: 'At Amazon, customer trust is at the centre of everything we do and we take customer privacy very seriously. From the beginning, we've built privacy deeply into the Alexa service and always look for ways to make it even easier for customers to have transparency and control over their Alexa experience.
'We've introduced several privacy improvements, including the option to have voice recordings automatically deleted after three or 18 months on an ongoing basis, the ability to ask Alexa to "delete what I just said" and "delete what I said today", and the Alexa Privacy Hub, a resource available globally that is dedicated to helping customers learn more about our approach to privacy and the controls they have. We'll continue to invent more privacy features on behalf of customers.'
Google said: 'We believe you should always be able to manage your data in a way that works best for you. By default, we don't retain your audio recordings, and you can change your settings at any time. When we detect unintended activations, we delete the data even when users have opted in to save their audio recordings.'
Google added that it has 'automated systems to detect and remove sensitive details'.
You can view your voice assistant activity in your Google or Amazon account:
You should assume that what you say immediately after the wake word when using your voice assistant will be stored. This means complete transcripts of every conversation you have with it, details of all the apps you use, and records of any interactions with smart home products, such as turning devices on or off.
Here are some ways you can control what your voice assistant does and manage your privacy:
Most smart speakers have a physical mute button you can press to be sure they won't disturb you or listen in. Amazon Echo speakers, for example, will clearly display a red ring when muted, so you can easily tell at a glance they're not listening.
For buying things from Amazon through Alexa, be sure to set up a PIN to prevent anyone else in earshot of your smart speaker ordering items accidentally or without your knowledge.
Google's Voice Match and Alexa Voice Profiles allow you to train the voice assistant to tell who's using it at any particular time. This helps to keep your data personal to you with devices used by multiple people in your household.
Google and Amazon give you options when you sign up to use their voice assistants - you don't have to just accept everything. For example, Google Assistant currently doesn't store voice and audio recordings unless you opt in, so only a text transcript of your commands is kept, not the audio itself, unless you agree otherwise.
Smart speakers are connected to your home wi-fi and the internet. We conduct full privacy and security screening on all the wi-fi-connected speakers (and their apps) we test to check for issues that would raise concern. We will alert you in our reviews - and encourage the manufacturer to address the issue - if we find any vulnerability. If we find serious privacy flaws in a product, we will make the product a Which? Don't Buy to avoid.
Some smart speakers restrict which functions their voice assistant can control - for example, some models (with Alexa or Siri) limit voice commands to music controls only. Check our reviews to see whether this could be the right solution for you.
Kids often love to use voice assistants. Both Alexa and Google Assistant have various parental controls, which can restrict the content your children are able to access. Alexa also has separate Kid Skills you can enable, so they can use the services that are most suitable to them.
The voice data most people will be looking for is available to access - and delete - in your Amazon or Google account. However, not absolutely everything is available to view through the tools Amazon and Google provide there. To be extremely thorough, you'd need to use Amazon and Google's subject access request forms, which will give you additional snippets of data beyond what's accessible in your account. These forms are a powerful way to exercise your rights under data protection law.