Google, Amazon, Apple, and Microsoft keep recordings of your smart speaker voice commands on their servers. Here’s how to delete them.
When you summon a digital assistant on one of the devices it lives on, your voice command is uploaded to the company’s servers and saved to your account. Whether you’re using a smart speaker or your phone, here’s how to delete your voice recordings from Amazon, Google, Microsoft, and Apple servers to help protect your privacy.
Delete Alexa Voice Recordings
Here I’m using the app on my Android phone, but you can use the Alexa site, too. Launch the Alexa app on your mobile device, tap Settings, then scroll down and tap History. From there, you can delete individual recordings. If you want to get rid of them all at once, however, you need to sign in to your Amazon account on the site, choose your Alexa device, and delete all recordings. For full details, read: How to Delete All Alexa Voice Recording History.
It’s also worth mentioning that if you have a Fire TV and use the Alexa remote to do voice searches, you can delete those as well. Full details can be found in our article: How to Delete Amazon Fire TV Remote Voice Recordings.
Delete Google Assistant Voice Recordings
To get rid of the voice recordings you speak to your Google Home, head to the Google Voice Activity page on your PC or phone. There you’ll find a list of everything you’ve said to Google Assistant, which includes both the speaker and your phone. Just as with Alexa, you can play the recordings back, delete them one by one, or delete them all in one fell swoop. For more details, read our article: How to Delete Your Google Voice History.
Delete Cortana Voice Recordings
For Cortana voice history, head to the Voice History section of the Microsoft Privacy Dashboard and sign in if you aren’t already. There you’ll find a list of your voice recordings. I’ve found this to be the most interesting one, as Cortana on the Harman Kardon Invoke triggers accidentally far more often than any of the other speakers. With Cortana on Windows 10, you can also delete other search items and limit how much information is collected. For more on that, read our article: Erase Your Search Content from Cortana in Windows 10.
Apple’s Siri Voice Recordings
Siri handles your voice interaction history in a much less user-friendly way than the other assistants. There’s no way to view or listen to your recordings, or to delete individual ones. The only way to stop active listening is to turn Siri off entirely. To delete your past voice interaction history from Apple’s servers, you need to turn off Voice Dictation, too. To do that, go to Settings > General > Keyboard, scroll down, and turn off the “Enable Dictation” switch.
While these devices should only record what you say after you trigger the wake word, that’s not always the case. If a smart speaker is within listening distance of your TV or another audio source (including a person), it can wake up and start recording even when you don’t want it to. When I went through my archive of recordings, I found some interesting things. Sometimes a recording contained up to 30 seconds of audio from a movie or podcast. Also, if a speaker is triggered while other people are in the room, it records what everyone is saying. When I made a phone call with Alexa, things said on my end were recorded. You can probably see the privacy issue here.
There might be times when you don’t want the assistants on the speakers always listening for the wake word. The good news is you can also turn that feature off. For a full rundown of how to do it, please read our article: How to Stop Active Listening from Google Assistant, Cortana, Siri, and Alexa.
Since Amazon was the first to get into this market, you might also want to read a couple more tips on keeping an Echo from waking up accidentally by changing the wake word, as well as preventing accidental purchases.
How do you feel about these smart devices keeping recordings of what you say to them? Please leave your thoughts in the comment section below. I’ll be using some of your comments in a future article about the privacy issue.