Home of internet privacy

Siri and Alexa train their AI with your voice. Here’s how to delete that data.

If you’ve ever said “Alexa” to an Echo or “OK Google” to your phone, chances are your voice has been recorded and transcribed to improve these companies’ voice recognition AI.

With more news stories of contractors listening to your recordings and lawsuits mounting over the non-consensual recording of children’s voices, the privacy implications of having these voice assistants keeping records of your every interaction with them have been met with growing alarm.

The good news: You can find these recordings and clear them. The bad news: These recordings might still remain on the company’s servers.

Below, learn how to delete your history on Alexa, Google Assistant, and Siri.

As virtual assistants proliferate, so do privacy concerns

In a span of just a few years, Alexa, Siri, and Google Assistant have become household names. In the U.S. alone, 66.4 million people own at least one smart speaker, with Amazon’s Echo taking the majority share of both the U.S. and global markets.

We’ve seen the use cases for these products expanding, especially with Amazon, which has introduced a supposedly kid-friendly Echo Dot and just launched a partnership with the UK’s National Health Service to automatically search the NHS website for UK users asking for health-related advice.

Despite their growing presence in everyday life, these virtual assistants have been met with apprehension and suspicion over how much they record and who gets to listen to those recordings. An ExpressVPN survey found that over half of respondents say they’re either extremely or very concerned that data from their smart home devices could be used to piece together their personal habits. Yet while 24.4% of all respondents always turned off the microphone on their smart home devices when not using them, 21.2% never did.

Eavesdropping is a feature, not a bug

In order to pick up the wake word (“OK Google” or “Hey Siri”), these voice assistants are listening to you all the time, even if they aren’t recording.

Well, actually, they might still accidentally record you if they think they’ve heard the wake word.

A Bloomberg report found that Amazon was using humans to transcribe voice recordings from Alexa, some of which were accidental. Workers on the transcription team told Bloomberg that not only were they listening to hundreds of recordings per shift, but that these recordings also came with the customer’s first name, the device’s serial number, and an account number. Some of the recordings appeared accidental, including what seemed to be a woman singing in the shower, a screaming child, and a sexual assault.

Google Assistant is not off the hook here either—Belgian news agency VRT acquired more than a thousand recordings from a contractor who was tasked with transcribing them, “153 of which were conversations that should never have been recorded and during which the command ‘Okay Google’ was clearly not given.” While Google decouples the information from its source by replacing the user name with an anonymous serial number, VRT’s investigation found that “it doesn’t take a rocket scientist to recover someone’s identity” based on what was being said.

A team also reviews the voice recordings on Apple’s Siri, and as with Google Assistant, those listeners strip the personally identifiable information and replace it with a random ID number, which resets every time Siri is switched off.

The fact of the matter remains, however, that your voice assistants are flawed enough to pick up false positives of their wake words, potentially recording private conversations not meant for others.

“That’s the scary thing: There is a microphone in your house, and you do not have final control over when it gets activated,” Dr. Jeremy Gillula from the Electronic Frontier Foundation told Gizmodo. “From my perspective, that’s problematic from a privacy point of view.”

How to delete your voice history on Alexa, Google, and Siri

If you don’t want to consider the nuclear option (i.e., chucking your Echo out of the house), there are ways to remove these recordings from your assistant and turn off its voice recognition when you don’t need it. Here’s how.

How to delete history on Alexa

To select and delete previous voice recordings on your Alexa:

  1. Open the Alexa app on your phone.
  2. Tap the menu icon on the top-left corner.
  3. Tap Settings.
  4. Tap Alexa Account at the top of the page.
  5. Select History and delete the recordings you don’t want on your device.

If you want to delete everything at once:

  1. Go to Amazon’s Device page.
  2. Select the menu button to the left of the Echo device you’d like to manage.
  3. Select Manage Voice Recordings.
  4. Select Delete.

Be warned, though, that wiping the audio files from your account is no guarantee of their complete removal from Amazon’s servers.

If you want to go a step further and stop Amazon from using your voice recordings to train its AI, you can opt out under the Alexa Privacy section of the app’s settings.

How to delete history on Google Assistant

To manage and delete voice and audio activity on Google Assistant:

  1. Go to your Google Account.
  2. On the left navigation panel, select Data & personalization.
  3. In the Activity controls panel, select Voice & Audio Activity.
  4. Click Manage Activity to see the list of all recordings.
  5. Next to the item you want to delete, select the More (three-dot) button, then select Delete.
  6. To delete all your activity on Google Assistant, select Delete activity by instead and select All time. At the bottom, select Delete.

To turn off Google Assistant completely, follow Google’s instructions for your computer, Android device, or iPad and iPhone.

How to delete history on Siri

Siri doesn’t let you hear or manage your voice interaction history. Deleting this history requires turning off both Siri and Dictation.