It’s happened to most of us: we’re having a conversation with a friend about those shoes we really like or that supplement we were thinking about trying. Later that night while we’re scrolling Instagram, what shows up? An advertisement for that very same thing. It’s almost like… someone was listening to us. Turns out, someone is – and her name is Siri. Apple settled a $95 million lawsuit at the end of last month, addressing privacy concerns over Siri’s eavesdropping habits.

The claim

Multiple owners of Apple devices claimed that they had been routinely recorded by Siri while having private conversations, despite not having activated the voice assistant. They also claimed that the data from these conversations was then sold to third-party marketers, who used it for targeted advertising. According to Reuters,

“Two plaintiffs said their mentions of Air Jordan sneakers and Olive Garden restaurants triggered ads for those products. Another said he got ads for a brand name surgical treatment after discussing it, he thought privately, with his doctor.”

Apple says it settled the lawsuit in order to “avoid further litigation,” but denies that any personal data was recorded or sold to third-party marketers. In a statement released following the settlement, Apple said,

“[We have] never used Siri data to build marketing profiles, never made it available for advertising, and never sold it to anyone for any purpose… Apple does not retain audio recordings of Siri interactions unless users explicitly opt in to help improve Siri, and even then, the recordings are used solely for that purpose.”

The fallout

So, what does this mean for consumers? If you owned an Apple device on which Siri was activated between September 17, 2014 (when the “Hey Siri” feature was introduced) and December 31, 2024, you may be entitled to up to $20 in compensation per device.

In the wake of the settlement, Apple maintains that Siri has always been secure, and says it will continue taking steps to make consumers feel safe using voice assistant technology.

“Siri has been engineered to protect user privacy from the beginning… We use Siri data to improve Siri, and we are constantly developing technologies to make Siri even more private.”

Google is also facing a similar lawsuit in a San Jose federal court over privacy concerns with its voice assistant technology.

How to protect yourself

When enabled, virtual assistants stand by, listening for a wake word, which means an activated phone’s microphone is always passively picking up nearby conversation. Depending on the brand of the phone, that data may or may not be stored and/or used for marketing purposes. Apple claims not to store any data collected by Siri, whether in the background or during conversations, but Google does admit to using data from assistant conversations for targeted advertising and for building profiles for third-party marketers. So, what can consumers do to protect their privacy?

First, if marketing-driven data tracking makes you uncomfortable, disable your phone’s virtual assistant. If you do use a virtual assistant, delete your voice request history regularly to limit your exposure in the event of cyberattacks and data leaks. Keeping your software updated and using a VPN to encrypt your data are additional ways to protect yourself against unwanted parties getting private information through Siri.

You should also check your privacy settings to make sure you are only sharing your microphone with apps you trust. Remember, Siri isn’t the only one listening. Apps like Snapchat can also listen and use the data they collect for marketing purposes if you have given them access to your microphone.

You can also cover your phone’s microphone with tape or use a mic-lock or microphone-blocking phone case to minimize voice transmission. Just make sure to take it off before calling someone!