Google admits to reading your chats with AI, even if you pay! Here's how to stop them now.
Google employees are currently reading your chats and viewing your uploaded images
You're paying at least $20 per month for a Google subscription (Gemini Advanced, or Google One Premium, Plus or Ultra), revealing your deepest fears, business secrets and intimate conversations. You believe that paying for the service means you won't have to pay with your data. That paying customers get privacy. That Google wouldn't dare violate that trust. Every one of those assumptions is dead wrong. While you've been confiding in what you thought was a private AI assistant, Google's human contractors have been reading your chats, and your most intimate conversations have been training their next AI model. You never knew. Millions of paying subscribers still don't! The good news: there is a simple way to protect your chats going forward. A step-by-step guide to shielding your data from Google follows below.
Despite Google's bold "privacy" promises plastered across their marketing materials, even paying customers are having their most sensitive conversations harvested to train the next generation of AI models. While competitors struggled to keep pace, Google closed the gap to frontier AI providers like OpenAI at record speed. This wasn't magic. It was your data.
The Fine Print They Hope You'll Never Read
Buried in Google's Gemini Apps Privacy Notice is a stunning admission: when your "Keep Activity" setting is on (which it is by default), Google uses your conversations "to provide, develop, and improve its services (including training generative AI models)." The document explicitly states they will read your chats. This applies to all users, including paid subscribers.
Don't believe it? Read their official privacy notice yourself and watch the implications sink in: Google Gemini Privacy Document
Here's what Google is actually doing with your chats:
- Training AI models on your conversations to improve future versions of Gemini
- Human review by trained reviewers (including third-party contractors) who read your supposedly private conversations
- Retaining reviewed chats for up to 3 years, even after you delete them from your account
- Processing under "legitimate interests" as their legal basis, meaning they decided their need to train AI outweighs your privacy
The privacy notice explicitly warns: "Please don't enter confidential information that you wouldn't want a reviewer to see or Google to use to improve our services, including machine-learning technologies." Read that again. Google is telling paying customers not to share anything confidential with their "AI assistant." If that doesn't alarm you, nothing will.
The Impossible Choice: Privacy or Functionality
Google offers one escape route: turn off the "Keep Activity" setting. This theoretically stops Google from using your future chats for AI training (unless you submit feedback). But there's a brutal catch.
With Keep Activity disabled, you lose:
- Access to your chat history beyond 72 hours
- The ability to switch between previous conversations
- Integration with Google Workspace apps (Gmail, Docs, etc.)
- The core functionality that makes AI assistants useful
This is by design. Google has engineered a false choice: surrender your privacy or cripple your experience. They're betting most users won't sacrifice functionality for privacy, and those who pay $20/month especially won't tolerate a degraded experience.
Even with Keep Activity disabled, your conversations are still "used to respond to you and help protect Google, our users, and the public, including with help from human reviewers." Translation: Google employees can still read your chats for "safety" purposes.
The AGI Arms Race Running on Your Private Data
Why is Google so aggressive about data collection, even from paying customers? The answer is the race to Artificial General Intelligence (AGI), the holy grail of Big Tech.
Training cutting-edge AI models requires massive amounts of diverse, real-world conversational data. While OpenAI's ChatGPT started with a lead, Google had a secret advantage: 1.8 billion Gmail users, Android device telemetry, Search query data, and now Gemini conversations from both free and paid users.
This explains Google's meteoric improvement. Just a year ago, their AI models were widely considered inferior to ChatGPT and Claude. Now, Gemini competes at the frontier. This transformation wasn't achieved through algorithmic breakthroughs alone. It was fueled by mining the private conversations, fears, hopes, and ideas of millions of users who trusted Google with their most sensitive information.
The Double Revenue Stream
Here's the cynical genius of Google's model: they charge you $20/month for Gemini Advanced while simultaneously using your conversations as free training data. You're paying them to let them train AI on your data. It's a double revenue stream: subscription fees plus invaluable training data that competitors would pay millions to access.
Now consider how Google may use this data in the future: personalized advertising. The privacy notice admits that while "your Gemini Apps chats are not being used to show you ads" currently, it ominously adds: "If this changes, we will clearly communicate it to you." They're keeping that door wide open.
The Professional Risk
For professionals, the risks are existential. Imagine:
- Entrepreneurs brainstorming startup ideas with Gemini, only to have those innovations become part of Google's knowledge base
- Therapists or patients discussing treatment strategies, creating training data for AI psychology models
- Business executives analyzing competitive strategies, inadvertently sharing trade secrets with Google's AI division
- Lawyers drafting legal arguments, potentially compromising client confidentiality
Every conversation becomes intellectual property that Google claims the right to exploit under their "legitimate interests" legal framework.
The Solution

The good news? There are two options. You have a choice.
Option 1: Opt out of Gemini training

If you're currently using Google Gemini:

- Go to the Gemini Apps Activity settings at myactivity.google.com/product/gemini
- Look for the section labeled "Gemini Apps Activity"
- Click "Turn off" to stop future training on your conversations (or "Turn off and delete activity" if you want to wipe past history)
- Review and delete your chat history (though reviewed chats are retained for 3 years regardless)
The drawback? You lose access to your chat history: once you close a chat, you can't reopen it. It will still sit on Google's servers for up to another 72 hours, just no longer visible to you.
Option 2: Switch to xPrivo

xPrivo was built on a fundamentally different principle: privacy by design, not privacy by disclaimer.

Here's how xPrivo protects what Google exploits:

- Zero data retention: Your conversations aren't stored on external servers
- No training on your data: Your chats will never train AI models
- No human review: Nobody can read your conversations
- Anonymous usage: No account required; use xPrivo completely anonymously via web or iOS app
- Anonymous Pro subscription: Even premium users don't create accounts
- Local deployment option: Technically skilled users can host xPrivo locally with open-source models for complete data sovereignty
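To make the local-deployment point concrete: the sketch below is only an illustration of the general idea, not xPrivo's actual setup. It builds a chat request aimed at a locally running open-source model server; the Ollama-style endpoint, port, and model name are assumptions. The key property is that the target is localhost, so the conversation never reaches any third-party server.

```python
import json

# Hypothetical illustration of local deployment. The endpoint and model
# name are assumptions (Ollama's default local API), not xPrivo's
# actual configuration.
LOCAL_ENDPOINT = "http://localhost:11434/api/chat"  # never leaves your machine

def build_local_chat_request(prompt: str, model: str = "llama3") -> dict:
    """Build a chat request for a locally hosted open-source model.

    Because the host is localhost, the prompt is never sent to a
    third party -- no retention, no human review, no training.
    """
    return {
        "url": LOCAL_ENDPOINT,
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        }),
    }

req = build_local_chat_request("Summarize my private notes")
print(req["url"])  # http://localhost:11434/api/chat
```

With a setup like this, "data sovereignty" is literal: the machine answering your questions is the one on your desk.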
And here's the twist: xPrivo Pro users can even access Gemini models through xPrivo's privacy-protecting infrastructure, ensuring your conversations with Google's AI are never used for training. You get the model's capabilities without sacrificing your privacy.
Head over to xPrivo.com and try it out for free.
Take Action Today
The era of "free" services funded by surveillance is bad enough. But paying for surveillance while being told it's "privacy-first"? That's unacceptable. Make a wise choice today: if you want to keep using Gemini, turn off activity, accepting that you lose the most important functionality of AI chat. Or simply switch to xPrivo for truly private AI assistance, where nobody can read your chats or train on your sensitive conversations.
Choose data sovereignty. Choose transparency.
Source of the claims above: Official Gemini Privacy Document