Perplexity Sued: Incognito Mode Allegedly Shared User Data With Meta and Google
A class action lawsuit claims Perplexity's incognito mode was a sham — embedding Meta and Google trackers even for paid users who expected privacy.

{{YOUTUBE:i7M9rdzBUmI}}
The Promise vs. The Reality
Perplexity AI built its reputation on being the "thinking person's search engine" — a clean, ad-free alternative to Google that respects user privacy. Its headline feature for subscribers? Incognito mode, designed to keep conversations private.
A proposed class action lawsuit filed this week says that never actually worked.
According to the complaint, first reported by Ars Technica, Perplexity "effectively planted a bug" on users' computers by embedding trackers from Meta and Google directly inside its AI search engine. Even users who paid for the service and explicitly turned on incognito mode still had their conversations shared with Meta and Google — along with email addresses and other personally identifiable information.
How the Tracking Works
The mechanics described in the complaint are straightforward and damning. When you search on Perplexity, the page allegedly fires requests to third-party tracking pixels operated by Meta and Google. These aren't hidden in obscure corners; they're embedded in the page itself.
The trackers reportedly collect:
- Conversation content — the actual search queries and AI responses
- Email addresses — tied to user accounts
- Personal identifiers — data that allows Meta and Google to individually identify users
- Session data — how long users interact, what they research
And here's the key: all of this happened regardless of whether the user had enabled incognito mode. For a product that charges users specifically for enhanced privacy, that's not a small oversight — it's a fundamental misrepresentation.
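To make the mechanism concrete: a tracking pixel is typically just a request to a third-party endpoint with user data packed into the query string, often loaded as a 1x1 image so it works without any visible UI. The sketch below is hypothetical, modeled on how common pixels (like Meta's) generally behave; the endpoint and parameter names are illustrative, not Perplexity's actual code.

```javascript
// Minimal sketch of how a third-party tracking pixel transmits data.
// The endpoint and parameter names here are hypothetical, modeled on
// common pixel implementations -- NOT Perplexity's actual code.
function buildPixelUrl(baseUrl, event, payload) {
  const params = new URLSearchParams({
    ev: event,                      // event name, e.g. "Search"
    dl: payload.pageUrl,            // the page (or query) being viewed
    ud: payload.userId,             // identifier tied to a user account
    ts: String(payload.timestamp),  // when the event fired
  });
  return `${baseUrl}?${params.toString()}`;
}

// In a browser, the pixel "fires" by loading this URL as an image,
// which ships the query string to the tracker's servers:
//   new Image().src = buildPixelUrl("https://tracker.example/tr", ...);
const url = buildPixelUrl("https://tracker.example/tr", "Search", {
  pageUrl: "https://example.com/search?q=private+question",
  userId: "user-123",
  timestamp: 1700000000,
});
```

The point of the sketch is that once a pixel like this is embedded in the page, toggling a client-side "incognito" setting does nothing unless the application also stops firing these requests.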
Why This Matters for AI Search
Perplexity isn't alone in the tracking game — Google and Microsoft have both faced scrutiny over similar practices. But Perplexity positioned itself as the antidote: a search engine that doesn't sell your data.
The lawsuit challenges exactly that positioning. If Perplexity was sharing data with Meta and Google all along while charging users for "privacy," it undermines the core value proposition of AI-powered search alternatives.
For the broader AI search industry, this case could set a precedent. The AI answer engine market is heating up — Google rolled out AI Overviews, Microsoft integrated Copilot into Bing, and dozens of startups are building conversational search tools. If courts start treating AI-generated conversations as personal data subject to privacy protections, the entire industry needs to rethink how it handles user information.
What's Next
The class action is still in early stages, but it's part of a broader reckoning around AI privacy. Key context from this week's news cycle:
- The Writers Guild just negotiated a four-year deal with studios that includes new protections against AI overreach
- Human creators are pushing for an "AI-free" content label, though no consensus exists yet
- Mercor, the $10 billion AI training data company, confirmed a massive breach exposing 4TB of candidate data
The pattern is clear: users and creators are demanding transparency about how AI companies handle data, and the tolerance for hidden data collection is near zero.
Perplexity has not yet publicly responded to the lawsuit. But the company's next move — whether that's a settlement, policy changes, or a technical overhaul — will signal how seriously AI companies take privacy claims when regulators and users come knocking.
For anyone using AI search tools right now, the takeaway is simple: read the privacy policy, understand what trackers are on the page, and don't assume "incognito" means what you think it means.
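Checking what trackers are on a page is something anyone can do: open the browser's Network panel (or call `performance.getEntriesByType("resource")` in the console) and look at which third-party hosts receive requests. The filtering step can be sketched as a small helper; the domain list below is illustrative, not exhaustive.

```javascript
// Flags requests to known tracker domains from a list of request URLs,
// e.g. those visible in the browser's Network panel or returned by
// performance.getEntriesByType("resource"). Domain list is illustrative.
const TRACKER_DOMAINS = [
  "facebook.com",          // Meta Pixel endpoint (facebook.com/tr)
  "facebook.net",          // Meta Pixel script host (connect.facebook.net)
  "google-analytics.com",
  "googletagmanager.com",
  "doubleclick.net",
];

function findTrackerRequests(urls) {
  return urls.filter((u) => {
    const host = new URL(u).hostname;
    // Match the domain itself or any of its subdomains.
    return TRACKER_DOMAINS.some(
      (d) => host === d || host.endsWith("." + d)
    );
  });
}

const flagged = findTrackerRequests([
  "https://www.perplexity.ai/search",
  "https://connect.facebook.net/en_US/fbevents.js",
  "https://www.google-analytics.com/g/collect?v=2",
]);
// flagged contains the Meta and Google Analytics requests
```

If a privacy-branded tool shows requests like the last two while "incognito" is on, that's exactly the behavior the lawsuit alleges.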
Sources: Ars Technica, The Verge, TechStartups



