Perplexity AI looks like a search engine. You type a question, you get an answer with sources. Cleaner than Google, more organized, faster. No surprise it is spreading through offices.
The problem is exactly how harmless it feels. Employees treat Perplexity like a better Google search. They type in the same kinds of questions they would type into Google. Except with Perplexity, they start asking more specific questions over time. Questions with context. With client names. With details from active cases.
Because it feels like a search engine, the caution that employees might have with ChatGPT is absent. Most people know by now: ChatGPT is an AI tool, be careful. Perplexity? “I was just searching for something.”
This question keeps showing up in forums and LinkedIn groups: “Isn’t Perplexity basically just a search engine?” No. Perplexity is an AI system that processes inputs, retains context, and generates answers. The fact that it shows sources does not make it Google. It makes it an AI tool with citations. The distinction matters legally.
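The difference is visible at the request level. A search engine gets one self-contained query; a chat-style AI system gets the running conversation with every new question. A minimal sketch of a generic chat payload, purely illustrative and not Perplexity's actual API:

```python
# Illustrative only: a generic chat-style request, not Perplexity's real API.
# The point: every follow-up question ships the prior turns along with it.
search_request = {"q": "statute of limitations for tax assessments"}  # stateless

chat_request = {
    "messages": [
        {"role": "user", "content": "Statute of limitations for tax assessments?"},
        {"role": "assistant", "content": "Generally four years, although..."},
        # The follow-up is where client context creeps in:
        {"role": "user", "content": "And for the 2019 assessment of client X?"},
    ]
}
```

That second user turn is the pattern described above: the question starts generic, ends up carrying case details, and the whole history is processed together.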
The Class Action: Data Shared with Google and Meta
In 2026, a class action lawsuit was filed against Perplexity in the US. The allegation: Perplexity shared user chat data with Google and Meta.
Read that again. Not advertising partners in general. Google and Meta. The two companies that monetize more user data than anyone else on the planet.
If these allegations hold, it means every query your employees typed into Perplexity may have ended up with third parties. Without knowledge, without consent, without any contractual basis.
For a firm that handles client data, that alone is reason enough to remove Perplexity from company devices. The allegations do not even need to be proven: the mere existence of such a lawsuit, combined with data flows that are not transparently documented, makes GDPR-compliant use practically impossible.

And even if the allegations are eventually dismissed, the underlying problem remains: Perplexity cannot or will not disclose where the data flows. Any company that approves an AI tool for employees needs to be able to document those data flows. With Perplexity, that is not currently possible.
No Encryption for Uploaded Files
Perplexity offers an upload feature. Users can upload documents and Perplexity analyzes them. Contracts, PDFs, financial reports. Sounds useful. The problem: these files are not encrypted at rest.
For a tax advisory firm or law firm that works with confidential client documents every day, this is a clear disqualifier. Client data sitting unencrypted on the servers of a US provider, with no clear documentation of who has access or how long files are retained.
This is not just a technical shortcoming. It violates basic GDPR requirements for technical and organizational measures under Article 32.
Picture this: an employee uploads a draft contract to get a summary. Or a tax assessment to clarify a question. That document then sits unencrypted on servers whose security architecture Perplexity does not describe in detail. Who has access, how long it is stored, whether it feeds into training data: all unclear.
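To make the Article 32 point concrete: encryption at rest means that whoever reaches the storage sees only ciphertext. A minimal sketch with Python's cryptography package, purely to illustrate the concept; the file name is hypothetical:

```python
# Conceptual sketch of encryption at rest (cryptography package).
# A provider doing this keeps only ciphertext on disk, with the key held
# separately in a KMS/HSM. Per the concern above, Perplexity's uploads
# lack this layer.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in production: kept in a KMS, never beside the data
f = Fernet(key)

with open("draft_contract.pdf", "rb") as fh:  # hypothetical client document
    plaintext = fh.read()

ciphertext = f.encrypt(plaintext)  # this is what should sit on the server's disk

# Without the key, the stored bytes reveal nothing about the contract.
assert f.decrypt(ciphertext) == plaintext
```

Stored without that layer, the plaintext itself sits on disk, readable by anyone who gets to the storage.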
No Independent GDPR Verification
Perplexity has not undergone any independent audit for GDPR compliance. There is no public certification, no standard Data Processing Agreement for business use.
For comparison: ChatGPT Enterprise offers a DPA, EU data residency, and a contractual commitment not to use your data for training. Microsoft Copilot has the DPA and the EU Data Boundary. Even Claude offers a DPA for Team and API users.
Perplexity has none of that. No DPA, no EU data residency, no transparent documentation of data processing. For personal use, that may be acceptable. For business use involving personal data, the contractual basis required under Article 28 GDPR simply does not exist.
I cannot stress this enough: without a DPA, there is no lawful way to have a provider process personal data on your behalf. It does not matter how well the tool works. It does not matter if the employee “just quickly looked something up.” The DPA is not a formality. It defines what the provider may and may not do with your data. Without it, you have no control and no legal standing.
I compared the major AI tools and their GDPR readiness in a detailed comparison post. That will show you where Perplexity stands relative to the alternatives.
What to Tell Your Team
The recommendation is straightforward: add Perplexity to the list of non-approved tools in your AI acceptable use policy.
The reasoning matters here. This is not about Perplexity being a bad tool; if anything, the opposite is true. Perplexity often delivers better answers than a standard Google search. But its data handling is not transparent enough to use it with client data or business information.
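If you manage devices centrally, you can back the policy entry with a technical guardrail. A crude minimal sketch, assuming admin rights and rollout through your device-management tooling; a DNS filter or secure web gateway is the cleaner production option, and the domain list is illustrative:

```python
# Minimal sketch: resolve non-approved AI tools to nowhere via the hosts file.
# Assumes admin rights; deploy through your device-management tooling.
import platform
from pathlib import Path

BLOCKED = ["perplexity.ai", "www.perplexity.ai"]  # extend per your policy

def hosts_file() -> Path:
    if platform.system() == "Windows":
        return Path(r"C:\Windows\System32\drivers\etc\hosts")
    return Path("/etc/hosts")  # macOS and Linux

def block(domains: list[str]) -> None:
    path = hosts_file()
    current = path.read_text(encoding="utf-8")
    additions = [f"0.0.0.0 {d}  # blocked per AI acceptable use policy"
                 for d in domains if d not in current]
    if additions:
        path.write_text(current.rstrip("\n") + "\n" + "\n".join(additions) + "\n",
                        encoding="utf-8")

if __name__ == "__main__":
    block(BLOCKED)
```

The block does not replace the conversation with your team; it just makes the safe path the default.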
Explain three things to your team:
First, Perplexity is not a harmless search tool. It is an AI system that stores and processes inputs. Everything entered there can end up on servers in the US and potentially be shared with third parties.
Second, there is no Data Processing Agreement. Without a DPA, there is no legal basis for processing personal data. That is not a technicality you can sort out later. Without a DPA, the whole thing is off the table.
Third, better alternatives exist. Yes, they cost money where Perplexity is free. But there are AI research tools with a DPA, EU data residency, and documented data handling. The question is not whether your team uses AI for research. The question is which tool.
The Alternative
There are AI tools built for business use. With a DPA, with data residency in the EU, with clear rules for retention and deletion. None of them are automatically GDPR-compliant. But they give you the foundations that make a compliance review possible in the first place. Perplexity does not, as of today.
If you want to assess which AI tools are safe for your firm and which belong on the block list: I offer a free AI check. 30 minutes, concrete assessment, no sales pitch. Or take a look at my AI compliance services.