Half of all companies in Germany now use AI tools. Bitkom reported 50 percent adoption in 2024. ChatGPT, Copilot, Gemini, Claude, Perplexity. The tools are different. The GDPR requirements are not.
The GDPR does not care which tool your employees prefer. It asks: Where does the data go? Who processes it? Is there a Data Processing Agreement? Are your inputs used for training?
That is exactly where these platforms diverge. Not in the quality of their answers. In whether you can legally use them for business at all.
The Comparison Table
Here is the overview. I evaluated the most common AI tools against four criteria that matter for GDPR-compliant use: training usage, DPA availability, EU data residency, and basic suitability for business.
| Tool | Tier | Data used for training? | DPA available? | EU data residency? | GDPR-compliant use possible? |
|---|---|---|---|---|---|
| ChatGPT | Free / Plus | Yes (default) | No | No | Not for business use |
| ChatGPT | Team | No | Yes | No (US) | Conditional, with limitations |
| ChatGPT | Enterprise | No | Yes | Yes (EU since Feb 2025) | Yes, with configuration |
| Copilot | Bing / personal | Unclear | No | No | Not for business use |
| Copilot | for Microsoft 365 | No | Yes (Microsoft DPA) | Yes (EU Data Boundary, but Anthropic exception since Jan 2026) | Yes, with DPIA + configuration |
| Claude | Free / Pro | Yes (Free), Opt-out (Pro) | No standard DPA | No | Not for business use |
| Claude | Team / API | No | Yes | Partial | Conditional |
| Gemini | Consumer | Yes (prompts stored 18 months, 72h even when “off”) | No | No | Not for business use |
| Gemini | Workspace Enterprise | No | Yes | Yes | Conditional |
| Perplexity | All tiers | Unclear (class action for sharing data with Google/Meta) | No | No | Not recommended |
| DeepSeek | All tiers | Yes | No | No (China) | Absolutely not |
As of April 2026. Providers change terms frequently. Verify current DPA availability directly with each vendor.
This table is the starting point. But a table alone is not enough to make the right decisions. The details matter.
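The table's four criteria boil down to a short decision sequence, which can be sketched as a small helper. This is a rough illustration only: the field names and verdict strings are my own labels for the table's columns, not legal categories, and the example tool rows simply restate the table above.

```python
from dataclasses import dataclass

@dataclass
class AiTool:
    name: str
    trains_on_inputs: bool   # "Data used for training?" column
    dpa_available: bool      # "DPA available?" column
    eu_residency: bool       # "EU data residency?" column

def business_use_verdict(tool: AiTool, processes_personal_data: bool) -> str:
    """Rough verdict mirroring the order the article applies the criteria in."""
    if tool.trains_on_inputs:
        return "not for business use"
    if processes_personal_data and not tool.dpa_available:
        return "not for business use"  # GDPR Art. 28 requires a DPA
    if not tool.eu_residency:
        return "conditional (SCCs + TIA needed for non-EU transfer)"
    return "possible with configuration (retention, SSO, permissions)"

# Example rows taken from the table above
chatgpt_plus = AiTool("ChatGPT Plus", trains_on_inputs=True, dpa_available=False, eu_residency=False)
chatgpt_team = AiTool("ChatGPT Team", trains_on_inputs=False, dpa_available=True, eu_residency=False)
chatgpt_ent = AiTool("ChatGPT Enterprise", trains_on_inputs=False, dpa_available=True, eu_residency=True)

print(business_use_verdict(chatgpt_plus, processes_personal_data=True))
print(business_use_verdict(chatgpt_team, processes_personal_data=True))
print(business_use_verdict(chatgpt_ent, processes_personal_data=True))
```

The ordering matters: training usage and a missing DPA are hard stops, while missing EU residency only adds conditions. That is exactly why the same product can land in three different rows of the table depending on the tier.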
ChatGPT: The Biggest Gap Is Not the Technology
ChatGPT is by far the most widely used AI tool. Bitkom data shows that 43 percent of private users rely on ChatGPT, and business usage follows a similar pattern. The problem: most people are on the wrong tier.
Free and Plus: No DPA, No Business Use
The free version and ChatGPT Plus ($20/month) do not offer a Data Processing Agreement. Without a DPA, you are missing the contractual basis required by GDPR Art. 28. That is not a recommendation. It is law.
On top of that, OpenAI uses inputs from the free version to train its models by default. Plus lets you toggle that off. But a toggle is not a contract.
In practice it looks like this: An employee opens ChatGPT Plus on a work laptop, pastes in a client inquiry, and gets a useful summary back. Technically, everything worked. Legally, a GDPR violation just happened.
I wrote a detailed breakdown in Is ChatGPT GDPR Compliant?, including the full compliance checklist for businesses.
Team: The Entry Point, with Caveats
Starting at the Team tier ($30/user/month), OpenAI provides a DPA. Model training with your data is disabled by default. Those are the two baseline requirements.
What is missing: EU data residency. Your inputs are processed in the US. That is not an automatic dealbreaker if the Standard Contractual Clauses (SCCs) in the DPA are properly structured and you have completed a Transfer Impact Assessment (TIA). But it adds compliance overhead that many companies underestimate.
For a small firm with five employees using ChatGPT for internal research (no client data in prompts), the Team tier can work. Once personal data enters the picture, the legal ground gets thin.
Enterprise: The Highest Standard at OpenAI
Since February 2025, OpenAI has offered EU Data Residency. Data is stored in European data centers and, since January 2026, processed there as well (in-region GPU inference). This is only available for Enterprise, Edu, and the API platform.
ChatGPT Enterprise covers what the GDPR requires: DPA, no training, EU data residency. Admin controls, SSO integration, and configurable retention periods are included too.
But. Enterprise is not automatically compliant. You need to configure it. Set the retention periods. Set up SSO. Structure the permissions. The tool provides the prerequisites. You provide the compliance.
The Gap That Exists in Practice
The distance between “my assistant uses ChatGPT Plus” and “we have ChatGPT Enterprise with a DPA and EU data residency” is enormous. Most law firms and tax offices I see in my network have the first. Not the second.
It is not about ignorance. Nobody asked the question. The employee found a useful tool. Management knows about it, says nothing against it. And nobody checks whether the license matches the compliance requirements.
The Italian data protection authority fined OpenAI 15 million euros in December 2024. Among the reasons: lack of legal basis for processing training data. The risk is real.
Microsoft Copilot: Not an External Tool, It Is Inside Your Environment
Copilot for Microsoft 365 is a different conversation than ChatGPT. ChatGPT is an external tool employees open in a browser. Copilot sits inside Word, Excel, Outlook, Teams, and SharePoint. It has access to everything the user has access to.
That makes it powerful. And it makes the GDPR requirements more concrete than with any other AI tool.
Microsoft Is Forcing Adoption
On April 15, 2026, Microsoft removed free Copilot Chat from Office apps. If you want Copilot in Microsoft 365, you now need a paid license. Microsoft is serious about this.
That means companies that were casually using Copilot now face a decision. Either buy a Copilot license or deliberately turn it off. Both require a compliance assessment.
What Works in Copilot’s Favor
Copilot for Microsoft 365 has some real advantages. The Microsoft DPA (Data Processing Addendum) serves as the Data Processing Agreement and is included with all Business and Enterprise licenses. Microsoft also introduced the EU Data Boundary, which keeps data within the EU.
No training: Microsoft does not use your data to train models. That is contractually guaranteed.
Where It Gets Complicated
The DSK (the conference of German data protection authorities) reviewed Copilot for Microsoft 365 and rated its transparency as insufficient. That is not a ban, but it is a clear signal.
Since January 2026, Anthropic (the company behind Claude) has been listed as a Microsoft subprocessor. This means that when you use Copilot, certain requests may be routed to an Anthropic model. And Anthropic is explicitly excluded from the EU Data Boundary.
Read that again: Part of your Copilot data can leave the EU. Not because Microsoft wanted it that way, but because the subprocessor does not offer EU infrastructure.
Then there is oversharing. Microsoft itself names it as the number one risk with Copilot deployments. Copilot searches everything a user has access to. If your SharePoint permissions are not clean (and at most companies, they are not), Copilot surfaces files the user was never supposed to see.
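The oversharing mechanism is easy to make concrete: Copilot inherits whatever access the user already has, so the problem is documents whose sensitivity does not match the breadth of their permissions. Here is a minimal sketch of that cross-check. All paths, labels, and group names are hypothetical; a real audit would pull this data from SharePoint admin tooling rather than a hardcoded list.

```python
# Hypothetical permission inventory: (path, sensitivity label, groups with access)
documents = [
    {"path": "/clients/mandate-2024.docx", "label": "confidential", "groups": ["Everyone"]},
    {"path": "/hr/salaries.xlsx", "label": "confidential", "groups": ["HR"]},
    {"path": "/marketing/flyer.pdf", "label": "public", "groups": ["Everyone"]},
]

# Broad groups whose access Copilot would inherit for nearly every user
BROAD_GROUPS = {"Everyone", "All Employees"}

def oversharing_findings(docs):
    """Flag confidential documents that broad groups can read."""
    return [
        d["path"]
        for d in docs
        if d["label"] == "confidential" and BROAD_GROUPS & set(d["groups"])
    ]

print(oversharing_findings(documents))  # ['/clients/mandate-2024.docx']
```

The point of the sketch: none of these files were "leaked" in the classic sense. The permissions were simply too wide, and before Copilot nobody noticed, because nobody went looking.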
I covered the five most important GDPR questions before a Copilot rollout in a separate post. That includes the DPIA requirement, Sensitivity Labels, and Section 203 of the German Criminal Code.
What You Need Before Activation
Before Copilot goes live in your organization, you need:
- A DPIA under GDPR Art. 35. AI-powered processing of personal data at this scale requires one. Full stop.
- A SharePoint permissions audit. Who has access to what? Before Copilot goes live, not after.
- Sensitivity Labels. Confidential documents need to be classified so Copilot handles them accordingly.
- An assessment of the subprocessor chain. Anthropic, OpenAI, and whoever gets added next. Document which data flows to which subprocessors.
That is work. But it is doable. A realistic timeline for a firm with 15 to 30 employees: two to four weeks if you approach it systematically.
Google Gemini: The 72-Hour Trap
Gemini is Google’s AI assistant. Bitkom data puts usage at 22 to 28 percent. The consumer version is free, built into Google accounts. And that is precisely the problem.
Consumer Version: More Storage Than Expected
Google stores prompts from the consumer version for 18 months by default. That alone is problematic. And it gets worse: even when you turn storage “off” in settings, Google retains prompts for 72 hours. During those 72 hours, human reviewers can view your inputs.
For business use with personal data, that is not workable. No DPA, no controllable retention, human reviewers with access to your prompts.
Gemini in Google Workspace Enterprise
The Enterprise version looks different. Google provides a DPA, commits contractually to not using Workspace data for training, and offers EU data residency.
Still: Most companies using Gemini are not on the Enterprise version. They are using the consumer version through a personal Google account. On a work laptop. During business hours. With client data in the prompt.
Claude, Perplexity, DeepSeek: What the Table Shows
These three tools barely register in Bitkom usage data (under 2 percent each in the German market). They are still relevant because they get attention in the press.
Claude by Anthropic is technically strong. For GDPR purposes, what matters more right now: Anthropic has been a Microsoft subprocessor since January 2026. If your employees use Copilot, data may flow to Anthropic without you directly using Claude at all. The Team and API tiers offer a DPA and do not train on your data, but EU data residency is only partially available.
Perplexity has an active class action lawsuit alleging that user data was shared with Google and Meta. No DPA available, no EU data residency. Not recommended for business use involving personal data.
DeepSeek stores all data in China. No DPA that meets European standards. No adequacy decision for China. No transfer mechanism that satisfies GDPR requirements. Stay away.
Which Platforms Can Be Made GDPR-Compliant?
The short answer: Four platforms can, in principle, be configured for GDPR-compliant use.
- ChatGPT Enterprise (with EU Data Residency and DPA)
- Copilot for Microsoft 365 (with DPIA, permissions audit, and subprocessor assessment)
- Claude Team / API (with DPA, but limited EU data residency)
- Gemini Workspace Enterprise (with DPA and EU data residency)
Four out of eleven variants in the table above. And none of them are compliant out of the box.
Here is what many people miss: No AI tool is compliant by itself. Compliance comes from how you deploy it. The license, the configuration, your internal policies, the DPIA, the documentation.
A tool on the “yes” side of the table means you can use it compliantly. Whether you actually do is a different question.
What You Should Do Now
Before you pick an AI tool or keep using the one you have, there are a few things to sort out.
Start with an inventory. What are your employees already using? In most companies, the answer is not “nothing.” The answer is “ChatGPT Free in the browser, maybe Copilot, and nobody asked whether that is okay.”
Then figure out what data actually flows into these tools. Internal research without personal data is less critical than client inquiries with names, addresses, and tax IDs. The use case determines the requirements.
And finally: does the license match what you are doing? If personal data is being processed, you need a DPA. In a regulated industry (law firm, tax office, financial advisory), you also need a DPIA and most likely EU data residency.
None of this is a massive project. But it will not happen on its own.
Not sure which AI tool is right for your business? I offer a free 30-minute initial consultation where we review your current situation. No sales pitch, just an honest assessment of where you stand and what needs to happen.