Can Tax Advisors in Germany Use ChatGPT With Client Data?

Can German tax firms legally use ChatGPT with client data? What you need to know about GDPR, professional secrecy, and which AI license actually keeps you compliant.

Short answer: yes. But not with every license, not with every setup, and not without preparation.

The question keeps coming up in industry forums and on LinkedIn. Tax advisors in Germany know AI tools save time. They also know that client data (Mandantendaten) sits under stricter legal protection than ordinary business data. So nothing happens.

It doesn’t have to stay that way. There is a path to using ChatGPT and Microsoft Copilot in a tax practice. It requires the right license, the right configuration, and a few documented decisions.

What Section 203 StGB Means for Your Practice

Steuerberater in Germany are subject to criminal secrecy obligations under Section 203 of the German Criminal Code. This goes beyond GDPR. While GDPR carries administrative fines, Section 203 carries potential imprisonment up to one year or a criminal fine.

In practice: if client data flows into an AI tool without adequate safeguards, the firm risks more than a regulatory penalty. It risks a criminal complaint.

This applies to any form of input. Client names in prompts. Tax assessments uploaded for analysis. Contract text pasted for summarization. The moment personal client data leaves the firm’s controlled systems, both Section 203 and GDPR apply simultaneously.

Which ChatGPT License Works?

OpenAI offers the Data Processing Agreement (DPA) required by GDPR Article 28 only for certain license tiers:

  • Free, Plus, Pro: No DPA. Not suitable for business use with personal data.
  • Team: DPA available. No training on your data. Entry-level option for firms. Data processed in the US.
  • Enterprise: DPA available. EU data residency option. No training. Highest compliance standard.

Without a DPA, the contractual basis is missing. This is true even if you toggle off training in the settings. The toggle is a technical feature. The DPA is the legal requirement.

For tax practices handling client data, the Team license is the minimum. Anything below that is personal use.

The Data Residency Detail Most Firms Miss

Even with a Team license and a DPA, there is a data residency question. The Team tier processes data in the US. That means Standard Contractual Clauses (SCCs) apply for the cross-border transfer, and your firm needs a Transfer Impact Assessment (TIA) documenting why you consider the transfer adequate given the risks.

OpenAI’s DPA includes SCCs, but the TIA is your responsibility as the controller. It does not need to be a fifty-page document. It needs to cover four points: what data transfers to the US, what legal protections apply there, what technical safeguards are in place, and why you’ve concluded the risk is acceptable.

For Enterprise tier, OpenAI offers EU data residency with in-region GPU inference. Prompts and completions stay within the EU. If your practice handles high volumes of client data or processes special categories under GDPR Article 9, Enterprise is worth the cost difference. The compliance posture is substantially stronger.

What a Section 203-Compliant ChatGPT Setup Looks Like

Meeting the GDPR requirements is necessary but not sufficient. Section 203 adds requirements that go beyond what a DPA covers:

  • Contractual secrecy obligation. Under Section 203 paragraph 3 StGB (the 2017 reform), firms may share data with “other contributors” if the provider is contractually bound to secrecy. Your DPA alone may not include this clause. You may need a supplementary agreement explicitly referencing Section 203.
  • Notification of criminal consequences. The service provider must be informed that a violation of confidentiality carries criminal penalties under German law. This is not standard language in a US company’s DPA.
  • Necessity requirement. Data sharing must be necessary for delivering the service. “Convenient” is not “necessary.” Document why AI processing of specific data types is required for your practice operations.
  • Ongoing oversight. Your firm must maintain oversight of the provider’s handling of the data. In practice, this means periodic reviews of OpenAI’s terms, subprocessor lists, and data handling practices.

The gap between GDPR compliance and Section 203 compliance is real. Most international AI providers address GDPR because it is a global standard. Almost none address Section 203 because German professional secrecy is a legal concept that does not exist in most other jurisdictions.

What Does Microsoft Copilot Change?

Many tax firms already use Microsoft 365. Copilot licenses are increasingly being offered. The temptation is to simply activate them.

Copilot sits inside your M365 environment with access to SharePoint, OneDrive, email, and Teams. That’s a different risk profile than an external tool like ChatGPT. Copilot searches everything a user can access. If client files sit in SharePoint with loose permissions, those files will appear in Copilot responses.

Three things need to be in place before activation:

  1. Sensitivity labels: Client data must be classified as confidential. Without labels, Copilot treats every document identically.
  2. Permissions audit: Who has access to which SharePoint folders? Most firms haven’t reviewed this in years.
  3. DPIA: GDPR Article 35 requires a data protection impact assessment for AI-driven processing. Microsoft provides a template but explicitly states the controller must complete their own.
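The permissions audit in step 2 does not need special tooling to get started. A minimal sketch, assuming you have exported a SharePoint permissions report to CSV (the column names and group names here are illustrative, not a fixed Microsoft format):

```python
import csv
import io

# Groups whose presence on a client-data folder signals overly broad access.
# These names are examples; adjust them to your tenant's conventions.
BROAD_GROUPS = {"Everyone", "Everyone except external users", "All Company"}

def flag_broad_access(report_csv: str) -> list[str]:
    """Return folders from a permissions export that grant access to broad groups.

    Expects a CSV with (hypothetical) columns: Folder, Principal, Permission.
    """
    flagged = []
    for row in csv.DictReader(io.StringIO(report_csv)):
        if row["Principal"] in BROAD_GROUPS:
            flagged.append(f'{row["Folder"]} -> {row["Principal"]} ({row["Permission"]})')
    return flagged

# Example export: one client folder, one problematic entry.
report = """Folder,Principal,Permission
Clients/Mandant-4711,Everyone,Read
Clients/Mandant-4711,TaxTeam,Edit
"""
print(flag_broad_access(report))
# Every flagged folder is one that Copilot would happily surface to any user.
```

Anything this script flags is exactly the material Copilot will pull into answers for users who were never meant to see it. Fix the permissions before activation, not after.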

Do I Need a DPIA?

Yes. GDPR Article 35 requires a data protection impact assessment when processing is likely to result in a high risk to the rights and freedoms of data subjects. AI-driven processing of client data meets that threshold.

The EDPB’s ChatGPT Taskforce report (May 2024) made clear that firms cannot shift compliance responsibility to OpenAI or Microsoft through terms and conditions. The firm remains the controller.

A DPIA for a tax practice covers:

  • Which data flows through the AI tool
  • What risks exist for clients (disclosure, unauthorized access, third-country transfers)
  • What safeguards mitigate those risks
  • Whether the data protection officer was consulted

Your data protection supervisory authority can request this document at any time. If it doesn’t exist, there’s no compliance record.

How to Write a DPIA That Actually Works

A DPIA does not need to be a legal masterpiece. It needs to be accurate, complete, and honest about residual risk. Here is a practical structure for a tax practice:

Section 1: Description of processing. State what the AI tool does. Example: “ChatGPT Team is used by three staff members to draft client correspondence, summarize tax assessments, and generate internal research notes. Client-identifying data is redacted before input where possible.”

Section 2: Necessity and proportionality. Why is this processing necessary? What legitimate interest does the firm have? For most practices, the answer is operational efficiency and service quality. Document that you considered alternatives (hiring additional staff, using non-AI tools) and explain why AI processing is proportionate to the purpose.

Section 3: Risk assessment. Be specific. The risks for a tax practice include: unauthorized disclosure of Mandantendaten, cross-border transfer to the US without adequate safeguards, potential use of data for model training on incorrect license tiers, and employees entering data into unauthorized tools. Rate each risk for likelihood and severity.
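The “likelihood and severity” rating can stay as simple as a two-factor score. A minimal sketch, where the 1–3 scale, the band thresholds, and the example ratings are illustrative choices, not anything GDPR prescribes:

```python
# Two-factor risk scoring: score = likelihood * severity, each rated 1-3.
# The (likelihood, severity) values below are illustrative placeholders;
# your DPIA must justify its own ratings.
RISKS = {
    "Unauthorized disclosure of Mandantendaten": (2, 3),
    "US transfer without adequate safeguards":   (2, 2),
    "Training on client data (wrong tier)":      (1, 3),
    "Staff using unauthorized tools":            (3, 2),
}

def rate(risks: dict[str, tuple[int, int]]) -> dict[str, str]:
    """Map each risk to a band: high (score >= 6), medium (>= 3), else low."""
    bands = {}
    for name, (likelihood, severity) in risks.items():
        score = likelihood * severity
        bands[name] = "high" if score >= 6 else "medium" if score >= 3 else "low"
    return bands

for risk, band in rate(RISKS).items():
    print(f"{band:6} {risk}")
```

Every risk that lands in the “high” band needs a named mitigation measure in the next section; that one-to-one mapping is what an auditor will look for.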

Section 4: Mitigation measures. For each identified risk, document what you are doing about it. Enterprise license with EU data residency. Internal AI usage policy. Staff training. PII redaction procedures. Sensitivity labels on documents. Access controls in SharePoint.

Section 5: Residual risk. After all mitigation measures, some risk remains. Document it honestly. If the residual risk is acceptable given the benefits, say so and explain why. If it is not, you need additional measures or you should not proceed with the processing.

Section 6: DPO consultation. Document that your data protection officer was consulted and their opinion. If you do not have an internal DPO, document consultation with your external data protection advisor.

Microsoft provides a DPIA template specifically for Copilot. OpenAI provides one for ChatGPT Enterprise. Both are starting points, not finished documents. They need to be adapted to your firm’s specific situation, data types, and risk profile.
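The PII redaction listed among the mitigation measures can start small. A minimal sketch, assuming the firm maintains a list of client names and redacts the 11-digit German Steuer-ID by pattern; real deployments need more robust tooling than two regular expressions:

```python
import re

def redact(text: str, client_names: list[str]) -> str:
    """Redact known client names and 11-digit tax IDs before a prompt is sent.

    A first line of defense, not a guarantee: names missing from the list,
    addresses, and free-text identifiers pass through unredacted.
    """
    # The German Steuer-ID is 11 consecutive digits.
    text = re.sub(r"\b\d{11}\b", "[STEUER-ID]", text)
    for name in client_names:
        text = re.sub(re.escape(name), "[MANDANT]", text, flags=re.IGNORECASE)
    return text

prompt = "Draft a reply to Erika Musterfrau regarding Steuer-ID 12345678901."
print(redact(prompt, ["Erika Musterfrau"]))
# -> Draft a reply to [MANDANT] regarding Steuer-ID [STEUER-ID].
```

Wrap a helper like this around every approved entry point so staff do not have to remember to redact manually, and document the procedure (and its known gaps) in Section 4 of the DPIA.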

What Your Firm Should Do Now

Five concrete steps:

  1. Inventory: Which AI tools are your employees already using? Include privately installed apps and browser extensions. Shadow AI is the biggest risk.
  2. License check: Is a DPA in place? Is data used for training? Where is data processed?
  3. Internal policy: What can be entered into AI tools and what can’t? Be specific with examples, not just “use common sense.”
  4. DPIA: Document the processing, the risks, and the safeguards. Not a project. A document.
  5. Copilot configuration: If using M365 Copilot, set up sensitivity labels, audit permissions, enable DLP policies.

None of these steps take months. Most firms can complete them in two to four weeks once they know what to do.

What to Do Monday Morning

If you have read this far and your firm has not formalized any of the above, here is the priority order:

Week 1: Shadow AI inventory. Send a brief survey to all staff asking which AI tools they use for work. Include browser extensions and mobile apps. You will likely discover that ChatGPT is already being used with client data. That is not a disaster. It is the starting point for building proper controls.

Week 1: License decision. Based on the inventory, decide on a standard tool and license tier. For most small to mid-size tax practices, ChatGPT Team is the pragmatic starting point. Sign the DPA. Disable any personal accounts being used for work.

Week 2: AI usage policy. Write the internal policy. It does not need to be long. One to two pages covering: approved tools, what data types may be entered, what must be redacted, how to report incidents, and consequences for violations. Circulate it. Have every employee sign it.

Week 2-3: DPIA. Complete the data protection impact assessment using the structure above. Consult your DPO or external data protection advisor. File it where your compliance documentation lives.

Week 3-4: Copilot configuration (if applicable). If your firm uses M365 and Copilot is active or planned, audit SharePoint permissions, deploy sensitivity labels, and configure DLP policies. This is the most technical step and may require IT support if your firm does not have internal expertise.

Ongoing: Monitor and review. Set a calendar reminder to review OpenAI’s and Microsoft’s terms quarterly. Check subprocessor lists. Update the DPIA when your usage changes. Compliance is not a project with an end date. It is a practice.

The firms that are doing this well did not hire a team of consultants for six months. They made a few deliberate decisions, documented them, and moved on with their work. That is all this takes.


Not sure where your firm stands? The AI Compliance Check takes 2 minutes and shows where action is needed.

Prefer to talk directly? Book a free 30-minute consultation — no sales pitch, just an honest assessment.


Jose Lugo is a CISSP-certified AI compliance consultant with M365 Endpoint Administrator certification. He advises tax advisors and law firms in Germany on GDPR-compliant use of AI tools.