AI and Client Data: What §203 StGB Means for Your Firm

Client data and AI tools — why Germany's §203 StGB goes beyond GDPR and what tax advisors, lawyers, and firms need to know before using ChatGPT.

AI in Professional Firms — the Problem Nobody Talks About

AI can fundamentally change how firms operate. Reviewing invoices, summarizing contracts, triaging client inquiries — it sounds like a massive efficiency gain. And it is.

But client data is not ordinary data.

I see this regularly: tax advisors and lawyers who want to use ChatGPT for their daily work. Who’ve looked into GDPR. Who think that covers it. It doesn’t.

Because client data and AI tools don’t just touch GDPR — they touch criminal law. Specifically: §203 of the German Criminal Code (StGB). And that’s a distinction many only understand when it’s too late.

What §203 StGB Has to Do with AI and Client Data

Tax advisors, lawyers, and auditors are professional secret-keepers under German law. Everything a client entrusts to them is protected by criminal-law professional secrecy.

§203 StGB states in essence: anyone who, as a professional secret-keeper, unlawfully discloses another person’s secret commits a criminal offense. The penalty: up to one year of imprisonment or a fine.

This is not an administrative violation like a GDPR infringement. This is a criminal offense. The difference is significant.

And “disclose” doesn’t mean someone actively reads the data. It’s enough that an unauthorized third party gains access. Technically speaking: that the data leaves the firm’s protected sphere.

Why Cloud AI Counts as a Third Party

When an employee enters client data into ChatGPT, here’s what happens: the data is transmitted to OpenAI — a US company. OpenAI processes this data on its servers. With the free version and Plus/Pro tiers, inputs are used for model training by default. A Data Processing Agreement (DPA) is not available on these tiers.

Even with the Business and Enterprise tiers, which have offered a DPA since 2024, the §203 question remains. Because GDPR compliance and §203 compliance are two separate requirements.

The 2017 reform introduced §203 paragraph 3. Since then, professional secret-keepers may share data with “other contributors” — under strict conditions:

  • The service provider must be contractually bound to secrecy.
  • The firm must maintain oversight.
  • Access must be necessary for delivering the service.
  • The service provider must be informed about the criminal consequences.

The critical question: does a US AI provider processing your inputs on global infrastructure meet these requirements? The honest answer: not in the default configuration.

What the 2017 Reform Actually Changed

Before 2017, the situation was simpler and more restrictive. Professional secret-keepers could not share client data with external service providers without client consent, period. IT outsourcing for law firms and tax advisors existed in a legal gray area.

The reform added paragraph 3 to Section 203 StGB. It created a legal path for sharing data with “mitwirkende Personen” (other contributors) without individual client consent. This was designed primarily for IT service providers, cloud hosting, and outsourced infrastructure.

The conditions are specific:

  1. The service provider must be contractually obligated to maintain secrecy. A standard terms-of-service agreement does not satisfy this. The contract must explicitly reference the secrecy obligation and its criminal-law consequences.
  2. The professional secret-keeper must supervise the service provider. This means active oversight, not a one-time vendor check. You need to know what the provider does with the data, where it is processed, and who has access.
  3. The data sharing must be necessary for the professional activity. Convenience does not count. If you can accomplish the same task without sharing client data with a third party, the necessity argument weakens.
  4. The service provider must be informed about the criminal consequences of a breach. This notification must be documented.

AI providers generally satisfy none of these conditions in their default configuration. OpenAI’s business terms address GDPR obligations. They do not reference German criminal law. Microsoft’s DPA covers data processing under GDPR. It does not include a Section 203 secrecy clause.

That does not mean you cannot use these tools. It means additional contractual work is required, and the default DPA is not enough by itself.

The Dual Compliance Problem for Tax Advisors and Law Firms

This is where it gets uncomfortable in practice for firms. You need both simultaneously:

GDPR compliance requires, among other things, a Data Processing Agreement per Article 28, a Data Protection Impact Assessment for high-risk processing, Standard Contractual Clauses for third-country transfers, and a documented legal basis.

§203 StGB compliance additionally requires a service provider agreement with a secrecy obligation, notification of the service provider about criminal consequences, confirmation that data sharing is necessary, and ongoing due diligence in oversight.

Most AI providers address GDPR. Almost none address §203 StGB. This isn’t malicious — it’s because German professional secrecy is a peculiarity that simply doesn’t exist in international markets.

Italy’s data protection authority Garante fined OpenAI EUR 15 million in December 2024 — for GDPR violations. The EDPB ChatGPT Taskforce also documented fundamental data protection concerns in its May 2024 report. That only covers the GDPR layer. For professional secret-keepers in Germany, the §203 layer comes on top.

What Compliant AI Usage Looks Like in a Firm

AI is not prohibited for firms. But the requirements are higher than for a regular business. Anyone who wants to process client data with AI needs a solution that covers both compliance layers.

In practice, that means:

EU-hosted infrastructure. Data must not leave European jurisdiction. ChatGPT offers EU Data Residency only for Enterprise and Education tiers and via the API — not for standard tiers.

PII redaction before processing. Before client data reaches an AI model, personally identifiable information must be redacted. Names, tax IDs, case numbers — anything that could identify a person.

Dedicated instance instead of multi-tenant. A firm working with client data should not run on the same infrastructure as millions of other users.

DPA plus §203 service provider agreement. Both contracts. Not just one.

Full audit trail. Who submitted which query, and when? This must be documented and traceable.

Staff training. Everyone in the firm needs to know which data must not be entered into AI tools. It sounds obvious — in practice, it’s the most common weak point.
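Two of the safeguards above — PII redaction and the audit trail — can be sketched in code. The following is a minimal illustration, not a production system: the regex patterns are simplistic placeholders (real deployments would use a proper NER/PII pipeline, and German names and case numbers need far more robust detection), and the file-based audit log is an assumption for demonstration purposes.

```python
import hashlib
import json
import re
from datetime import datetime, timezone

# Illustrative patterns only. A production redaction layer needs a real
# PII-detection pipeline; these regexes cover only trivially structured data.
PII_PATTERNS = {
    "TAX_ID": re.compile(r"\b\d{2}\s?\d{3}\s?\d{3}\s?\d{3}\b"),  # 11-digit German tax ID
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}(?:\s?[A-Z0-9]{4}){3,7}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> tuple[str, dict[str, int]]:
    """Replace matched PII with placeholders; return redacted text and match counts."""
    counts: dict[str, int] = {}
    for label, pattern in PII_PATTERNS.items():
        text, n = pattern.subn(f"[{label}]", text)
        if n:
            counts[label] = n
    return text, counts

def log_query(user: str, prompt: str, audit_file: str = "audit.jsonl") -> str:
    """Redact, then append an audit record: who queried, when, and what was stripped."""
    redacted, counts = redact(prompt)
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        # Hash of the raw prompt for integrity checks -- never store
        # the raw client data itself in the audit log.
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "redactions": counts,
    }
    with open(audit_file, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return redacted  # only the redacted text is forwarded to the AI model
```

The design point: redaction happens before anything leaves the firm’s sphere, and the audit record captures who, when, and what category of data was stripped — without the log itself becoming a second copy of the secret.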

A Practical Decision Tree for Your Firm

Before using any AI tool with client data, walk through this checklist:

Step 1: Does the tool have a DPA? If no, stop. The tool is not suitable for client data under GDPR Article 28. Free ChatGPT, Plus, and Pro do not have DPAs.

Step 2: Does the DPA include or can it be supplemented with a Section 203 secrecy clause? If no, you need a supplementary agreement. Contact the provider or your legal advisor. Without this, the GDPR layer is covered but the criminal law layer is not.

Step 3: Where is data processed? If outside the EU, you need Standard Contractual Clauses and a Transfer Impact Assessment. For Section 203 data, consider whether US processing is acceptable at all given the sensitivity. EU data residency is strongly preferable.

Step 4: Is data used for model training? If yes at any tier, that tier is not suitable for client data. Training means your inputs are used to improve the model, which means they persist in the system in some form. Team, Business, and Enterprise tiers at OpenAI do not train on your data by default. Verify this in writing.

Step 5: Can you implement PII redaction? Before client data reaches the AI model, personally identifiable information should be stripped. Names, tax IDs, case numbers, addresses. The AI does not need to know that the tax assessment belongs to Hans Mueller at Rosenstrasse 12. It needs the numbers and the question.

Step 6: Do you have an internal policy and training? The best technical safeguards fail if staff do not know the rules. A 15-minute briefing with specific examples is worth more than a 40-page policy nobody reads.

If you can answer yes to all six steps, you have a defensible compliance posture for both GDPR and Section 203. If you cannot, identify the gaps and close them before processing client data.
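The six steps above lend themselves to a simple internal checklist tool. The sketch below is illustrative — the field names and the `AIToolProfile` structure are my own invention, not drawn from any provider’s contract or API — but it shows how a firm might make the gaps explicit rather than relying on a mental walkthrough.

```python
from dataclasses import dataclass

# Hypothetical tool profile -- field names are illustrative assumptions.
@dataclass
class AIToolProfile:
    has_dpa: bool                 # Step 1: Data Processing Agreement in place
    has_203_clause: bool          # Step 2: Section 203 secrecy supplement signed
    eu_processing_only: bool      # Step 3: EU processing (or SCCs + TIA otherwise)
    trains_on_inputs: bool        # Step 4: inputs used for model training
    pii_redaction_in_place: bool  # Step 5: redaction layer before the model
    staff_trained: bool           # Step 6: internal policy and training exist

def compliance_gaps(tool: AIToolProfile) -> list[str]:
    """Walk the six-step checklist and return every unmet requirement."""
    checks = [
        (tool.has_dpa, "Step 1: no DPA -- unsuitable under GDPR Art. 28"),
        (tool.has_203_clause, "Step 2: no Section 203 secrecy agreement"),
        (tool.eu_processing_only, "Step 3: non-EU processing without safeguards"),
        (not tool.trains_on_inputs, "Step 4: inputs used for model training"),
        (tool.pii_redaction_in_place, "Step 5: no PII redaction layer"),
        (tool.staff_trained, "Step 6: no internal policy or staff training"),
    ]
    return [msg for ok, msg in checks if not ok]
```

An empty result means every step is answered yes; anything else is the list of gaps to close before client data is processed.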

What Enforcement Looks Like

The Italian Garante’s EUR 15 million fine against OpenAI in December 2024 targeted GDPR violations specifically: lack of legal basis for processing, insufficient transparency to users, and inadequate age verification. That was a GDPR-only enforcement action.

Section 203 enforcement works differently. It is a criminal offense prosecuted under German criminal law, not an administrative fine issued by a data protection authority. A client who believes their data was disclosed unlawfully can file a criminal complaint (Strafanzeige). The public prosecutor investigates. If charges are brought, the professional faces criminal proceedings.

This is rare but not theoretical. Most Section 203 cases involve deliberate disclosure or gross negligence rather than AI tool misconfiguration. But the law does not distinguish between intentional and negligent disclosure. If client data reaches an unauthorized third party because the firm used an improperly configured AI tool, the firm is exposed.

The practical risk is not that prosecutors are scanning ChatGPT logs. The practical risk is that a client dispute, a disgruntled employee, or a data breach investigation reveals that client data was processed without adequate safeguards. At that point, the question is not whether the firm intended to violate Section 203. The question is whether the firm had documented safeguards in place.

Want to know where your firm stands? My free AI Compliance Check assesses in 2 minutes whether your AI usage meets the requirements.

The Path Forward

In forums and conversations, I keep hearing the same question: “How do you handle data protection? You can’t just hand client data to some US company.” The question is valid. And the answer is not: avoid AI. The answer is: use AI correctly.

The firms that are already doing this well don’t use consumer tools with a training toggle switched off. They use purpose-built, EU-hosted solutions with proper contracts — DPA and §203 agreement. They have clear internal rules about what may and may not be entered into an AI tool. And they have someone who regularly verifies compliance.

AI will change how firms operate. The question is not whether, but how. And “how” means for professional secret-keepers: with a solution that satisfies both GDPR and §203 StGB.

I’ve spent the majority of my career protecting sensitive data in environments where mistakes weren’t an option. The requirements for professional firms are no lower. But they are solvable — if you ask the right questions from the start.


Where does your firm stand? Take the free AI Compliance Check — 7 questions, 2 minutes, instant assessment and actionable recommendations.