The Real Cost of Doing Nothing About AI

GDPR fines, criminal liability under Section 203 StGB, EU AI Act sanctions. What AI compliance inaction actually costs.

Most firms I talk to think about compliance costs first. What does an audit cost? What does configuration cost? What does training cost? Fair questions. But they’re doing the wrong math.

The right question is: What does it cost if you do nothing?

I see this pattern constantly. Managing partners and firm owners who push AI compliance to “someday.” Not out of ignorance, but because other things feel more urgent. Client acquisition. Daily operations. Staffing. AI compliance lands on the list but never rises to the top.

Then something happens. An employee pastes client data into ChatGPT. Or the regulator announces an industry-wide audit. Or a client asks how the firm handles their data when AI tools are in play. Suddenly “someday” is today.

The GDPR Side: Article 83

The fine framework is in the law. Article 83(5) GDPR allows penalties up to EUR 20 million or 4% of global annual turnover, whichever is higher. That covers serious violations: processing principles, data subject rights, third-country transfers.

For a missing data protection impact assessment (DPIA) under Article 35, the ceiling is EUR 10 million or 2% of turnover. And a DPIA is practically always required for AI tools that process personal data.
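The two-tier ceiling follows a simple rule: take the fixed amount or the turnover percentage, whichever is higher. A minimal sketch (the function name and the turnover figures are hypothetical illustrations, not part of the regulation):

```python
def gdpr_fine_ceiling(annual_turnover_eur: float, severe: bool) -> float:
    """Upper bound of a GDPR fine under Article 83.

    severe=True  -> Art. 83(5): EUR 20M or 4% of global annual turnover
    severe=False -> Art. 83(4): EUR 10M or 2% (e.g. a missing DPIA)
    Whichever amount is higher applies.
    """
    fixed, pct = (20_000_000, 0.04) if severe else (10_000_000, 0.02)
    return max(fixed, pct * annual_turnover_eur)

# Hypothetical mid-sized firm, EUR 5M turnover: the fixed amount dominates.
print(gdpr_fine_ceiling(5_000_000, severe=True))      # 20000000.0
# Hypothetical large enterprise, EUR 1B turnover: the percentage dominates.
print(gdpr_fine_ceiling(1_000_000_000, severe=True))  # 40000000.0
```

The point of the max() is easy to miss: for small firms the fixed ceiling always applies, while for large groups the percentage quickly exceeds it.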

These are maximum amounts. But real fines are being imposed. Italy’s Garante fined OpenAI EUR 15 million. Spain’s AEPD fined CaixaBank EUR 6 million for processing customer data without a valid legal basis.

For a firm with five or fifteen employees, EUR 20 million is theoretical. But EUR 50,000 or EUR 100,000 is not. And regulators have named AI processing as an audit priority. The DSK (the conference of Germany’s data protection authorities), the BfDI, and the state data protection authorities are paying attention.

Shadow AI as a Data Breach

IBM’s Cost of a Data Breach Report 2024 puts the average cost of a data breach at USD 4.88 million globally. Germany tracks similarly.

Now imagine this: an employee copies client data into the free version of ChatGPT. Draft contracts, financial data, names and addresses. The free version uses inputs for training by default. That data becomes part of the training dataset. Without a data processing agreement, without a legal basis, without your control.

That is a data breach. And it triggers obligations. Article 33 GDPR requires notification to the supervisory authority within 72 hours. Article 34 may require notifying the affected individuals. Those are your clients.
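The 72-hour window is unforgiving: it runs from the moment the firm becomes aware of the breach, not from the next business day. A sketch with a hypothetical discovery timestamp:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical discovery time; under Art. 33 GDPR the clock starts
# when the controller becomes aware of the breach.
discovered = datetime(2025, 3, 3, 14, 30, tzinfo=timezone.utc)
notify_by = discovered + timedelta(hours=72)

print(notify_by.isoformat())  # 2025-03-06T14:30:00+00:00
```

A breach discovered on a Friday afternoon therefore requires notification by Monday afternoon, weekend or not.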

The consequences go beyond fines. Reputation damage when clients learn their data ended up in a US AI model. Potential liability claims. And the cost of damage control: forensic analysis, notification procedures, legal fees, crisis response.

I wrote in detail about why shadow AI in companies is the largest uncontrolled risk. The short version: if you don’t know what your employees are using, you can’t secure it.

The Criminal Dimension: Section 203 StGB

For professionals under secrecy obligations, there is another layer that goes beyond fines. Section 203 StGB protects professional secrets. Lawyers, tax advisors, auditors, physicians. Anyone who unlawfully discloses secrets entrusted to them in their professional capacity commits a criminal offense.

The penalty: up to one year of imprisonment or a criminal fine.

If an employee at your firm enters client data into an AI tool that has no proper data processing agreement and processes data outside the EU, that can constitute unauthorized disclosure under Section 203. Whether the employee knew this or not is secondary when it comes to the firm’s organizational responsibility. The duty to ensure proper safeguards lies with you.

In the worst case, it is not just a fine at stake. It is your professional license.

EU AI Act: Sanctions From August 2026

The EU AI Act brings its own sanctions regime. Article 99 allows fines up to EUR 35 million or 7% of global annual turnover for the most serious violations.

The AI literacy obligation under Article 4 has been in effect since February 2, 2025. National enforcement begins August 2, 2026. Specific fine amounts for the literacy obligation alone are not yet set, but the law gives national supervisory authorities the latitude to impose them.

And depending on how your firm uses AI, certain applications may fall under the high-risk classification. Legal advisory, credit scoring, AI-assisted personnel decisions. That triggers additional obligations: risk management system, data governance, technical documentation, transparency requirements. Non-compliance puts you in the enforcement zone from August 2026.

The Counter-Calculation: What Compliance Costs

Now the other side. A Copilot Compliance Audit, a systematic review of your AI usage for GDPR and EU AI Act conformity, costs a fraction of a single fine.

I am talking about an investment that wraps up in days. Not months. Create the DPIA, review permissions, configure sensitivity labels, document the usage policy, conduct staff training. That is manageable.

Compare that with a single data breach. 72-hour notification obligation. Legal counsel. Forensic analysis. Client notification. Reputation management. Possibly a regulatory proceeding. Possibly a fine.

The math is not close. The cost of inaction is orders of magnitude higher than the cost of preparation.

And that does not even factor in the productivity gain. Companies that introduce AI properly, with compliance architecture, approved tools, and trained staff, benefit from AI safely, under control, and without an incident hanging over them like the sword of Damocles.

What This Means for You

If you currently have AI tools in use without a DPIA, without a usage policy, without competency documentation: you are operating in a space where risks are accumulating. Not theoretically. Regulators have announced that AI processing is an audit priority. The enforcement clock is already running.

If you don’t have AI tools in use yet, but have employees who privately use ChatGPT: you probably already have shadow AI in your organization. And with it exactly the risks I described above. You just don’t know it yet.

In both cases, the answer is the same: inventory, assessment, safeguards. This is not a major project. It is a structured process that closes the biggest gaps in a matter of days.

I offer a free 30-minute assessment. No sales pitch. Just an honest evaluation of where you stand and what would make sense as a next step.

Book a consultation: 30-minute AI compliance assessment


Jose Lugo is a CISSP-certified AI compliance consultant based in Germany. He helps tax advisors, law firms, and financial services firms deploy AI tools in compliance with GDPR. Learn more at joselugo.de/en.