5 GDPR Questions Your Firm Must Answer Before Rolling Out Copilot

Microsoft 365 Copilot is powerful. But most firms skip the compliance basics. DPIA, sensitivity labels, Section 203 StGB, SharePoint permissions, and subprocessors — a CISSP breaks down what needs to be in place before activation.

Microsoft 365 Copilot is rolling out across organizations in Germany. Most of them already have the M365 infrastructure. Many are getting Copilot licenses bundled in or pushed by their Microsoft reps.

The technical setup is straightforward. The compliance work that should come with it usually doesn’t.

That’s the gap. Copilot isn’t a standalone tool employees install on their own. It sits inside your M365 environment with access to SharePoint, OneDrive, emails, and Teams conversations. That access is what makes it useful. It’s also what creates GDPR obligations that most firms haven’t addressed.

Here are five questions that need answers before Copilot goes into production. Not a scare list. A practical checklist for making an informed decision.

1. Have you completed a data protection impact assessment?

GDPR Article 35 requires a DPIA for processing that poses a high risk to individuals’ rights and freedoms. AI-driven document processing qualifies. This isn’t a gray area.

Microsoft provides a DPIA template. The document explicitly states that the controller must complete their own assessment. Microsoft is the processor. Your firm is the controller. The obligation is yours.

A proper DPIA for Copilot covers:

  • What data flows occur when Copilot accesses SharePoint, OneDrive, and Exchange
  • What risks arise from the processing (access to confidential files, unintended disclosure, third-party processing)
  • What technical and organizational measures mitigate those risks (sensitivity labels, permissions structure, DLP policies)
  • Whether the data protection officer was consulted

Your regulator can request this document at any time. If you don’t have one, you’re missing the central compliance record for your AI deployment.

How to complete the DPIA in practice

Start with Microsoft’s DPIA template for Copilot. It covers the data flows on Microsoft’s side. But the template is a starting point, not a finished document.

You need to add: your firm’s specific data categories (client contracts, tax records, employee files), the sensitivity classification of each category, and the specific risks that arise when Copilot accesses those categories. Then document the mitigation measures you have in place — or plan to put in place — for each risk.

The DPO consultation is not optional. GDPR Article 35(2) requires that you seek the advice of your data protection officer. If you use an external DPO, schedule a meeting to walk through the DPIA together. Document the consultation and any recommendations.

A realistic timeline: one to two weeks for a thorough DPIA, including the DPO consultation. It is not a six-month project. But it does require someone who understands both the M365 architecture and the GDPR requirements.
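The elements above can be captured as a small structured record, which keeps the DPIA reviewable and makes gaps visible. A minimal sketch in Python; the field names and the example entry are illustrative, not a prescribed DPIA format:

```python
from dataclasses import dataclass, field

@dataclass
class DpiaEntry:
    """One data category assessed in the Copilot DPIA."""
    category: str            # e.g. "client tax records"
    classification: str      # sensitivity label applied
    risks: list[str]         # risks arising from Copilot access
    mitigations: list[str]   # technical and organizational measures

@dataclass
class CopilotDpia:
    entries: list[DpiaEntry] = field(default_factory=list)
    dpo_consulted: bool = False

    def open_risks(self) -> list[str]:
        """Risks with no documented mitigation -- these block sign-off."""
        return [r for e in self.entries if not e.mitigations for r in e.risks]

    def ready_for_signoff(self) -> bool:
        # Article 35(2): no sign-off without the DPO consultation on record.
        return self.dpo_consulted and not self.open_risks()

dpia = CopilotDpia()
dpia.entries.append(DpiaEntry(
    category="client tax records",
    classification="Confidential - Client Data",
    risks=["unintended disclosure via broad SharePoint access"],
    mitigations=["restricted library", "sensitivity label with encryption"],
))
print(dpia.ready_for_signoff())  # False until the DPO consultation is documented
```

The point of the structure is that `ready_for_signoff` fails loudly when a risk has no mitigation or the DPO consultation is missing, which are exactly the two gaps a regulator looks for.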

2. Who has access to what in SharePoint?

Copilot searches everything a user can access in SharePoint and OneDrive. It doesn’t create new permissions. But it makes existing permissions visible.

At most firms, SharePoint permissions were set up years ago and haven’t been systematically reviewed since. People have access to folders they probably shouldn’t see. As long as nobody actively looks, that doesn’t surface.

Copilot actively looks. When an employee asks “show me the latest contracts for Client X,” Copilot returns everything it finds. Including files sitting in a team folder that was created three years ago and never restricted.

That’s not a Copilot bug. It’s doing exactly what it’s designed to do. The problem is the permissions.

The fix is a permissions audit before (or immediately after) Copilot activation. Who has access to which SharePoint sites and libraries? Which groups are too broadly defined? Where are confidential client files sitting with access open to people who don’t need them?

Running a SharePoint permissions audit

Here is the practical approach:

  1. Export current permissions. Use the SharePoint admin center or PowerShell to generate a permissions report for all sites and libraries. The goal is a complete picture of who can access what.
  2. Identify overly broad groups. Look for groups like “All Company” or “Everyone except external users” that grant access to sensitive document libraries. These are the highest-risk findings.
  3. Map access to job roles. For each SharePoint site containing client data, ask: does every person with access actually need it for their work? The principle of least privilege is not new, but most firms have never applied it systematically to SharePoint.
  4. Remediate before activation. Remove unnecessary access. Create more granular groups. Move sensitive documents to restricted libraries. This is the single most impactful step you can take before turning on Copilot.
  5. Document the changes. The permissions audit and the remediation steps become part of your DPIA evidence. They demonstrate that you took GDPR Article 25 (data protection by design) seriously.

This is not glamorous work. It is tedious, detail-oriented, and often reveals years of accumulated access creep. But it is the difference between Copilot being a controlled tool and Copilot being a data exposure risk.
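Step 2 above can be partly automated. Assuming the permissions export from step 1 is a CSV with site, principal, and permission columns (column names here are illustrative; your report's layout may differ), a short script can surface the highest-risk entries:

```python
import csv
import io

# Catch-all groups that typically grant access far too broadly (illustrative list).
BROAD_GROUPS = {"Everyone", "Everyone except external users", "All Company"}

def flag_broad_access(report_csv: str) -> list[dict]:
    """Return report rows where a catch-all group has access to a site."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return [row for row in reader if row["principal"] in BROAD_GROUPS]

# Example export (hypothetical sites and groups).
report = """site,principal,permission
/sites/clients-tax,Everyone except external users,Edit
/sites/clients-tax,Tax Team,Edit
/sites/intranet,All Company,Read
"""

for finding in flag_broad_access(report):
    print(f"{finding['site']}: '{finding['principal']}' has {finding['permission']}")
```

Each flagged row is a candidate for step 4: remove the catch-all group or move the library behind a narrower one, and keep the before/after exports as DPIA evidence.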

3. Are your sensitivity labels properly configured?

Sensitivity labels are Microsoft’s mechanism for classifying and protecting documents. They control which files Copilot can process, how content can be shared, and whether encryption applies. Without labels, Copilot treats every document the same way. Confidential client files and the office lunch menu get identical treatment.

Labels need to match the industry. A tax advisory firm needs different classifications than a marketing agency. Client-related documents, internal working papers, general information — those distinctions need to be built into the label structure. And they need to be enforced, not just suggested.

Auto-labeling rules help. They detect patterns in documents (tax IDs, client references, contract dates) and apply labels automatically, rather than relying on manual classification by every employee.

A note on reliability: sensitivity labels had two documented incidents in eight months. EchoLeak in June 2025 and a second incident (CW1226324) in January 2026 showed that labels as an access control aren’t infallible. That doesn’t mean labels are useless. It means they’re one protective layer, not the only one.

Building a label taxonomy for regulated firms

A label structure for a tax practice or law firm looks different from what Microsoft ships as defaults. Here is a practical taxonomy:

  • Public. Marketing materials, published articles, job postings. No restrictions on Copilot access or sharing.
  • Internal. General business documents, meeting notes, internal announcements. Copilot can access. External sharing blocked.
  • Confidential — Client Data. Any document containing client-identifiable information. Copilot access restricted to authorized personnel. Encryption applied. External sharing requires approval.
  • Highly Confidential — Section 203. Documents explicitly covered by professional secrecy obligations. Copilot access limited to the specific team handling the matter. Encryption enforced. No external sharing. DLP policies prevent forwarding or printing without authorization.
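The taxonomy above can be written down as a small policy table, useful both for configuring the labels consistently and for documenting them in the DPIA. The dictionary below is an illustration of the decisions, not a Microsoft Purview configuration format:

```python
# The four-label taxonomy as a policy table (illustrative, not a Purview export).
LABEL_POLICY = {
    "Public": {
        "copilot_access": "all", "encryption": False, "external_sharing": "allowed",
    },
    "Internal": {
        "copilot_access": "all", "encryption": False, "external_sharing": "blocked",
    },
    "Confidential - Client Data": {
        "copilot_access": "authorized", "encryption": True, "external_sharing": "approval",
    },
    "Highly Confidential - Section 203": {
        "copilot_access": "matter-team", "encryption": True, "external_sharing": "blocked",
    },
}

def sharing_allowed(label: str) -> bool:
    """External sharing without approval is only permitted for Public content."""
    return LABEL_POLICY[label]["external_sharing"] == "allowed"

print(sharing_allowed("Public"))                             # True
print(sharing_allowed("Highly Confidential - Section 203"))  # False
```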

Auto-labeling rules should scan for patterns: tax ID numbers (Steuernummer format), client reference numbers, contract dates combined with party names. These rules catch documents that employees forget to label manually.

The combination of manual labeling for new documents and auto-labeling for existing content gives you reasonable coverage. Perfect coverage is not realistic. The goal is reducing the surface area of unlabeled sensitive documents to an acceptable level.

4. Does Section 203 StGB apply to your firm?

Section 203 of the German Criminal Code covers the disclosure of private secrets by professionals under secrecy obligations. That includes tax advisors, lawyers, physicians, and auditors. If your firm falls under Section 203, you have a criminal secrecy obligation regarding client data. Not just a GDPR administrative fine risk. A criminal one.

If Copilot processes document content that falls under Section 203, the firm must ensure that:

  • Only authorized personnel have access to the relevant files (permissions structure)
  • The files are classified as confidential (sensitivity labels)
  • DLP policies prevent confidential client data from being shared externally
  • The architecture is documented so that protective measures can be demonstrated during an audit

The responsibility for this architecture sits with the firm. Not with Microsoft. Microsoft provides the tools. How they’re configured is the firm’s decision.

For firms not covered by Section 203, this point is less urgent. But GDPR requirements for confidentiality and access control still apply.

Practical steps for Section 203 firms deploying Copilot

If your firm falls under Section 203, Copilot deployment needs additional safeguards beyond standard GDPR compliance:

  1. Document the Section 203 analysis. Before activation, create a written assessment of how Copilot interacts with data covered by professional secrecy. Which document libraries contain Section 203 data? How is Copilot’s access to those libraries controlled?
  2. Apply the Highly Confidential label to all client-facing data. Use the label taxonomy above. Copilot should not be able to surface Section 203 documents in response to casual queries from staff who are not working on that client matter.
  3. Configure DLP policies. Data Loss Prevention policies should prevent Section 203 data from being shared via email, Teams, or any external channel without explicit approval. DLP works alongside sensitivity labels to create a second layer of protection.
  4. Brief all staff. Not a generic compliance training. A specific briefing: what Section 203 means, why Copilot changes the risk profile, what the consequences are for violations, and what the firm expects from each employee.
  5. Review the architecture quarterly. Section 203 compliance is not a one-time configuration. Microsoft updates Copilot’s capabilities. New features may access new data sources. Each update requires a reassessment of whether the protective measures still hold.
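The interaction between labels and DLP in steps 2 and 3 can be sketched as a single policy decision. This is illustrative logic showing what the policy encodes, not how Purview DLP is configured:

```python
def dlp_allows_share(label: str, channel: str, approved: bool) -> bool:
    """Decide whether sharing is permitted under the sketched DLP policy.

    Section 203 content never leaves the firm; other confidential client
    data may leave only with explicit approval. Labels as sketched above.
    """
    if label == "Highly Confidential - Section 203":
        return channel == "internal"          # no external channel, ever
    if label == "Confidential - Client Data":
        return channel == "internal" or approved
    return True                               # Public / Internal: label policy governs

print(dlp_allows_share("Highly Confidential - Section 203", "email", approved=True))  # False
print(dlp_allows_share("Confidential - Client Data", "email", approved=True))         # True
```

Note the first case: even with approval, Section 203 content is blocked. That asymmetry is the second protective layer the DLP policy adds on top of the label.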

The BayLDA (Bavarian Data Protection Authority) has been increasingly active in reviewing AI deployments. Having a documented Section 203 analysis alongside your DPIA puts your firm in a defensible position if questioned.

5. Do you know which subprocessors handle your data?

Microsoft uses a network of subprocessors for Copilot. In January 2026, Anthropic was added as a subprocessor. Anthropic is excluded from Microsoft’s EU Data Boundary. That means data processed by Anthropic doesn’t automatically fall under the same data residency rules as the rest of your M365 processing.

GDPR Article 28 requires the processor to inform the controller about changes to subprocessors. Microsoft does that. But the question is whether your firm actually tracks and evaluates those changes.

Each new subprocessor technically requires an updated risk assessment. If the last time you checked Microsoft’s subprocessor list was more than six months ago, you may be missing a material change.

Monitoring these changes isn’t a one-time task. It’s ongoing compliance work.

How to track subprocessor changes

Microsoft publishes its subprocessor list and sends notifications of changes. The practical challenge is that someone at your firm needs to actually read those notifications and assess the implications.

Set up a simple process:

  1. Subscribe to Microsoft’s subprocessor notifications. These come via the Microsoft 365 admin center. Assign a specific person (your DPO, your IT lead, or yourself) as the recipient.
  2. Evaluate each change. When a new subprocessor is added, ask: does this change affect where our data is processed? Does the new subprocessor fall within the EU Data Boundary? What data does the subprocessor handle?
  3. Update your records of processing activities (ROPA). GDPR Article 30 requires you to maintain records of processing activities. Subprocessor changes are part of that record.
  4. Reassess risk if needed. A subprocessor outside the EU Data Boundary may require an updated Transfer Impact Assessment. Anthropic’s addition as a subprocessor in January 2026, excluded from the EU Data Boundary, is a concrete example. If your firm processes sensitive data through Copilot features that route through Anthropic, your risk assessment needs to reflect that.

This sounds like a lot of overhead. In practice, Microsoft changes its subprocessor list a few times per year, and each change takes roughly 30 minutes to evaluate and document. The alternative is finding out about a material change during an audit, which is substantially worse.
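The evaluation loop in steps 1 through 4 reduces to a diff between your last recorded subprocessor snapshot and the current list, plus a flag for additions outside the EU Data Boundary. A sketch; the snapshot format and the first entry are illustrative, while the Anthropic entry reflects the January 2026 change discussed above:

```python
def review_subprocessors(previous: dict, current: dict) -> dict:
    """Diff two subprocessor snapshots (name -> inside EU Data Boundary?)."""
    added = set(current) - set(previous)
    removed = set(previous) - set(current)
    # Additions outside the EU Data Boundary may need a new Transfer Impact Assessment.
    needs_tia = sorted(name for name in added if not current[name])
    return {"added": sorted(added), "removed": sorted(removed), "needs_tia": needs_tia}

previous = {"ExampleSub": True}                    # illustrative prior snapshot
current = {"ExampleSub": True, "Anthropic": False}  # Jan 2026: outside the EU Data Boundary

review = review_subprocessors(previous, current)
print(review["needs_tia"])  # ['Anthropic']
```

Whatever lands in `needs_tia` goes to the DPO, and the diff itself goes into the ROPA as the record of step 3.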

What these questions have in common

None of these five questions challenge Copilot itself. Copilot is a capable tool. But a capable tool needs proper infrastructure, the way a car needs inspection and insurance. For Copilot, that infrastructure is a DPIA, a permissions audit, a label architecture, and documented compliance.

The good news: every one of these points is solvable. The DPIA is a document, not a barrier. Permissions can be cleaned up. Labels can be configured. It takes work, but it’s doable.


Not sure where your firm stands? The Copilot Readiness Check takes 2 minutes and provides a first assessment.

Prefer to talk directly? Book a free 30-minute consultation — no sales pitch, just an honest evaluation of your situation.


Jose Lugo is a CISSP-certified AI compliance consultant with M365 Endpoint Administrator certification and 12 years of experience protecting sensitive data. He advises firms in Germany on GDPR-compliant deployment of Microsoft 365 Copilot.