DPIA for Microsoft 365 Copilot: What the Document Must Contain

GDPR Article 35 requires a DPIA for Copilot. Microsoft's template is a start, not a finish. Here's what actually needs to be in it.

GDPR Article 35 requires a data protection impact assessment for processing that poses a high risk to individuals’ rights and freedoms. AI-driven document processing by Microsoft 365 Copilot qualifies.

The BayLDA (Bavaria’s data protection authority) has AI systems on its positive list for mandatory DPIAs. If you use Copilot without a DPIA, you’re missing a legally required document.

The question isn’t whether you need one. The question is what goes into it, because most DPIAs I come across are either Microsoft’s template used without modification or internal documents missing key elements. This post walks through every component Article 35 requires and shows what that means for Copilot in practice.

Microsoft’s DPIA template: starting point, not finished product

Microsoft provides a DPIA template for Office 365. The template explicitly states that the controller must complete their own assessment. Microsoft is the processor. Your firm is the controller. The obligation is yours.

The template covers what Microsoft does on its side. What security measures Microsoft has in place. How Microsoft processes and stores data. That’s useful as a reference.

What the template does not cover: your specific data. Your SharePoint structure. Your permissions configuration. Your industry requirements. Your risks.

I use Microsoft’s template as a foundation. But the finished document looks different, because it has to describe your environment, not Microsoft’s.

Systematic description of the processing

Article 35(7)(a) requires a systematic description of the envisaged processing operations. For Copilot, that means: What data sources does Copilot access? What processing does it perform?

The data sources in a typical M365 environment are SharePoint sites and libraries, OneDrive folders, Exchange mailboxes, and Teams channels including chat history. Copilot searches everything the respective user has access to. That’s not a limitation. That’s the design principle.

The processing operations include summarizing documents and emails, drafting content based on existing files, searching and analyzing across multiple data sources, and answering questions about stored data.

For the DPIA, you need to break this down to your environment. Not “SharePoint” but which SharePoint sites. Not “emails” but which mailboxes from which departments. The more specific the description, the more meaningful the risk assessment that builds on it.

Also document which employee groups have Copilot licenses. Many firms start with a pilot group. The DPIA must reflect the actual scope of use, not the theoretical maximum.
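To make the “break it down to your environment” point concrete, here is a minimal sketch of a data-source register for the systematic description. All names (sites, groups) are placeholders, not a real tenant; the structure is one possible convention, not a prescribed format.

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    """One entry in the DPIA's systematic description (Art. 35(7)(a))."""
    name: str               # a specific SharePoint site, not just "SharePoint"
    source_type: str        # "sharepoint" | "onedrive" | "exchange" | "teams"
    departments: list[str]  # whose data it holds
    contains_client_data: bool

@dataclass
class CopilotScope:
    """Actual scope of use: the licensed pilot group plus what it can reach."""
    licensed_users: list[str]
    sources: list[DataSource] = field(default_factory=list)

    def client_data_sources(self) -> list[str]:
        # Sources holding client data deserve separate risk treatment.
        return [s.name for s in self.sources if s.contains_client_data]

# Illustrative entries only:
scope = CopilotScope(
    licensed_users=["pilot-group-tax"],
    sources=[
        DataSource("SP: Client Files / Tax", "sharepoint", ["Tax"], True),
        DataSource("SP: Internal Handbook", "sharepoint", ["All"], False),
    ],
)
print(scope.client_data_sources())  # -> ['SP: Client Files / Tax']
```

A register like this also answers the pilot-group question directly: the DPIA documents `licensed_users` as it actually is, not the theoretical maximum.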

Article 35(7)(a) also requires the purposes of processing and, where applicable, the legitimate interests pursued. Why are you using Copilot? What is it supposed to accomplish?

The legal basis is typically Article 6(1)(f) GDPR (legitimate interest) or Article 6(1)(b) (performance of a contract), depending on context. If Copilot assists in preparing client documents, performance of contract may serve as the basis. If it’s used for internal efficiency, legitimate interest is the likely ground.

Document the legal basis for each processing purpose separately. “We use Copilot to improve productivity” is too vague. “Copilot is used to summarize client correspondence (Art. 6(1)(b)) and to draft internal working documents (Art. 6(1)(f))” is more precise.
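The per-purpose documentation can be kept as a simple lookup so that undocumented purposes surface immediately. The purposes and bases below mirror the examples above and are illustrative, not legal advice.

```python
# Map each processing purpose to its own legal basis, one entry per purpose.
LEGAL_BASES = {
    "summarize client correspondence": "Art. 6(1)(b) GDPR (contract)",
    "draft internal working documents": "Art. 6(1)(f) GDPR (legitimate interest)",
}

def documented_basis(purpose: str) -> str:
    # A purpose without a documented basis is a gap the DPIA must close.
    return LEGAL_BASES.get(purpose, "UNDOCUMENTED - resolve before rollout")

print(documented_basis("summarize client correspondence"))
print(documented_basis("train a custom model"))  # flags the gap
```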

Necessity and proportionality

Article 35(7)(b) requires an assessment of necessity and proportionality. Is Copilot necessary for the stated purpose? Are there less intrusive alternatives?

This sounds like a formality, but it has practical significance. You need to justify why you’re deploying an AI tool with access to a user’s entire data estate rather than a more targeted solution. The answer might be that Copilot’s integration into M365 is more efficient than an external tool that would require data exports. Or that the alternative (manual search and summarization) takes disproportionate time.

The proportionality assessment isn’t an obstacle. It forces you to consciously justify the deployment. If the justification is sound, it strengthens your position in an audit.

Risks to data subjects

Article 35(7)(c) is the core of the DPIA. What can go wrong? What risks exist for the individuals whose data is processed?

For Copilot, there are several concrete risk categories.

The oversharing risk is the most commonly underestimated. Copilot surfaces content that users technically have access to but would never have found under normal circumstances. Confidential client files, HR documents, strategy papers. Not because Copilot bypasses permissions, but because existing permissions are too broad.

Data exposure through prompts is another risk. When an employee asks Copilot to summarize a document, the content is sent to Microsoft’s AI infrastructure. Since January 2026, Anthropic has been a subprocessor within that infrastructure. Data processed through Anthropic is excluded from the EU Data Boundary.

Sensitivity label failures are a documented risk. EchoLeak (June 2025) and incident CW1226324 (January 2026) showed that labels could be bypassed under certain conditions. Labels are a layer of protection, but not an infallible one.

For professional secrecy holders (tax advisors, lawyers, auditors under German law), the Section 203 German Criminal Code risk adds another dimension. When Copilot processes client data subject to criminal confidentiality obligations, the DPIA must assess this risk separately.

Rate each risk by likelihood and severity. Use a matrix your regulator can follow. A four-level scale (low, medium, high, very high) for both dimensions is a solid approach.
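The four-level scale can be sketched as follows. The max-based combination rule and the example ratings are assumptions for illustration; Article 35 prescribes neither, and your own ratings must come from your own assessment process.

```python
LEVELS = ["low", "medium", "high", "very high"]  # four-level scale, both dimensions

def risk_score(likelihood: str, severity: str) -> str:
    """Combine likelihood and severity into an overall level.

    Taking the maximum of the two dimensions is one common convention
    (a severe but unlikely risk still scores high), not a mandated method.
    """
    return LEVELS[max(LEVELS.index(likelihood), LEVELS.index(severity))]

# Example ratings for the risk categories discussed above (illustrative):
risks = {
    "oversharing via broad permissions": ("high", "high"),
    "data exposure through prompts": ("medium", "high"),
    "sensitivity label bypass": ("low", "very high"),
}
for name, (likelihood, severity) in risks.items():
    print(f"{name}: {risk_score(likelihood, severity)}")
```

Whatever combination rule you pick, document it: the regulator must be able to reproduce how a likelihood and a severity became an overall rating.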

Safeguards and mitigation measures

Article 35(7)(d) requires the measures envisaged to address the risks. Every identified risk needs at least one documented countermeasure.

Against oversharing: audit and clean up SharePoint permissions. Remove “everyone” shares. Build groups on the principle of least privilege. This is the foundation. Without clean permissions, every other measure is a patch on a leaky pipe.
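A permissions audit can start from a simple inventory export and flag the broad grants first. The inventory format below is an assumption (however you export it, e.g. via Graph or PnP tooling); the site names are placeholders.

```python
# Sketch of a least-privilege check over a permissions inventory export.
shares = [
    {"site": "SP: Client Files / Tax", "grantee": "Everyone", "level": "Read"},
    {"site": "SP: Internal Handbook", "grantee": "All Staff", "level": "Read"},
]

def broad_shares(inventory):
    # "Everyone"-style grants are exactly what lets Copilot surface
    # content users were never meant to find.
    broad = {"Everyone", "Everyone except external users"}
    return [s for s in inventory if s["grantee"] in broad]

print([s["site"] for s in broad_shares(shares)])  # -> ['SP: Client Files / Tax']
```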

Configure sensitivity labels. A four-tier structure (Public, Internal, Confidential, Highly Confidential) covers most regulated firms. Auto-labeling rules for tax IDs, IBANs, and client reference numbers supplement manual classification. Since September 2025, you can explicitly exclude Copilot from specific label tiers.
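The auto-labeling idea can be sketched with simplified detection patterns. Real Purview sensitive-information types are far more robust (checksums, confidence levels, context words); these regexes are illustrative only, and the tier names follow the four-tier structure above.

```python
import re

# Simplified detectors, analogous to auto-labeling rules (illustrative only).
PATTERNS = {
    "german_iban": re.compile(r"\bDE\d{20}\b"),  # DE + 2 check digits + 18 digits
    "german_tax_id": re.compile(r"\b\d{11}\b"),  # 11-digit Steuer-ID, no checksum here
}

def suggest_label(text: str) -> str:
    # Any hit pushes the document into the "Confidential" tier.
    hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
    return "Confidential" if hits else "Internal"

print(suggest_label("Payment to DE44500105175407324931"))  # -> Confidential
```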

DLP policies prevent confidential content from being shared externally. For client data protected under Section 203 of the German Criminal Code, DLP policies are part of the documented security architecture.

Employee training belongs in the DPIA as well. Do your employees know what data they can use in Copilot prompts? Do they understand that Copilot results may contain confidential information that must not be shared?

Logging and monitoring: use the Copilot audit logs in the Microsoft Purview Compliance Center. Document what activities are logged and who reviews the logs regularly.
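A periodic review of the audit logs can be as simple as counting Copilot interactions per user from an export. The record format below (JSON lines with `Operation` and `UserId` fields) follows the common unified-audit-log shape but is an assumption; check it against your actual export before relying on it.

```python
import json
from collections import Counter

def copilot_usage_summary(lines):
    """Count Copilot interactions per user from exported audit records."""
    users = Counter()
    for line in lines:
        record = json.loads(line)
        if record.get("Operation") == "CopilotInteraction":
            users[record.get("UserId", "unknown")] += 1
    return users

# Two illustrative records; only the Copilot interaction is counted.
sample = [
    '{"Operation": "CopilotInteraction", "UserId": "a@firm.example"}',
    '{"Operation": "FileAccessed", "UserId": "a@firm.example"}',
]
print(copilot_usage_summary(sample))
```

The DPIA then documents who runs this review, how often, and what happens when the numbers look wrong.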

DPO consultation

Article 35(2): the advice of the data protection officer shall be sought when carrying out a DPIA. This is not a recommendation. It is a legal requirement.

Document when the DPO was consulted, what opinion they gave, and whether their recommendations were implemented. Even if the DPO has no objections, that belongs in the DPIA. “DPO was consulted on [date]. No objections raised.” is better than nothing.

If you use an external DPO, schedule a session to walk through the DPIA together. The DPO may not know your environment in detail. Provide the technical information they need for their assessment.

What makes the DPIA firm-specific

This is where template solutions fall short. Your data flows are unique. Your SharePoint structure is unique. Your industry regulation is specific.

A tax advisory firm has different data flows than a law firm. A firm with 10 employees has a different permissions structure than one with 50. A company using only Word and Outlook has a different risk profile than one that uses SharePoint extensively for client files.

The DPIA must describe your environment. Not a theoretical standard M365 setup.

The subprocessor section

Since January 2026, the subprocessor section must list Microsoft as the processor and Anthropic as a subprocessor. For each subprocessor, include processing locations, legal basis for data transfers, and security measures.

This section is not static. When Microsoft adds more subprocessors, the DPIA needs updating. Set up monitoring: subscribe to Microsoft’s subprocessor notifications and define internally who evaluates changes.
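The “not static” point can be operationalized with a minimal register and a review deadline. The entries mirror the post; the 90-day review interval is an assumption you should set to your own policy.

```python
from datetime import date

# Minimal subprocessor register for the DPIA section above (illustrative).
SUBPROCESSORS = [
    {"name": "Microsoft", "role": "processor", "locations": ["EU Data Boundary"]},
    {"name": "Anthropic", "role": "subprocessor",
     "locations": ["outside EU Data Boundary"]},  # per the post, since Jan 2026
]

def review_overdue(last_review: date, today: date, max_days: int = 90) -> bool:
    # Flag the DPIA for an update if the register hasn't been checked recently.
    return (today - last_review).days > max_days

print(review_overdue(date(2026, 1, 1), date(2026, 6, 1)))  # -> True
```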

Common mistakes

Using Microsoft’s template as a finished document without adding the firm-specific sections. This is the most common mistake. The template covers Microsoft’s processing. Your processing, your risks, and your measures need to come from you.

Not updating the DPIA when processing circumstances change. New subprocessors, new features, changed permissions structures. A DPIA from 2024 that was never updated has limited value in an audit in 2026.

Not involving the DPO. Article 35(2) is clear. Without documented DPO consultation, a mandatory element is missing.

Not documenting the risk assessment process. Listing risks is not enough. You need to show, in a way the regulator can follow, how you arrived at your assessment. What criteria did you apply? What information sources? Who was involved?

Next step

The DPIA is part of every Copilot Compliance Audit. I build firm-specific DPIAs, not template copies. The document describes your environment, your risks, and your measures, not those of a generic M365 installation.

As covered in the 5 GDPR questions before a Copilot rollout: the DPIA is a central compliance document. And as the oversharing post shows, the quality of the DPIA depends directly on how well you know your own environment.

Book a 30-minute call. I’ll look at what you have and tell you what’s missing. No sales pitch.


Jose Lugo is a CISSP-certified AI compliance consultant with M365 Endpoint Administrator certification. He builds data protection impact assessments for regulated firms in Germany.