Microsoft announced a suite of enhanced security controls for Copilot for Microsoft 365 on January 31, 2026, directly addressing concerns raised by enterprise security teams about the AI assistant’s ability to surface sensitive data through existing user permissions. The updates include deep integration with Microsoft Purview Data Loss Prevention, mandatory sensitivity label enforcement, and new oversharing assessment dashboards.
Rollout timeline
| Milestone | Date |
|---|---|
| Preview release | Mid-November 2025 |
| General availability | Mid-January 2026 |
| Admin center integration | January 2026 |
| Full feature rollout | January 31, 2026 |
| DLP for prompts GA | Late March 2026 |
Why oversharing became a problem
Since Copilot for Microsoft 365 became generally available in late 2023, enterprise customers have confronted a fundamental challenge: the AI assistant inherits the same data access permissions as the user who invokes it. In organizations where SharePoint sites, Teams channels, and OneDrive folders have accumulated years of overly permissive sharing configurations, Copilot effectively becomes an accelerant for data exposure.
The permission inheritance problem
| Traditional workflow | Copilot workflow |
|---|---|
| User must navigate to content | Copilot searches entire corpus |
| Manual discovery of files | AI-assisted discovery |
| Limited by time and attention | Unlimited search capacity |
| Permission issues rarely exploited | Permission issues surfaced instantly |
The concern is not that Copilot bypasses security controls, but that it makes existing permission sprawl visible and exploitable in ways that were previously impractical. A user who technically had access to a sensitive SharePoint site but never navigated there manually might now receive excerpts from confidential documents in a Copilot-generated summary.
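The dynamic can be illustrated with a minimal sketch, where site names, file structures, and function names are all hypothetical and do not correspond to any Microsoft API:

```python
# Illustrative model: Copilot does not bypass permissions, it
# exhaustively exercises them across everything the user can reach,
# whereas a human only sees the sites they actually navigate to.

CORPUS = [
    {"site": "TeamWiki",   "name": "roadmap.docx",   "sensitive": False},
    {"site": "HR-Archive", "name": "salaries.xlsx",  "sensitive": True},
    {"site": "M&A-Legacy", "name": "term-sheet.pdf", "sensitive": True},
]

# The user technically has access to every site above.
PERMITTED_SITES = {"TeamWiki", "HR-Archive", "M&A-Legacy"}

def manual_discovery(visited_sites):
    """A human sees only files on sites they navigate to."""
    return [f for f in CORPUS if f["site"] in visited_sites]

def copilot_discovery(permitted_sites):
    """AI-assisted search covers the entire permitted corpus."""
    return [f for f in CORPUS if f["site"] in permitted_sites]

# The user only ever opens the team wiki...
print(len(manual_discovery({"TeamWiki"})))       # 1 file
# ...but one Copilot query can draw on all three sites,
# including the two sensitive files the user never browsed to.
print(len(copilot_discovery(PERMITTED_SITES)))   # 3 files
```

The sensitive files were always reachable; the AI assistant simply removes the time-and-attention barrier that kept them unvisited.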
The scale of the problem
| Metric | Finding | Source |
|---|---|---|
| Average files accessible per employee | ~17 million | Varonis (2025) |
| Files containing sensitive data | ~20% | Varonis (2025) |
| Sensitive data types | PII, financial records, IP | Industry analysis |
| Organizations with oversharing issues | 90%+ | Cloud Security Alliance |
Research published by Varonis in late 2025 found that in a typical enterprise Microsoft 365 environment, the average employee has access to approximately 17 million files, and roughly 20% of those files contain sensitive data. Copilot’s ability to search across this entire corpus and synthesize results means that permission misconfigurations that were previously low-risk become material data exposure vectors.
CISO concerns (IANS Research December 2025)
| Metric | Finding |
|---|---|
| CISOs citing Copilot data access as top-5 concern | 62% |
| Organizations that delayed/restricted Copilot deployment | 34% |
| Survey population | CISOs at organizations with 5,000+ employees |
New security controls
Microsoft’s updated security framework introduces several layers of protection designed to mitigate oversharing risks without requiring organizations to overhaul their permission structures from scratch.
Purview DLP integration
Copilot for Microsoft 365 now respects Microsoft Purview DLP policies in real time:
| Capability | Function |
|---|---|
| Prompt scanning | Detects sensitive information types in user queries |
| Response blocking | Prevents Copilot from returning responses containing sensitive data |
| Web search protection | Blocks sensitive data from external search queries |
| Pre-built agents | DLP applies to Microsoft 365 Copilot agents |
| Real-time enforcement | Policies evaluated at interaction time |
Previously, DLP policies were enforced at the point of sharing or exfiltration (sending an email or uploading to an external service), but not at the point of AI-assisted retrieval. The new integration closes this gap.
DLP integration protects against exposure of:
| Category | Examples |
|---|---|
| Financial | Credit card numbers, bank accounts |
| Government IDs | Social Security numbers, passport numbers |
| Healthcare | HIPAA-protected information |
| Custom SITs | Organization-defined patterns |
| Regulatory data | GDPR, PCI-DSS patterns |
| Intellectual property | Custom-defined confidential content |
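Interaction-time enforcement against sensitive information types (SITs) can be sketched as follows. This is hypothetical logic of our own, not the Purview engine, and the two regex patterns are simplified stand-ins for Microsoft's built-in SIT definitions:

```python
import re

# Two sample sensitive information types (SITs): a US SSN pattern
# and a 16-digit payment card pattern.
SITS = {
    "ssn":  re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def dlp_scan(text):
    """Return the SITs detected in a prompt or draft response."""
    return [name for name, pattern in SITS.items() if pattern.search(text)]

def enforce(draft_response):
    """Block the draft if any SIT matches, mirroring response blocking."""
    hits = dlp_scan(draft_response)
    if hits:
        return f"[Blocked by DLP policy: {', '.join(hits)}]"
    return draft_response

print(enforce("Q3 revenue grew 12%"))              # passes through
print(enforce("Employee SSN is 123-45-6789"))      # blocked
```

Because `dlp_scan` runs on both the user's prompt and the drafted response, the same check covers prompt scanning and response blocking from the capability table above.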
Sensitivity label enforcement
Organizations using Microsoft Purview Information Protection sensitivity labels can now configure Copilot to enforce label-based restrictions:
| Label | Restriction |
|---|---|
| Highly Confidential | Cannot be surfaced to users outside the designated audience |
| Internal Only | Hidden in summaries shared with external guests |
| Restricted | Requires explicit confirmation before inclusion |
| Custom classifications | Organization-defined rules |
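The label rules above amount to a per-user filter applied before content reaches a Copilot context window. A minimal sketch, assuming a simplified user model with two flags (the real Purview policy engine is considerably richer):

```python
# Illustrative label-based filtering; rules mirror the table above.
LABEL_RULES = {
    "Highly Confidential": lambda user: user["in_designated_audience"],
    "Internal Only":       lambda user: not user["is_external_guest"],
    "General":             lambda user: True,
}

def filter_for_user(documents, user):
    """Keep only documents whose sensitivity label permits this user."""
    return [d for d in documents if LABEL_RULES[d["label"]](user)]

docs = [
    {"name": "handbook.docx", "label": "General"},
    {"name": "merger.docx",   "label": "Highly Confidential"},
    {"name": "org-memo.docx", "label": "Internal Only"},
]

guest = {"is_external_guest": True, "in_designated_audience": False}
print([d["name"] for d in filter_for_user(docs, guest)])  # ['handbook.docx']
```

The key design point is that filtering happens on retrieval, so a mislabeled sharing link or stale permission does not decide what an external guest's summary contains; the label does.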
Data Security Posture Management for AI (DSPM)
| Feature | Function |
|---|---|
| Weekly risk assessment | Automatically scans top 100 SharePoint sites by usage |
| Custom assessments | Supplement with organization-defined scopes |
| Oversharing identification | Flags potential data exposure risks |
| Remediation guidance | Prioritized recommendations |
| Monitoring dashboard | Ongoing visibility |
Oversharing Assessment Dashboard
A new Copilot Oversharing Assessment tool in the Microsoft 365 admin center provides visibility into potential data exposure:
| Feature | Function |
|---|---|
| Risk identification | Sites with broad access containing sensitive content |
| User analysis | Users whose queries frequently surface external team content |
| Document tracking | Documents surfaced disproportionately relative to intended audience |
| Remediation guidance | Prioritized recommendations by risk level |
| Trend analysis | Track improvement over time |
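Risk identification of this kind boils down to ranking sites by the combination of access breadth and sensitive content. The heuristic below is our own illustration, not Microsoft's scoring formula:

```python
# Hypothetical oversharing heuristic: broad access over sensitive
# content ranks highest; broad access over public content ranks low.

def risk_score(site):
    """Higher score = more users able to reach more sensitive files."""
    return site["users_with_access"] * site["sensitive_files"]

sites = [
    {"name": "Everyone-Share", "users_with_access": 5000, "sensitive_files": 40},
    {"name": "Finance-Team",   "users_with_access": 25,   "sensitive_files": 300},
    {"name": "Public-Wiki",    "users_with_access": 5000, "sensitive_files": 0},
]

ranked = sorted(sites, key=risk_score, reverse=True)
for s in ranked:
    print(s["name"], risk_score(s))
# Everyone-Share ranks first: many users AND sensitive content.
# Public-Wiki ranks last despite its 5,000 users: nothing sensitive there.
```

Note that the tightly held Finance-Team site scores far below the org-wide share even though it holds more sensitive files, which is exactly the prioritization a remediation queue wants.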
SharePoint Advanced Management integration
| Control | Description |
|---|---|
| Data access governance reports | Identify potentially overshared sites |
| Restricted Content Discovery | Flag sites to hide from Copilot and Org-wide search |
| Permission remediation | Targeted fixes for high-risk configurations |
| Restricted access control policy | Limit site access to specific groups |
These site-level controls are complemented by content-level restriction options:
| Option | Function |
|---|---|
| Restrict access by label | DLP policy prevents Copilot from summarizing labeled content |
| Restrict all items | SharePoint Restricted Content Discovery hides sites |
| Per-site controls | Granular site-level protection |
Required admin roles
| Role | Permissions |
|---|---|
| Entra AI Admin | Manage all Microsoft 365 Copilot and AI enterprise services |
| Purview Data Security AI Admin | Edit DLP policies, view AI content in DSPM |
| Global Administrator | Full access to all controls |
| Compliance Administrator | DLP policy management |
| SharePoint Administrator | Site-level controls and RCD |
How security leaders are reacting
The announcement has been generally well received, though many security leaders noted that the controls arrive later than desired.
“These controls should have been available at launch. But credit to Microsoft for building them into the platform rather than expecting customers to solve the problem with third-party tools alone. The DLP integration is particularly important because it means you do not have to fix every permission issue before rolling out Copilot safely.”
— Geoff Belknap, security advisor and former LinkedIn CISO
Cloud Security Alliance guidance
The CSA published guidance noting that the new controls are most effective for organizations that have already invested in data classification and labeling. For enterprises that have not yet classified their data or deployed sensitivity labels, the controls provide limited immediate benefit.
| Readiness factor | Control effectiveness |
|---|---|
| Mature data classification | High |
| Sensitivity labels deployed | High |
| No classification program | Limited |
| Permission governance in place | Higher |
| Legacy permission debt | Lower initial impact |
Availability and licensing
| Feature | Requirement |
|---|---|
| DLP integration | Microsoft 365 E5 or Purview add-on |
| Sensitivity label enforcement | Microsoft 365 E5 or Purview add-on |
| Oversharing Assessment dashboard | Public preview (no E5 required) |
| Dashboard GA | Expected March 2026 |
| DSPM for AI | Purview license required |
Deployment guide recommendations
Microsoft’s Copilot Security Deployment Guide emphasizes a phased approach:
| Phase | Action | Timeline |
|---|---|---|
| 1. Assess | Use Oversharing Assessment to identify permission risks | Weeks 1-2 |
| 2. Remediate | Fix highest-risk permission misconfigurations | Weeks 2-4 |
| 3. Classify | Deploy sensitivity labels on sensitive content | Weeks 3-6 |
| 4. Enforce | Enable DLP and sensitivity label policies | Weeks 5-8 |
| 5. Monitor | Use Purview audit logs and admin dashboard ongoing | Continuous |
Implications beyond Microsoft
The Copilot security enhancements reflect a broader challenge facing every organization deploying AI assistants:
| Vendor | Product | Same risk? |
|---|---|---|
| Google | Gemini for Workspace | Yes |
| Salesforce | Einstein Copilot | Yes |
| Slack | Slack AI | Yes |
| Notion | Notion AI | Yes |
| Others | Enterprise AI assistants | Yes |
The underlying issue—that most enterprise environments carry significant permission debt accumulated over decades of collaborative work—is not something any single vendor can solve with technology alone. Security practitioners have emphasized that AI deployment must be accompanied by data governance maturation.
Scale of impact
| Metric | Value |
|---|---|
| Microsoft 365 paid seats | 400+ million globally |
| Copilot for Microsoft 365 licenses | Tens of millions |
| Enterprise deployments | Thousands of organizations |
The security controls announced this week will define baseline expectations for how AI assistants should handle sensitive enterprise data across the industry.
Recommendations
For organizations deploying Copilot
| Priority | Action |
|---|---|
| Immediate | Run Oversharing Assessment |
| Immediate | Review DSPM for AI findings |
| High | Enable DLP policies for Copilot prompts |
| High | Deploy sensitivity labels on sensitive content |
| Medium | Review SharePoint site permissions |
| Medium | Configure Restricted Content Discovery |
| Ongoing | Monitor Copilot access patterns in Purview |
For organizations planning deployment
| Step | Consideration |
|---|---|
| Pre-deployment | Complete data classification project |
| Pre-deployment | Remediate obvious permission issues |
| Pre-deployment | Deploy sensitivity labels |
| Pilot | Start with security-conscious teams |
| Pilot | Monitor for unexpected data surfacing |
| Expand | Roll out with DLP enforcement active |
Data governance prerequisites
| Requirement | Importance |
|---|---|
| Data classification taxonomy | Critical |
| Sensitivity label deployment | Critical |
| Permission review process | High |
| Site ownership accountability | High |
| Regular access reviews | Medium |
Context
The controls address a legitimate concern, but organizations should not assume enabling them eliminates oversharing risk. The fundamental work of data governance—classification, access reviews, and least privilege—remains essential.
| Reality | Implication |
|---|---|
| Controls are reactive | Prevention requires governance |
| Labels must exist to work | Classification investment required |
| Permissions still matter | Least privilege remains goal |
| Monitoring is ongoing | Not a one-time fix |
Microsoft’s release of these controls acknowledges that AI assistants fundamentally change the risk calculus for enterprise data. What was previously theoretical exposure (a user could access a file if they knew where to look) becomes practical exposure (Copilot finds and surfaces that file automatically).
Organizations that have invested in data governance are well-positioned to benefit from these controls. Those that haven’t face a choice: defer AI adoption until governance catches up, or accept elevated risk during deployment. The new tools at least provide visibility into that risk, which is more than organizations had before.