Microsoft announced a suite of enhanced security controls for Copilot for Microsoft 365 on January 31, 2026, directly addressing concerns raised by enterprise security teams about the AI assistant’s ability to surface sensitive data through existing user permissions. The updates include deep integration with Microsoft Purview Data Loss Prevention, mandatory sensitivity label enforcement, and new oversharing assessment dashboards.

Rollout timeline

| Milestone | Date |
| --- | --- |
| Preview release | Mid-November 2025 |
| General availability | Mid-January 2026 |
| Admin center integration | January 2026 |
| Full feature rollout | January 31, 2026 |
| DLP for prompts GA | Late March 2026 |

Why oversharing became a problem

Since Copilot for Microsoft 365 became generally available in late 2023, enterprise customers have confronted a fundamental challenge: the AI assistant inherits the same data access permissions as the user who invokes it. In organizations where SharePoint sites, Teams channels, and OneDrive folders have accumulated years of overly permissive sharing configurations, Copilot effectively becomes an accelerant for data exposure.

The permission inheritance problem

| Traditional workflow | Copilot workflow |
| --- | --- |
| User must navigate to content | Copilot searches entire corpus |
| Manual discovery of files | AI-assisted discovery |
| Limited by time and attention | Unlimited search capacity |
| Permission issues rarely exploited | Permission issues surfaced instantly |

The concern is not that Copilot bypasses security controls, but that it makes existing permission sprawl visible and exploitable in ways that were previously impractical. A user who technically had access to a sensitive SharePoint site but never navigated there manually might now receive excerpts from confidential documents in a Copilot-generated summary.
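The inheritance model can be sketched in a few lines. This is a toy model, not Microsoft's implementation: retrieval applies the same ACL check the user would face in the browser, but runs it across the entire corpus at once.

```python
# Toy model of permission inheritance: Copilot-style retrieval never bypasses
# ACLs -- it simply searches everything the invoking user can already read.
from dataclasses import dataclass

@dataclass
class Document:
    path: str
    content: str
    readers: set  # principals with read access

CORPUS = [
    Document("hr/salaries.xlsx", "executive compensation bands", {"hr-team", "all-staff"}),
    Document("eng/roadmap.docx", "Q3 roadmap", {"eng-team"}),
]

def copilot_search(user_groups: set, query: str) -> list:
    """Return every readable document matching the query -- no extra gatekeeping."""
    return [d for d in CORPUS
            if d.readers & user_groups and query in d.content]

# A user who never browsed the overshared HR site still surfaces it instantly:
hits = copilot_search({"all-staff"}, "compensation")
```

The point of the sketch is that the filter is purely "can this user read it," not "should this user see it" -- which is exactly the gap the new controls target.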

The scale of the problem

| Metric | Finding | Source |
| --- | --- | --- |
| Average files accessible per employee | ~17 million | Varonis (2025) |
| Files containing sensitive data | ~20% | Varonis (2025) |
| Sensitive data types | PII, financial records, IP | Industry analysis |
| Organizations with oversharing issues | 90%+ | Cloud Security Alliance |

Research published by Varonis in late 2025 found that in a typical enterprise Microsoft 365 environment, the average employee has access to approximately 17 million files, and roughly 20% of those files contain sensitive data. Copilot’s ability to search across this entire corpus and synthesize results means that permission misconfigurations that were previously low-risk become material data exposure vectors.
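A back-of-the-envelope calculation with those figures shows the magnitude; note that the misconfiguration rate used below is an illustrative assumption, not a Varonis number.

```python
# Rough exposure math using the Varonis figures cited above.
files_per_employee = 17_000_000
sensitive_fraction = 0.20          # ~20% of accessible files hold sensitive data

sensitive_files = int(files_per_employee * sensitive_fraction)

# Assume (illustratively) that only 0.01% of those are wrongly permissioned.
# Even at that tiny rate, hundreds of files per employee are one well-phrased
# prompt away from appearing in a summary.
misconfigured = int(sensitive_files * 0.0001)
```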

CISO concerns (IANS Research December 2025)

| Metric | Finding |
| --- | --- |
| CISOs citing Copilot data access as top-5 concern | 62% |
| Organizations that delayed/restricted Copilot deployment | 34% |
| Organizations surveyed | 5,000+ employees |

New security controls

Microsoft’s updated security framework introduces several layers of protection designed to mitigate oversharing risks without requiring organizations to overhaul their permission structures from scratch.

Purview DLP integration

Copilot for Microsoft 365 now respects Microsoft Purview DLP policies in real time:

| Capability | Function |
| --- | --- |
| Prompt scanning | Detects sensitive information types in user queries |
| Response blocking | Prevents Copilot from returning responses containing sensitive data |
| Web search protection | Blocks sensitive data from external search queries |
| Pre-built agents | DLP applies to Microsoft 365 Copilot agents |
| Real-time enforcement | Policies evaluated at interaction time |

Previously, DLP policies were enforced at the point of sharing or exfiltration (sending an email or uploading to an external service), but not at the point of AI-assisted retrieval. The new integration closes this gap.
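A minimal sketch of what interaction-time enforcement looks like, using simplified regex stand-ins for Purview's built-in and custom sensitive information types (the pattern names and blocking messages are assumptions for illustration):

```python
# Sketch of interaction-time DLP: scan both the prompt and the candidate
# response for sensitive information types (SITs) before anything is returned.
import re

SIT_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "project_codename": re.compile(r"\bPROJECT-[A-Z]{4}\b"),  # custom SIT stand-in
}

def dlp_scan(text: str) -> list:
    """Return the names of every SIT detected in the text."""
    return [name for name, pat in SIT_PATTERNS.items() if pat.search(text)]

def answer(prompt: str, candidate_response: str) -> str:
    if dlp_scan(prompt):                  # prompt scanning
        return "Blocked: prompt contains sensitive data."
    if dlp_scan(candidate_response):      # response blocking
        return "Blocked: response would expose sensitive data."
    return candidate_response
```

The key structural difference from share-time DLP is where the check sits: inside the retrieval/generation loop rather than at the egress boundary.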

Supported sensitive information types

DLP integration protects against exposure of:

| Category | Examples |
| --- | --- |
| Financial | Credit card numbers, bank accounts |
| Government IDs | Social Security numbers, passport identification |
| Healthcare | HIPAA-protected information |
| Custom SITs | Organization-defined patterns |
| Regulatory data | GDPR, PCI-DSS patterns |
| Intellectual property | Custom-defined confidential content |

Sensitivity label enforcement

Organizations using Microsoft Purview Information Protection sensitivity labels can now configure Copilot to enforce label-based restrictions:

| Label | Restriction |
| --- | --- |
| Highly Confidential | Cannot be surfaced to users outside the designated audience |
| Internal Only | Hidden in summaries shared with external guests |
| Restricted | Requires explicit confirmation before inclusion |
| Custom classifications | Organization-defined rules |
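Enforcement of this kind can be modeled as a filter between retrieval and generation: content carrying a restrictive label is dropped before it ever reaches the model. The rule functions and label checks below are illustrative assumptions, not Microsoft's logic.

```python
# Sketch of label-based gating: filter retrieved documents against the
# invoking user's audience before Copilot can summarize them.
LABEL_RULES = {
    "Highly Confidential": lambda user: user["in_designated_audience"],
    "Internal Only":       lambda user: not user["is_external_guest"],
    "General":             lambda user: True,
}

def visible_to(user: dict, docs: list) -> list:
    """Keep only documents whose label rule admits this user."""
    return [d for d in docs if LABEL_RULES[d["label"]](user)]

docs = [
    {"name": "m-and-a-memo.docx", "label": "Highly Confidential"},
    {"name": "handbook.pdf",      "label": "General"},
]
guest = {"in_designated_audience": False, "is_external_guest": True}
```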

Data Security Posture Management for AI (DSPM)

| Feature | Function |
| --- | --- |
| Weekly risk assessment | Automatically scans top 100 SharePoint sites by usage |
| Custom assessments | Supplement with organization-defined scopes |
| Oversharing identification | Flags potential data exposure risks |
| Remediation guidance | Prioritized recommendations |
| Monitoring dashboard | Ongoing visibility |
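The weekly pass described above amounts to: rank sites by usage, take the top 100, and flag those that combine broad access with sensitive content. A sketch under those assumptions (the thresholds are illustrative, not Microsoft's):

```python
# Sketch of a DSPM-style weekly assessment over SharePoint site telemetry.
def weekly_assessment(sites: list, top_n: int = 100) -> list:
    """Flag busy sites where broad readership meets sensitive content."""
    busiest = sorted(sites, key=lambda s: s["weekly_visits"], reverse=True)[:top_n]
    return [s["url"] for s in busiest
            if s["readers"] > 500 and s["sensitive_files"] > 0]

sites = [
    {"url": "sites/finance", "weekly_visits": 900,  "readers": 4200, "sensitive_files": 37},
    {"url": "sites/social",  "weekly_visits": 1500, "readers": 8000, "sensitive_files": 0},
]
```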

Oversharing Assessment Dashboard

A new Copilot Oversharing Assessment tool in the Microsoft 365 admin center provides visibility into potential data exposure:

| Feature | Function |
| --- | --- |
| Risk identification | Sites with broad access containing sensitive content |
| User analysis | Users whose queries frequently surface external team content |
| Document tracking | Documents surfaced disproportionately relative to intended audience |
| Remediation guidance | Prioritized recommendations by risk level |
| Trend analysis | Track improvement over time |
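One plausible way to turn signals like these into a prioritized remediation queue is to score each site by access breadth times sensitive-content volume and sort descending. The scoring formula is an assumption for illustration, not the dashboard's actual algorithm.

```python
# Illustrative risk ranking: breadth of access x volume of sensitive content.
def risk_score(site: dict) -> int:
    return site["readers"] * site["sensitive_files"]

def remediation_queue(sites: list) -> list:
    """Highest-risk sites first."""
    return sorted(sites, key=risk_score, reverse=True)

sites = [
    {"url": "sites/all-hands", "readers": 9000, "sensitive_files": 2},
    {"url": "sites/payroll",   "readers": 1200, "sensitive_files": 150},
    {"url": "sites/eng-wiki",  "readers": 300,  "sensitive_files": 1},
]
queue = [s["url"] for s in remediation_queue(sites)]
```

Note how a small, heavily sensitive site (payroll) outranks a huge but mostly benign one (all-hands): volume of sensitive content, not raw popularity, drives the priority.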

SharePoint Advanced Management integration

| Control | Description |
| --- | --- |
| Data access governance reports | Identify potentially overshared sites |
| Restricted Content Discovery | Flag sites to hide from Copilot and org-wide search |
| Permission remediation | Targeted fixes for high-risk configurations |
| Restricted access control policy | Limit site access to specific groups |

Protect tab remediation options

| Option | Function |
| --- | --- |
| Restrict access by label | DLP policy prevents Copilot from summarizing labeled content |
| Restrict all items | SharePoint Restricted Content Discovery hides sites |
| Per-site controls | Granular site-level protection |

Required admin roles

| Role | Permissions |
| --- | --- |
| Entra AI Admin | Manage all Microsoft 365 Copilot and AI enterprise services |
| Purview Data Security AI Admin | Edit DLP policies, view AI content in DSPM |
| Global Administrator | Full access to all controls |
| Compliance Administrator | DLP policy management |
| SharePoint Administrator | Site-level controls and RCD |

How security leaders are reacting

The announcement has been generally well received, though many security leaders noted that the controls arrive later than they would have liked.

“These controls should have been available at launch. But credit to Microsoft for building them into the platform rather than expecting customers to solve the problem with third-party tools alone. The DLP integration is particularly important because it means you do not have to fix every permission issue before rolling out Copilot safely.” — Geoff Belknap, security advisor and former LinkedIn CISO

Cloud Security Alliance guidance

The CSA published guidance noting that the new controls are most effective for organizations that have already invested in data classification and labeling. For enterprises that have not yet classified their data or deployed sensitivity labels, the controls provide limited immediate benefit.

| Readiness factor | Control effectiveness |
| --- | --- |
| Mature data classification | High |
| Sensitivity labels deployed | High |
| No classification program | Limited |
| Permission governance in place | Higher |
| Legacy permission debt | Lower initial impact |

Availability and licensing

| Feature | Requirement |
| --- | --- |
| DLP integration | Microsoft 365 E5 or Purview add-on |
| Sensitivity label enforcement | Microsoft 365 E5 or Purview add-on |
| Oversharing Assessment dashboard | Public preview (no E5 required) |
| Dashboard GA | Expected March 2026 |
| DSPM for AI | Purview license required |

Deployment guide recommendations

Microsoft’s Copilot Security Deployment Guide emphasizes a phased approach:

| Phase | Action | Timeline |
| --- | --- | --- |
| 1. Assess | Use the Oversharing Assessment to identify permission risks | Weeks 1-2 |
| 2. Remediate | Fix the highest-risk permission misconfigurations | Weeks 2-4 |
| 3. Classify | Deploy sensitivity labels on sensitive content | Weeks 3-6 |
| 4. Enforce | Enable DLP and sensitivity label policies | Weeks 5-8 |
| 5. Monitor | Use Purview audit logs and the admin dashboard | Continuous |

Implications beyond Microsoft

The Copilot security enhancements reflect a broader challenge facing every organization deploying AI assistants:

| Vendor | Product | Same risk? |
| --- | --- | --- |
| Google | Gemini for Workspace | Yes |
| Salesforce | Einstein Copilot | Yes |
| Slack | Slack AI | Yes |
| Notion | Notion AI | Yes |
| Others | Enterprise AI assistants | Yes |

The underlying issue—that most enterprise environments carry significant permission debt accumulated over decades of collaborative work—is not something any single vendor can solve with technology alone. Security practitioners have emphasized that AI deployment must be accompanied by data governance maturation.

Scale of impact

| Metric | Value |
| --- | --- |
| Microsoft 365 paid seats | 400+ million globally |
| Copilot for Microsoft 365 licenses | Tens of millions |
| Enterprise deployments | Thousands of organizations |

The security controls announced this week will define baseline expectations for how AI assistants should handle sensitive enterprise data across the industry.

Recommendations

For organizations deploying Copilot

| Priority | Action |
| --- | --- |
| Immediate | Run the Oversharing Assessment |
| Immediate | Review DSPM for AI findings |
| High | Enable DLP policies for Copilot prompts |
| High | Deploy sensitivity labels on sensitive content |
| Medium | Review SharePoint site permissions |
| Medium | Configure Restricted Content Discovery |
| Ongoing | Monitor Copilot access patterns in Purview |

For organizations planning deployment

| Step | Consideration |
| --- | --- |
| Pre-deployment | Complete data classification project |
| Pre-deployment | Remediate obvious permission issues |
| Pre-deployment | Deploy sensitivity labels |
| Pilot | Start with security-conscious teams |
| Pilot | Monitor for unexpected data surfacing |
| Expand | Roll out with DLP enforcement active |

Data governance prerequisites

| Requirement | Importance |
| --- | --- |
| Data classification taxonomy | Critical |
| Sensitivity label deployment | Critical |
| Permission review process | High |
| Site ownership accountability | High |
| Regular access reviews | Medium |

Context

The controls address a legitimate concern, but organizations should not assume enabling them eliminates oversharing risk. The fundamental work of data governance—classification, access reviews, and least privilege—remains essential.

| Reality | Implication |
| --- | --- |
| Controls are reactive | Prevention requires governance |
| Labels must exist to work | Classification investment required |
| Permissions still matter | Least privilege remains the goal |
| Monitoring is ongoing | Not a one-time fix |

Microsoft’s release of these controls acknowledges that AI assistants fundamentally change the risk calculus for enterprise data. What was previously theoretical exposure (a user could access a file if they knew where to look) becomes practical exposure (Copilot finds and surfaces that file automatically).

Organizations that have invested in data governance are well-positioned to benefit from these controls. Those that haven’t face a choice: defer AI adoption until governance catches up, or accept elevated risk during deployment. The new tools at least provide visibility into that risk, which is more than organizations had before.