Editor's Synopsis
Microsoft disclosed a bug in Microsoft 365 Copilot (tracked as CW1226324) that allowed the AI assistant to summarize confidential emails in violation of Data Loss Prevention (DLP) policies. Beginning January 21, 2026, sensitivity-labeled emails in users' Sent Items and Drafts folders were processed by Copilot Chat without the permission checks those labels are meant to enforce.
The vulnerability is significant because it undermines the trust organizations place in sensitivity labels and DLP controls to protect confidential information. Microsoft has since fixed the issue, but the incident highlights the ongoing challenge of integrating AI assistants like Copilot with existing enterprise security frameworks, where a single bug can expose sensitive data at scale.