Microsoft 365 Copilot Chat Bug Exposes Confidential Emails
Microsoft has confirmed a significant security flaw in its Microsoft 365 Copilot Chat feature. The bug allowed the artificial intelligence assistant to access, analyze, and summarize emails marked as confidential, exposing sensitive information for some users. The discovery has raised concerns among individuals and organizations that rely on Microsoft’s AI tools for productivity.
The Unintended Reach of AI: Confidential Emails Compromised
Reports from users and administrators indicated that Copilot could process messages in the “Sent” and “Drafts” folders, including emails explicitly designated as confidential. The issue occurred even in organizations with robust Data Loss Prevention (DLP) policies enabled. Because DLP mechanisms exist precisely to stop sensitive data from being processed or leaked by AI tools and other systems, this oversight is particularly concerning.
Tracked under the identifier CW1226324, the bug had reportedly been active for several weeks, meaning an unknown number of users may have unknowingly exposed restricted content to the AI.
Microsoft Acknowledges the Flaw and Deploys Fixes
Responding to the widespread reports, Microsoft officially confirmed the vulnerability, attributed it to a likely coding error, and announced that work on a patch was underway. Initial updates are already rolling out to a select group of customers.
While Microsoft has not disclosed how many customers were affected or given a definitive timeline for rolling the fix out to all users, the company has advised administrators to closely monitor Copilot’s activity from the Microsoft 365 admin center and confirm the assistant no longer accesses content that requires protection. This proactive monitoring is crucial until a universal patch is fully deployed.
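For administrators who want a programmatic view rather than manually checking the admin center, one option is to pull Copilot-related events from the unified audit log via the Office 365 Management Activity API. The sketch below is a minimal Python example, assuming an app registration with the ActivityFeed.Read permission and an already-started Audit.General subscription; the tenant and credential values are placeholders, and the “CopilotInteraction” operation name is an assumption about how Copilot events appear in your tenant’s audit records.

```python
# Minimal sketch: list recent Copilot audit records from the unified
# audit log via the Office 365 Management Activity API.
# Assumptions: TENANT_ID/CLIENT_ID/CLIENT_SECRET are placeholders for a
# real app registration with ActivityFeed.Read, an Audit.General
# subscription has already been started for the tenant, and Copilot
# events surface with Operation == "CopilotInteraction".
import msal
import requests

TENANT_ID = "your-tenant-id"          # placeholder
CLIENT_ID = "your-app-client-id"      # placeholder
CLIENT_SECRET = "your-client-secret"  # placeholder

AUTHORITY = f"https://login.microsoftonline.com/{TENANT_ID}"
RESOURCE = "https://manage.office.com"

# Acquire an app-only token for the Management Activity API.
app = msal.ConfidentialClientApplication(
    CLIENT_ID, authority=AUTHORITY, client_credential=CLIENT_SECRET
)
token = app.acquire_token_for_client(scopes=[f"{RESOURCE}/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# List available audit content blobs for the Audit.General content type.
feed_url = (
    f"{RESOURCE}/api/v1.0/{TENANT_ID}/activity/feed/"
    "subscriptions/content?contentType=Audit.General"
)
for blob in requests.get(feed_url, headers=headers).json():
    # Each blob URI resolves to a JSON array of individual audit records.
    for record in requests.get(blob["contentUri"], headers=headers).json():
        if record.get("Operation") == "CopilotInteraction":
            print(record.get("CreationTime"), record.get("UserId"))
```

A script like this can run on a schedule and alert when Copilot interactions touch mailboxes or labels that should be out of bounds, rather than relying on periodic manual review.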
Understanding Microsoft 365 Copilot and Data Loss Prevention (DLP)
For those unfamiliar, Microsoft 365 Copilot is an AI-powered assistant integrated across Microsoft 365 applications like Outlook, Word, Excel, and Teams. It aims to boost productivity by automating tasks, summarizing information, and assisting with content creation.
Data Loss Prevention (DLP) refers to a set of tools and processes designed to ensure that sensitive data is not lost, misused, or accessed by unauthorized users. In an enterprise context, DLP policies are critical for protecting intellectual property, financial data, and personally identifiable information (PII) by preventing its transfer, use, or access in ways that violate organizational security policies. The failure of Copilot to adhere to these policies, even when activated, represents a significant security loophole.
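To make the concept concrete, here is a deliberately simplified Python sketch of the kind of gate a DLP policy applies before content reaches an AI tool. This is not Microsoft’s implementation: Purview DLP uses managed classifiers and sensitivity labels rather than ad hoc regexes, and the rule names and gate_for_ai helper below are hypothetical, for illustration only.

```python
# Toy illustration of DLP-style gating: block AI processing when the
# content matches any sensitive-data rule. Patterns and helper names
# are hypothetical; real DLP engines are far more sophisticated.
import re

SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "confidential_marker": re.compile(r"\bconfidential\b", re.IGNORECASE),
}

def dlp_violations(text: str) -> list[str]:
    """Return the names of all sensitive-content rules the text trips."""
    return [name for name, pat in SENSITIVE_PATTERNS.items()
            if pat.search(text)]

def gate_for_ai(email_body: str) -> str | None:
    """Refuse to hand content to the assistant if any rule matches —
    the behavior Copilot was expected to honor but, per this bug, did not."""
    violations = dlp_violations(email_body)
    if violations:
        print(f"Blocked: matched DLP rules {violations}")
        return None
    return email_body  # safe to pass along

# A draft marked confidential should never reach the AI.
draft = "CONFIDENTIAL: Q3 acquisition terms, card 4111 1111 1111 1111"
assert gate_for_ai(draft) is None
```

The key design point is that the gate sits in front of the AI tool: content is evaluated against policy before it is ever processed, which is exactly the guarantee this bug broke.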
This incident underscores the ongoing challenges and responsibilities associated with integrating powerful AI tools into enterprise environments, particularly regarding data privacy and security. Organizations are continually balancing the benefits of AI-driven productivity with the imperative to safeguard sensitive information.
Frequently Asked Questions (FAQ)
What was the bug in Microsoft 365 Copilot Chat?
A bug in Microsoft 365 Copilot Chat allowed the AI assistant to analyze and summarize emails, including those marked as confidential, even when Data Loss Prevention (DLP) policies were enabled.
Which emails were affected by the bug?
The bug primarily affected emails located in “Sent” and “Drafts” folders, specifically those designated as confidential.
Did the bug occur even with DLP policies enabled?
Yes, the bug was reported to occur even in organizations where Data Loss Prevention (DLP) policies were actively configured to prevent the processing of sensitive data by AI tools.
What is Microsoft doing to fix this issue?
Microsoft has confirmed the bug, attributed it to a likely coding error, and is actively working on a fix. Initial updates are already being rolled out to a limited group of customers.
What should users or administrators do now?
Administrators are advised to monitor their Microsoft 365 admin center to observe Copilot’s activity and ensure it is not accessing protected content. Users should remain vigilant about the types of information they process with AI tools.