Microsoft said a software bug in its Office products allowed its Copilot AI assistant to access and summarize confidential customer emails without authorization. The issue affected Copilot Chat, an AI feature available to paying Microsoft 365 customers in applications such as Word, Excel, and PowerPoint.
According to Microsoft, the bug caused draft and sent emails labeled as confidential to be incorrectly processed by Copilot Chat, even when customers had data loss prevention (DLP) policies in place. The issue, tracked internally as CW1226324, had been active since January and was first reported by Bleeping Computer. Microsoft said it began rolling out a fix earlier this month but has not disclosed how many customers were affected.
The incident highlights growing concerns about how generative AI tools handle sensitive enterprise data. AI assistants like Copilot depend on broad access to user content to generate summaries and insights, which raises the risk of unintended exposure when safeguards fail.
The disclosure follows a recent decision by the European Parliament to block built-in AI tools on lawmakers’ devices over concerns that confidential correspondence could be uploaded to cloud-based AI systems. The move reflects broader scrutiny of AI deployment within government and regulated environments.