Technology

Microsoft Copilot bug exposes confidential emails to AI

Error in Microsoft 365 Copilot Chat let the AI read sensitive emails in users' drafts and sent folders

February 21, 2026

Microsoft has confirmed a bug in its AI assistant, Microsoft 365 Copilot Chat, which briefly exposed confidential emails to the tool. The issue affected messages in users’ drafts and sent folders, including those labelled as confidential, although no unauthorised access occurred, the company said. The bug has now been fixed with a global configuration update.

Copilot Chat allows enterprise users to summarise emails and chats within Outlook and Teams using generative AI. A recent code issue caused the AI to process content from draft and sent emails despite sensitivity labels and data loss prevention policies, according to Microsoft. “This behaviour did not meet our intended Copilot experience,” a Microsoft spokesperson said.
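Conceptually, safeguards like sensitivity labels and data loss prevention (DLP) policies work by filtering protected content out before it ever reaches the model. The minimal Python sketch below is illustrative only and uses hypothetical names (Message, is_ai_eligible, summarise_with_ai); it is not Microsoft's implementation, merely the kind of eligibility check that, per Microsoft's description, was bypassed for draft and sent items.

    from dataclasses import dataclass

    # Hypothetical labels and folders; real deployments use Microsoft Purview
    # sensitivity labels, which are more granular than shown here.
    BLOCKED_LABELS = {"confidential", "highly confidential"}
    BLOCKED_FOLDERS = {"drafts", "sent"}  # the folders the bug wrongly included

    @dataclass
    class Message:
        subject: str
        body: str
        folder: str              # e.g. "inbox", "drafts", "sent"
        sensitivity: str | None  # e.g. "confidential", or None if unlabelled
        dlp_restricted: bool     # True if a data loss prevention policy applies

    def is_ai_eligible(msg: Message) -> bool:
        """Return True only if the message may be passed to the AI assistant."""
        if msg.sensitivity in BLOCKED_LABELS:
            return False
        if msg.dlp_restricted:
            return False
        if msg.folder in BLOCKED_FOLDERS:
            return False
        return True

    def summarise_with_ai(msg: Message) -> str:
        # Stand-in for a call to a generative model.
        return f"Summary of: {msg.subject}"

    def summarise_mailbox(messages: list[Message]) -> list[str]:
        # Filter before the model sees anything; the reported bug effectively
        # skipped checks like these for items in drafts and sent folders.
        return [summarise_with_ai(m) for m in messages if is_ai_eligible(m)]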

Experts warn that the rapid rollout of AI features increases the risk of mistakes. Nader Henein, a data protection and AI governance analyst at Gartner, said that errors of this kind are unavoidable given the pace at which new AI technology is being developed.

Alan Woodward, a professor and cybersecurity expert at the University of Surrey, said AI systems should be private by default and opt-in, arguing that data leaks are to be expected while AI tools are developed at such speed.

The problem was first reported by BleepingComputer and affected some NHS employees in England. Microsoft confirmed that no patient data was exposed and that the bug did not allow anyone to access information they could not otherwise view. The fix, a configuration update, has been rolled out globally to business users.