A Microsoft 365 Copilot bug allowed the AI assistant to read confidential emails despite Data Loss Prevention policies designed to protect sensitive information.
Microsoft has downplayed the issue in official communications, stating that the summaries of the confidential emails were not exposed to anyone who did not already have access to the messages in ...
Microsoft has acknowledged an error causing its AI work assistant to access and summarise some users' confidential emails by mistake. The tech giant has pushed ...
Microsoft's CW1226324 advisory confirms Copilot bypassed sensitivity labels and DLP policies for four weeks. Combined with ...
A code error in Copilot Chat’s “Work” tab allowed the AI to pull emails from users’ Sent Items and Drafts folders — even when those emails carried confidentiality labels and were covered by DLP rules explicitly ...
A code bug blew past every security label in the book… and exposed the fatal flaw in how we govern AI.
Microsoft is expanding data loss prevention (DLP) controls to block the Microsoft 365 Copilot AI assistant from processing confidential Word, Excel, and PowerPoint documents, regardless of their ...
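The control Microsoft is describing amounts to a label-aware gate in front of the AI assistant: items carrying a blocked sensitivity label are filtered out before the model ever sees them. A minimal sketch of that idea is below — the `Message` class, label names, and `copilot_eligible` filter are illustrative assumptions, not Microsoft's actual API or policy schema.

```python
# Hypothetical sketch of label-based filtering before AI processing.
# Class names, label values, and the filter function are illustrative,
# not part of Microsoft 365 or the Graph API.
from dataclasses import dataclass
from typing import Optional

# Labels that a DLP policy might designate as off-limits to the assistant.
BLOCKED_LABELS = {"Confidential", "Highly Confidential"}

@dataclass
class Message:
    subject: str
    body: str
    sensitivity_label: Optional[str] = None  # None = unlabeled

def copilot_eligible(msg: Message) -> bool:
    """Return True only when the item carries no blocked sensitivity label."""
    return msg.sensitivity_label not in BLOCKED_LABELS

# The bug described above is equivalent to skipping this filter entirely:
inbox = [
    Message("Q3 forecast", "internal figures", "Confidential"),
    Message("Team lunch", "Friday at noon"),
]
visible_to_assistant = [m for m in inbox if copilot_eligible(m)]
```

The design point is enforcement location: the filter must run in the retrieval path (before content reaches the model), since a label checked only at display time — as in the Copilot incident — leaves summaries of protected content exposed.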