Microsoft Copilot Studio Exploit Leaks Sensitive Cloud Data


A server-side request forgery (SSRF) bug in Microsoft's tool for building custom AI chatbots potentially exposed information about internal services across multiple tenants in cloud environments.

Researchers exploited a vulnerability in Microsoft's Copilot Studio that allowed them to make external HTTP requests and access sensitive information about internal services within a cloud environment, with potential impact across multiple tenants. Microsoft responded quickly to Tenable's notification of the flaw, which has since been fully mitigated; no action is required on the part of Copilot Studio users, the company said in its security advisory. Even so, the flaw is a cautionary tale for Copilot Studio users: attackers could abuse the tool's HTTP-request feature to escalate their access to cloud data and resources.
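The classic SSRF target in a cloud environment is the instance metadata service at the link-local address 169.254.169.254, which can hand out credentials to anything running on the host. As an illustration only (this is not Microsoft's actual fix, and the function name is hypothetical), a minimal defensive sketch for any feature that fetches user-supplied URLs is to resolve the target and refuse private, loopback, and link-local addresses before making the request:

```python
# Hedged sketch of an SSRF guard: resolve a user-supplied URL and reject it
# if any resolved address is internal (private, loopback, link-local, or
# reserved) rather than publicly routable. Assumed helper name: is_safe_url.
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Return True only if every address the host resolves to is public."""
    host = urlparse(url).hostname
    if not host:
        return False
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    for info in infos:
        addr = ipaddress.ip_address(info[4][0])
        # Blocks e.g. the cloud metadata endpoint 169.254.169.254
        # (link-local) and 127.0.0.1 (loopback).
        if (addr.is_private or addr.is_link_local
                or addr.is_loopback or addr.is_reserved):
            return False
    return True
```

Note that resolve-then-check logic like this can still be raced via DNS rebinding; production guards typically pin the resolved address for the actual connection as well.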
