Microsoft AI Division Causes a Leak of 38TB of Data
Microsoft’s AI research division inadvertently exposed 38 terabytes of sensitive data, including secret keys, passwords, and internal Microsoft Teams messages. The mishap occurred when the company published a link to an Azure storage container of open-source training data in a GitHub repository, putting the internal data within reach of anyone who found the link.
While attempting to share open-source code and AI models for image recognition, Microsoft accidentally granted access to a much broader range of data stored on Azure Storage. According to Wiz, the cloud security startup that discovered the oversight, the URL was misconfigured to allow “full control” rather than “read-only” permissions. As a result, anyone with the link could potentially manipulate, delete, or inject malicious content into Microsoft’s storage account.
The data exposure included the personal backups of two Microsoft employees’ computers and more than 30,000 internal Teams messages involving hundreds of employees, alongside passwords to Microsoft services and other secret keys. All of this was made accessible by an overly permissive shared access signature (SAS) token embedded in the Azure Storage URL.
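To see why the token mattered, it helps to know what a SAS token encodes: the permissions it grants (the `sp` query parameter, e.g. `r` for read-only versus `racwdl` for full control), an expiry time (`se`), and an HMAC-SHA256 signature (`sig`) computed with the storage account key, which makes the grant tamper-proof but also irrevocable until it expires. The sketch below is a deliberately simplified illustration of that structure, not Azure’s exact string-to-sign format; the key, container path, and function name are all made up for the example.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

def make_sas_query(account_key_b64: str, resource: str,
                   permissions: str, lifetime_hours: int) -> str:
    """Build a SAS-style query string (simplified illustration only).

    The signature covers the permissions, expiry, and resource path,
    so none of them can be changed without the account key.
    """
    expiry = (datetime.now(timezone.utc)
              + timedelta(hours=lifetime_hours)).strftime("%Y-%m-%dT%H:%MZ")
    string_to_sign = "\n".join([permissions, expiry, resource])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return f"sp={permissions}&se={expiry}&sig={sig}"

demo_key = base64.b64encode(b"demo-account-key").decode()

# A safely scoped token: read-only, expiring in one hour.
safe = make_sas_query(demo_key, "/container/ai-models", "r", 1)

# The kind of grant Wiz described: full control (read, add, create,
# write, delete, list) with a far-off expiry.
risky = make_sas_query(demo_key, "/container/ai-models", "racwdl", 24 * 365)
```

Because the signature is self-contained, whoever holds the `risky` string holds those rights until the expiry passes or the underlying account key is rotated, which is why Microsoft’s fix was to revoke the token and why scanners now look for `sp` and `se` values like these in public repositories.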
After Wiz notified Microsoft about the security lapse on June 22, the tech giant took just two days to revoke the problematic SAS token. Microsoft has since expanded GitHub’s secret scanning service to monitor for SAS tokens with overly permissive expirations or privileges. The company emphasized that no customer data was compromised and no other internal services were affected.
Wiz CTO Ami Luttwak pointed out that as companies like Microsoft strive to develop AI solutions, they need to impose additional security checks and safeguards. Given the vast amounts of data that development teams need to manipulate and share, this incident serves as a crucial reminder to tech companies to bolster their data security measures.
This security lapse at Microsoft is not isolated. In July 2022, JUMPSEC Labs highlighted similar risks with misconfigured Azure storage accounts. And just two weeks prior to this incident, Microsoft revealed that its systems had been compromised by hackers based in China. As technology evolves, it is increasingly essential for organizations to remain vigilant about data security.