Microsoft Corp MSFT recently addressed a security incident involving a Microsoft employee who inadvertently shared a URL with an overly permissive Shared Access Signature (SAS) token in a public GitHub repository.

The incident was discovered and reported by security researchers at Wiz.io, who were able to use the token to access data stored in the account.

According to Wiz Research, while publishing a bucket of open-source training data on GitHub, Microsoft’s AI research team accidentally exposed 38 terabytes of additional private data — including a disk backup of two employees’ workstations, all caused by one misconfigured SAS token.

The backup included secrets, private keys, passwords, and over 30,000 internal Microsoft Teams messages.

Microsoft stated that the SAS token, which grants scoped access to Azure Storage resources, was mistakenly included in a blob store URL while the employee contributed to open-source AI learning models on GitHub.
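A SAS token is a signed query string appended to a storage URL; its scope, permissions, and expiry are fixed when it is generated, so an overly broad token effectively turns a shareable link into a standing credential. As a minimal sketch of what a narrowly scoped token looks like, assuming the Azure Storage Python SDK (azure-storage-blob) and hypothetical account, container, and blob names:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Hypothetical names for illustration only; never commit a real account key.
ACCOUNT_NAME = "exampleaccount"
ACCOUNT_KEY = "<storage-account-key>"

# Mint a SAS token limited to one blob, read-only, valid for one hour.
# An overly permissive token, by contrast, might cover an entire account
# with full permissions and a distant expiry.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="open-datasets",
    blob_name="training-data.tar.gz",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# The token rides along as a query string on the blob's URL.
shareable_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"open-datasets/training-data.tar.gz?{sas_token}"
)
print(shareable_url)
```

One design consequence: a SAS signed with an account key cannot be individually revoked, so invalidating a leaked token means rotating the account key or removing any stored access policy it references.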

Microsoft emphasizes that there was no inherent security issue with Azure Storage or the SAS token feature, and says it is making ongoing improvements to further harden SAS tokens and Azure services.

Wiz.io reported the issue to the Microsoft Security Response Center (MSRC), which promptly revoked the SAS token and restricted external access to the storage account. After a thorough investigation, Microsoft confirmed the exposure posed no risk to customers.

Microsoft underscores that the exposed information was specific to two former employees and their workstation profiles and did not affect customer data or other Microsoft services. Customers do not need to take any action.

Some leaders at companies like Alphabet Inc's GOOG GOOGL Google see open-source software as an existential threat to their business.

Price Action: MSFT shares traded lower by 0.31% at $328.02 premarket at last check on Tuesday.

Disclaimer: This content was partially produced with the help of AI tools and was reviewed and published by Benzinga editors.
