Microsoft researchers leaked 38TB of sensitive data through a public GitHub repository while working on open-source AI learning models. The exposure began in July 2020, yet white hat hackers only discovered and reported it on June 22nd, 2023.
In response, Microsoft issued an advisory stating:
No customer data was exposed, and no other Microsoft services were put at risk because of this issue. Customers do not need to take any additional action to remain secure.
What Caused the 38TB Microsoft Data Leakage?
In its advisory, Microsoft stated that human error caused the data leakage. An employee shared a URL for a misconfigured Azure Blob Storage container in a public GitHub repository. The URL included a Shared Access Signature (SAS) token for the internal storage account, granting anyone who followed the link access to the stored data.
The incident happened while a Microsoft researcher was working on an open-source AI learning model. Microsoft found no vulnerability in Azure Storage or in the SAS token feature itself.
Shared Access Signatures (SAS) allow you to safely delegate access to data within a storage account. Because SAS URLs grant access to data, they should be treated as application secrets. Applying the principle of least privilege (POLP), you should only share a SAS URL with users who genuinely need access to that specific storage account.
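As an illustration, here is a minimal sketch of how a narrowly scoped SAS token can be generated with the Azure Storage Python SDK. The account, container, and blob names are hypothetical, and the permissions and lifetime you choose should follow your own access requirements.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Hypothetical names used only for illustration.
account_name = "examplestorageaccount"
container_name = "research-data"
blob_name = "dataset.zip"
account_key = "<storage-account-key>"  # keep this out of source control

# Least privilege: read-only access to a single blob, expiring after one hour.
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

blob_url = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{blob_name}?{sas_token}"
)
```

A token like this exposes a single object for a limited window. Contrast that with a long-lived token scoped to an entire storage account: once the URL leaks, everything it covers stays exposed until the token expires or the signing key is rotated.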
Passwords and Secret Keys Among the Leaked Data
BleepingComputer reported that the 38TB of compromised data included passwords for Microsoft services and secret keys. 30,000 internal Microsoft Teams messages written by 359 Microsoft employees were also exposed.
According to the software giant, the exposed data only included backups of two former employees’ workstation profiles and Microsoft Teams conversations.
Microsoft insisted that none of its customer data was leaked and that no other internal services were at risk due to the incident.
Data exposed in this storage account included backups of two former employees’ workstation profiles and internal Microsoft Teams messages of these two employees with their colleagues.
Security Measures to Prevent Data Leakage
Verizon’s 2023 Data Breach Investigations Report found that 74% of all breaches involve the human element. Human error, alert fatigue, and an overcrowded task list can all lead to security incidents that put data and assets at risk.
Proper training is essential to reduce the chance of a cyberattack or a similar incident, but human error will always remain a risk factor. So, to avoid data leakage and unauthorized access to important assets, you also need to follow certain cybersecurity best practices. Dragoș Roșioru, XDR Team Lead Support at Heimdal, recommends:
- Enforce Multi-Factor Authentication.
- Enable Azure Monitor and Azure Storage Logs.
- Apply the principle of least privilege.
- Keep track of assets and privileges and regularly audit permissions.
- Encrypt all sensitive data.
- Follow best practices when using SAS tokens: use them for a specific purpose only and set an expiration date (see the sketch after this list).
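Several of these recommendations come together in how SAS tokens are issued. As a hedged example, and assuming the storage account is configured for Azure AD authentication, the sketch below signs the token with a short-lived user delegation key instead of a long-lived account key, so it inherits both identity controls (including MFA) and a tight expiry. The account, container, and blob names are again hypothetical.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobSasPermissions,
    BlobServiceClient,
    generate_blob_sas,
)

account_name = "examplestorageaccount"  # hypothetical
account_url = f"https://{account_name}.blob.core.windows.net"

# Authenticate with Azure AD (MFA-backed) rather than an account key.
credential = DefaultAzureCredential()
service_client = BlobServiceClient(account_url=account_url, credential=credential)

# Request a short-lived user delegation key; SAS tokens signed with it
# stop working once the key expires or the identity's access is revoked.
now = datetime.now(timezone.utc)
delegation_key = service_client.get_user_delegation_key(
    key_start_time=now,
    key_expiry_time=now + timedelta(hours=1),
)

sas_token = generate_blob_sas(
    account_name=account_name,
    container_name="research-data",    # hypothetical
    blob_name="dataset.zip",           # hypothetical
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),
    expiry=now + timedelta(hours=1),
)
```

Because the token is tied to an Azure AD identity rather than the storage account key, its use shows up in the account's resource logs and access can be revoked centrally, which is far harder to do with account-key SAS tokens.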
If you liked this article, follow us on LinkedIn, Twitter, Facebook, and YouTube for more cybersecurity news and topics.