The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
The exposed data included backups of personal information belonging to Microsoft employees, including passwords for Microsoft services, secret keys, and an archive of over 30,000 internal Microsoft Teams messages originating from 359 Microsoft employees.
In an advisory published Monday, the Microsoft Security Response Center (MSRC) team said that no customer data was exposed and that no other internal services were put at risk by the incident.
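The article doesn't say how the credentials ended up in the repository, but leaks like this are exactly what secret scanning is meant to catch before anything goes public. Below is a minimal, illustrative sketch of such a scan in Python; the patterns and repo layout are assumptions for illustration only, not details from this incident, and real tools (gitleaks, trufflehog, GitHub's own secret scanning) use far more precise, provider-specific rules.

```python
import re
import pathlib

# Rough, illustrative patterns for things that should never land in a public repo.
PATTERNS = {
    "password assignment": re.compile(r"password\s*[:=]\s*\S+", re.IGNORECASE),
    "private key header": re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),
    "storage access signature": re.compile(r"\bsig=[A-Za-z0-9%+/=]{20,}"),
}

def scan(repo_root: str) -> None:
    """Walk a working tree and report lines that look like secrets."""
    for path in pathlib.Path(repo_root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            for name, pattern in PATTERNS.items():
                if pattern.search(line):
                    print(f"{path}:{lineno}: possible {name}")

if __name__ == "__main__":
    scan(".")  # run from the repository root before publishing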
Wait, they stored passwords in plain text?
Possibly, or as a weak hash.
Always have done so.
🧑‍🚀🔫
This is like the evolution of the “loss” meme. Gave me a chuckle.
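For anyone wondering what "plain text vs. a weak hash" actually means in practice, here's a minimal standard-library Python sketch (nothing to do with the leaked data itself) contrasting the two with what password storage is supposed to look like: a salted, deliberately slow key-derivation function.

```python
import hashlib
import os

password = "hunter2"

# Plain text: anyone who reads the backup reads the password.
stored_plaintext = password

# "Weak hash": a fast, unsalted digest. Trivially cracked with rainbow
# tables or a GPU once the file leaks.
stored_weak = hashlib.md5(password.encode()).hexdigest()

# What it should look like: a slow, salted KDF (PBKDF2 from the standard
# library here). The salt and iteration count are stored alongside the
# hash so the password can be verified but not cheaply reversed.
salt = os.urandom(16)
stored_strong = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)

print(stored_weak)
print(salt.hex(), stored_strong.hex())
```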
Sure, we’ll just take your word for it, buddies. Cheers. /laughs in Linux
You can use Linux and still have a Microsoft account.
Can, but shouldn’t. I have a work related Teams account, and one where I tried to rent a Windows VM for a consulting job. That’s it though - no private data to get leaked. The work conversations would suck though, but I’ll happily remind my boss et al why using Teams is a shitty idea in the first place.
Microsoft owns GitHub. The blast radius for this could be severe.
Yeah, but the naivety of people believing in secure clouds needs to die. So if this helps, I’m all for it.