The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
Unfortunately, nobody in charge has faced consequences for their decision to save a few theoretical nickels, so far. But then again, a lot of software/IT-related stuff would look completely different if anybody did.
Yeah, under the GDPR you could theoretically be sued for using inappropriate technologies, but unless a proper expert committee officially declares Azure et al. unsalvageable, you can always claim you thought you were using safe technologies.