Microsoft’s AI research team accidentally exposed 38 terabytes of private data through a Shared Access Signature (SAS) link it published in a public GitHub repository, according to a report by Wiz Research. The leak began in July 2020, while the Microsoft AI research division was contributing open-source AI learning models to the repository. The root cause was an “overly permissive” SAS token that was accidentally disclosed, and Microsoft has had to clean up the resulting data leak, learning an important lesson about scoping such tokens.
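For context, here is a minimal sketch of what a narrowly scoped SAS token looks like when generated with the Azure Python SDK (azure-storage-blob). The account, container, and blob names below are placeholders, not anything from the incident; the point is the contrast between a read-only, short-lived token for a single blob and the full-control, long-lived account-level SAS described in the report.

```python
# Hedged sketch: issuing a narrowly scoped SAS token with azure-storage-blob.
# All names and the key are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT_NAME = "examplestorageaccount"   # hypothetical storage account
ACCOUNT_KEY = "<account-key>"            # never commit a real key to a repo
CONTAINER = "open-models"                # hypothetical container
BLOB = "model-weights.bin"               # hypothetical blob

# Read-only access to a single blob, expiring in 24 hours -- the opposite of
# a full-control account SAS with a far-future expiry.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)

# The shareable URL carries the token as a query string; anyone holding it
# gets exactly the permissions granted above, no more.
url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"{CONTAINER}/{BLOB}?{sas_token}"
)
print(url)
```

Because a SAS token embeds its own permissions and expiry, a link like this can circulate freely (for example, in a README) without granting access beyond the single blob for the single day it is valid.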