Exposed: Microsoft's AI Training Data Uncovered

2023-09-20 22:11:02


Microsoft is dealing with a major data leak caused by a mistake by its AI researchers. While publishing training data so that other researchers could train AI models for image recognition, the team inadvertently exposed sensitive internal data. The leaked information included secrets, private keys, passwords, and more than 30,000 internal Microsoft Teams messages. Cloud security company Wiz was the first to discover the exposure.


Microsoft's AI Team Accidentally Exposes 38 TB of Private Data through SAS Token


Microsoft has confirmed that the exposed information was limited to data from two former employees and their workstations, and that no customer data or other Microsoft services were compromised. The company says customers do not need to take any action, but it emphasizes the importance of handling SAS tokens carefully to minimize the risk of unintended access or misuse.


Ensuring Data Security Amid the AI Revolution


As Wiz points out, incidents like this may become more common as AI grows in prominence. Companies face new risks as they apply AI at a wider scale: more engineers will handle large amounts of training data, which calls for additional security checks and safeguards.

SAS tokens pose a significant security risk because they can grant broad, long-lived access to storage and are shared with external parties that are hard to track. To address this, companies should adopt a Service SAS with a Stored Access Policy. This approach ties the SAS token to a server-side policy, enabling centralized management and the ability to revoke access. Plain Account SAS tokens offer no such governance, so an over-permissive token created by mistake can go unnoticed and expose vast amounts of data.
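As a rough illustration of that recommendation, the sketch below uses the azure-storage-blob Python SDK to attach a Stored Access Policy to a container and then issue a Service SAS bound to it. The account, container, key, and policy names are placeholders, not details from the incident.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccessPolicy,
    ContainerClient,
    ContainerSasPermissions,
    generate_container_sas,
)

# Placeholder credentials -- replace with your own storage account details.
ACCOUNT_NAME = "examplestorageaccount"
ACCOUNT_KEY = "<account-key>"
CONTAINER = "training-data"

container = ContainerClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    container_name=CONTAINER,
    credential=ACCOUNT_KEY,
)

# 1. Define a server-side Stored Access Policy: read-only, expires in 7 days.
policy = AccessPolicy(
    permission=ContainerSasPermissions(read=True, list=True),
    start=datetime.now(timezone.utc),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
container.set_container_access_policy(signed_identifiers={"readonly-share": policy})

# 2. Issue a Service SAS that references the policy instead of embedding
#    permissions and expiry directly in the token.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    account_key=ACCOUNT_KEY,
    policy_id="readonly-share",
)
print(f"Shareable URL: https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas_token}")

# 3. Revoking access later is a single server-side change: removing the
#    policy invalidates every SAS token that references it.
container.set_container_access_policy(signed_identifiers={})
```

Because the permissions and expiry live on the server rather than inside the token itself, an over-permissive or forgotten token can be revoked immediately without rotating the account key.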

Hopefully, Microsoft will learn from this incident and implement stronger security measures to prevent future data leaks. Other companies working with AI must likewise prioritize data security so that sensitive information does not fall into the wrong hands.

Thank you for reading this article. For more news and updates, check out our Home page. If you'd like to receive similar news, please subscribe to our Telegram channel: https://t.me/LifeRecharged.

