People like talking to AI — some, perhaps a little too much. According to contract workers for Meta who review people's interactions with the company's chatbots to improve its AI, users are all too willing to share personal and private information, including their real names, phone numbers, and email addresses, with Meta's AI.
Business Insider spoke with four contract workers hired through two recruitment platforms that supply human reviewers to help train AI on Meta's behalf. The contractors said unredacted personal data turned up far more often in the Meta projects they worked on than in similar projects for other Silicon Valley clients. According to these contractors, many users across Meta platforms such as Facebook and Instagram shared deeply personal details, talking to Meta's AI as they would to friends or even romantic partners, sending selfies and even "explicit photos."
To be clear, people's tendency to open up to AI chatbots is well documented, and Meta's practice of using human contractors to assess the quality of AI assistants in order to improve future interactions is hardly new. Back in 2019, The Guardian detailed how Apple contractors regularly heard highly sensitive information from Siri users, even though the company had no "specific procedures to deal with sensitive recordings" at the time. Similarly, Bloomberg reported that Amazon had thousands of employees and contractors around the world manually reviewing audio clips from Alexa users. Vice's Motherboard likewise reported on Microsoft contractors who listened to and reviewed audio recordings, which often included children's voices captured through accidental activations of their Xbox consoles.
But Meta is a different story, especially given its checkered record over the past decade when it comes to relying on third-party contractors and the company's lapses in data governance.
Meta's rocky record on user privacy
In 2018, The New York Times and The Guardian revealed how Cambridge Analytica, a political consulting firm backed by a Republican hedge-fund billionaire, exploited Facebook to harvest data from tens of millions of users without their consent. That data was used to profile U.S. voters and target them with political ads designed to help elect President Donald Trump in 2016. The scandal led to Facebook being fined $5 billion by the Federal Trade Commission (FTC), one of the largest privacy settlements in U.S. history.
The Cambridge Analytica scandal exposed broader problems with Facebook's developer platform, which allowed broad access to data with limited oversight. And internal documents leaked by whistleblower Frances Haugen in 2021 showed that Meta's leadership often prioritized growth and engagement over privacy and safety concerns.
Meta has also faced scrutiny over its use of contractors: in 2019, Bloomberg reported that Facebook paid contractors to transcribe users' audio chats without telling them how the audio had been obtained in the first place. (At the time, Facebook said the recordings came only from users who had opted into transcription services, adding that it had "paused" the practice.)
Facebook has spent years trying to rehabilitate its image: it rebranded as Meta in October 2021, framing the name change as an aspirational shift toward the "metaverse" rather than a response to the controversies surrounding misinformation, privacy, and platform safety. But Meta's legacy of data mishandling casts a long shadow. While using human reviewers to improve large language models (LLMs) is common industry practice at this point, the latest reporting on Meta's use of contractors, and the information those contractors say they can see, raises new questions about how data is handled by the parent company of the world's most popular social networks.
In a statement to Fortune, a Meta spokesperson said the company has "strict policies that govern personal data access for all employees and contractors."
"While we work with contractors to help improve training data quality, we intentionally limit what personal information they see, and we have processes and guardrails in place instructing them how to handle any such information they may encounter," the spokesperson said.
"For projects focused on AI personalization … contractors are permitted, in the context of their work, to access certain personal information in accordance with our publicly available privacy policies and AI terms. Regardless of the project, any unauthorized sharing or misuse of personal information is a violation of our data policies and we will take appropriate action."