Meta, the parent company of Instagram, apologized on Thursday after some users saw violent, graphic content in their Instagram Reels feeds. Meta attributed the problem to an error that the company says has been fixed.
"We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," a Meta spokesperson said in a statement to CNET. "We apologize for the mistake."
Meta went on to say that the incident was an error unrelated to any changes in the company's content policies. At the beginning of the year, Instagram made some major changes to its user and creator policies, but those changes did not specifically address the filtering of graphic or inappropriate content appearing in feeds.
Meta has made its own content moderation changes recently, dismantling its fact-checking department in favor of community-driven moderation. Amnesty International warned earlier this month that Meta's changes could raise the risk of fueling violence.
Read more: Instagram May Spin Off Reels as a Standalone App, Report Says
Meta says most graphic or disturbing imagery is removed and replaced with a warning label that users must click through before viewing the image. Meta says some content is also filtered out for users under the age of 18. The company says it develops its policies on violent and graphic imagery with the help of international experts and that refining those policies is an ongoing process.
Users posted on social media and on message boards, including Reddit, about some of the unwanted imagery they saw on Instagram because of the glitch. It included shootings, beheadings, people being struck by vehicles and other violent acts.
Brooke Erin Duffy, a social media researcher and associate professor at Cornell University, said she is not convinced by Meta's claim that the violent content issue was unrelated to its policy changes.
"Moderation systems – whether powered by AI or human labor – are never fail-safe," Duffy told CNET. "And while many speculated that Meta's moderation overhaul, announced last month, would create heightened risks and harms, yesterday's 'glitch' provided direct evidence of the costs of a less-moderated platform."
Duffy added that although content moderation is difficult for social media platforms, "platforms' moderation guidelines have served as safety mechanisms for users, especially those in marginalized communities. Meta's replacement of its existing system with the 'community notes' feature is a step backward in terms of user protection."