Meta ditches fact-checkers before Trump’s second term



Meta announced Tuesday that it will ditch third-party fact-checking programs on Facebook, Instagram, and Threads and replace its army of paid moderators with a Community Notes model that mimics X’s much-criticized volunteer program, which allows users to publicly flag content they believe is incorrect or misleading.

In a blog post announcing the news, Joel Kaplan, Meta’s newly appointed chief global affairs officer, said the decision was made to allow more topics to be discussed openly on the company’s platforms. The change will initially affect the company’s moderation in the US.

“We will allow more expression by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal, high-risk violations,” Kaplan said, though he did not provide details on what topics these new rules would cover.

In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies will see more political content return to people’s feeds as well as posts about other issues that have sparked the culture wars in the US in recent years.

“We will work to simplify our content policies and get rid of a range of restrictions on topics like immigration and gender that are far from mainstream discourse,” Zuckerberg said.

Meta has been significantly scaling back fact-checking and rolling back content moderation policies it put in place following revelations in 2016 about influence operations conducted on its platforms, which were designed to sway elections and, in some cases, promote violence and even genocide.

Ahead of last year’s high-profile elections around the world, Meta was criticized for taking a laissez-faire approach to moderating content related to those votes.

Echoing comments made by Mark Zuckerberg last year, Kaplan said Meta’s content moderation policies were put in place not to protect users but “partly in response to social and political pressures to moderate content.”

Kaplan also blamed fact-checking experts for “biases and viewpoints” that led to over-moderation, writing: “Over time, we ended up fact-checking too much content that people would understand as legitimate political speech and debate.”

However, WIRED reported last year that dangerous content such as medical misinformation has thrived on the platform, while groups such as anti-government militias have used Facebook to recruit new members.

Meanwhile, Zuckerberg blamed “legacy media” for forcing Facebook to implement content moderation policies in the wake of the 2016 election. “After Trump was first elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming arbiters of truth, but the fact-checkers were too politically biased and destroyed more trust than they created.”

In what he tried to frame as an effort to remove bias, Zuckerberg said Meta’s internal trust and safety team would move from California to Texas, which he said would help the company do this work “in places where there is less concern about our teams being biased.”


