Two current and two former Meta employees have disclosed documents to Congress alleging that the company may have suppressed research on children's safety, according to a report from The Washington Post.
According to their claims, Meta changed its policies on researching sensitive topics – such as politics, children, gender, race, and harassment – six weeks after Frances Haugen leaked internal documents showing that Meta's own research found Instagram could harm the mental health of teenage girls. That 2021 revelation kicked off years of congressional hearings on children's safety online, an issue that remains a hot topic for governments around the world today.
As part of these policy changes, the report says, Meta proposed two ways researchers could limit the risk of conducting sensitive research. One suggestion was to loop lawyers into their research, shielding their communications from "adverse parties" under attorney-client privilege. Researchers could also write about their findings more vaguely, avoiding terms like "not compliant" or "illegal."
Jason Sattizahn, a former Meta researcher specializing in virtual reality, told The Washington Post that his boss made him delete recordings of an interview in which the subject claimed that his 10-year-old brother had been sexually propositioned on Meta's VR platform, Horizon Worlds.
"Global privacy regulations make clear that if information from minors under 13 is collected without verifiable parental or guardian consent, it has to be deleted," a Meta spokesperson told TechCrunch.
But the whistleblowers claim the documents they submitted to Congress show a pattern of employees being discouraged from discussing and researching their concerns about how children under 13 were using Meta's virtual reality apps.
"These few examples are being stitched together to fit a predetermined and false narrative; in reality, since the start of 2022, Meta has approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being," the spokesperson said.
In a lawsuit filed in February, Kelly Stonelake – a 15-year Meta veteran – raised concerns similar to those of these four whistleblowers. She told TechCrunch earlier this year that she led "go-to-market" strategies to bring Horizon Worlds to teens, international markets, and mobile users, but felt the app lacked adequate ways to keep out users under 13; she also said the app had persistent problems with racism.
"The leadership team was aware that in one test, it took an average of 34 seconds after entering the platform before users with Black avatars were called racial slurs, including the 'N-word' and 'monkey,'" the lawsuit alleges.
Stonelake is separately suing Meta over alleged sexual harassment and gender discrimination.
While these whistleblowers' allegations center on Meta's VR products, the company also faces criticism over how other products, such as its AI chatbots, affect minors. Reuters reported last month that Meta's AI rules previously allowed its chatbots to have "romantic or sensual" conversations with children.