The Federal Trade Commission has launched an inquiry into AI chatbots that act as companions, from seven companies including Alphabet, Meta and OpenAI. The investigation includes examining how the companies test, monitor and measure potential harm to children and teenagers.
A Common Sense Media survey of 1,060 teenagers in April and May found that more than 70% had used AI companions, and more than half use them regularly, several times a month or more.
Experts have warned for some time that exposure to chatbots may be harmful to young people. One study found that ChatGPT gave bad advice to teenagers, such as how to hide an eating disorder or how to personalize a suicide note. In some cases, chatbots ignored comments that should have been recognized as signs of distress, brushing past them to continue the earlier conversation. Psychologists are calling for guardrails to protect youth, such as reminders that a chatbot is not a human being, and for educators to make AI literacy a priority in schools.
It is not just children and teenagers, though. Many adults have suffered negative consequences from relying on chatbots, whether for companionship, advice or as a search engine for facts and reliable sources. Chatbots often say what they think you want to hear, which can lead to flat-out falsehoods. And blindly following a chatbot's instructions is not always the right thing to do.
"As artificial intelligence technologies evolve, it is important to consider the effects chatbots can have on children," Andrew N. Ferguson, chairman of the FTC, said in a statement. "The study we are launching today will help us better understand how AI companies develop their products and the steps they take to protect children."
A Character.AI spokesperson told CNET that every chat carries a prominent disclaimer reminding users that everything said should be treated as fiction.
"In the past year, we have rolled out many substantive safety features, including an entirely new under-18 experience and a Parental Insights feature," the spokesperson said.
Snap, the company behind the Snapchat social network, said it has taken steps to reduce risk. "Since introducing My AI, Snap has applied rigorous safety and privacy processes to create a product that is not only useful to our community, but also transparent and clear about its capabilities and limitations," a spokesperson said.
Meta declined to comment. Neither the FTC nor any of the four remaining companies immediately responded to our request for comment.
The FTC has issued orders and is seeking a teleconference with the seven companies about the timing and format of their submissions no later than September 25. The companies under investigation include the makers of some of the world's biggest chatbot tools and popular social networks that incorporate AI:
- Alphabet (the parent company of Google)
- Character Technologies
- Instagram
- Meta Platforms
- OpenAI
- Snap
- xAI
Starting late last year, some of these companies have updated or strengthened their protections for younger users. Character.AI began limiting how its chatbots respond to users under 18 and added parental controls. Instagram rolled out Teen Accounts last year for all users under 18 and recently placed limits on the topics teens can discuss with chatbots.
The FTC is seeking information from the seven companies on how they:
- Monetize user engagement
- Process user input and generate outputs in response to user queries
- Develop and approve characters
- Measure, test and monitor negative effects before and after deployment
- Mitigate negative effects, particularly to children
- Use disclosures, advertising and other representations to inform users and parents about features, capabilities, intended audiences, potential negative effects, and data collection and handling practices
- Monitor and enforce compliance with company rules and terms of service (for example, community guidelines and age restrictions), and
- Use or share personal information obtained through users' conversations with the chatbots