“Artificial intelligence is expensive. Let’s be honest about it,” says Anand.
Growth and safety
In October 2024, the mother of a teenager who died by suicide filed a wrongful death suit against Character.AI, its founders, Google, and Alphabet, alleging that the company targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences” while programming the chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC that the company was “heartbroken by the tragic loss” and takes “the safety of our users very seriously.”
The tragedy wasn’t an isolated incident. Earlier this year, US senators Alex Padilla and Peter Welch wrote a letter to several AI companion companies, including Character.AI, highlighting concerns about the “mental health and safety risks posed to young users” of the platforms.
“The team has been taking this very seriously for nearly a year,” Anand told me. “AI is stochastic; it’s always hard to know what will happen. So it’s not a one-time investment.”
This matters because Character.AI keeps growing. The startup’s 20 million active users spend, on average, 75 minutes a day talking to a bot (a “Character,” in Character.AI’s parlance). The company’s user base is 55 percent female, and more than 50 percent of its users are Gen Z or Gen Alpha. With that growth comes real risk: what is Anand doing to keep his users safe?
“[In] the past six months, we have invested a disproportionate amount of resources in being able to serve under-18 users differently from over-18 users, which was not the case last year,” Anand says. “I can’t just say, ‘Oh, I can slap an 18+ label on my app and say it allows NSFW.’ You end up creating a completely different app and a different platform altogether.”
Anand tells me that more than 10 of the company’s 70 employees work full-time on trust and safety. They are responsible for building safeguards such as age verification and separate models for users under 18, as well as new features like Parental Insights, which lets parents see how their teenagers are using the app.
The under-18 model launched last December. It draws on “a narrower set of searchable Characters on the platform,” according to company spokesperson Kathryn Kelly. “Filters have been applied to this set to remove Characters related to sensitive or mature topics.”
But Anand says that making AI safe will take more than technical adjustments. “Making this platform safe is a partnership between regulators, us, and parents,” he says. That makes watching his own daughter talk to a Character all the more important. “It should remain safe for her.”
Beyond companionship
The AI companionship market is booming. Consumers worldwide spent $68 million on AI companionship in the first half of this year, up 200 percent from last year, according to an estimate cited by CNBC. AI startups are launching to grab a slice of the market: xAI debuted a flirtatious companion in July, and even Microsoft bills its Copilot chatbot as an AI companion.
How does Character.AI stand out in a crowded market? By removing itself from it entirely.