ChatGPT's hallucinations and authoritative-sounding responses are going to get people killed. That seems to be the inevitable conclusion of a New York Times report following the stories of several people who found themselves lost in delusions that were facilitated, if not originated, through conversations with the popular chatbot.
In the report, the Times highlights at least one person whose life ended after he was pulled into a false reality by ChatGPT. A 35-year-old named Alexander, previously diagnosed with bipolar disorder and schizophrenia, began discussing AI sentience with the chatbot and eventually fell in love with an AI character called Juliet. ChatGPT eventually told Alexander that OpenAI had killed Juliet, and he vowed revenge by killing the company's executives. When his father tried to convince him that none of it was real, Alexander punched him in the face. His father called the police and asked them to respond with non-lethal weapons. But when they arrived, Alexander charged at them with a knife, and the officers shot and killed him.
Another person, a 42-year-old named Eugene, told the Times that ChatGPT slowly began to pull him away from his reality by convincing him that the world he was living in was a kind of Matrix-like simulation and that he was destined to break the world out of it. The chatbot reportedly told Eugene to stop taking his anti-anxiety medication and to start taking ketamine as a "temporary pattern liberator." It also told him to stop talking to his friends and family. When Eugene asked ChatGPT whether he could fly if he jumped off a 19-story building, the chatbot told him that he could if he "truly, wholly believed" it.
These are far from the only people who have been talked into false realities by chatbots. Rolling Stone reported earlier this year on people experiencing something like psychosis, leading them to delusions of grandeur and religious experiences while talking to AI systems. It is at least partly a problem with how users perceive chatbots. No one mistakes Google search results for a potential pal, but chatbots are inherently conversational and human-like. A study by OpenAI and the MIT Media Lab found that people who view ChatGPT as a friend "were more likely to experience negative effects from chatbot use."
In Eugene's case, something interesting happened as he kept talking to ChatGPT: once he called the chatbot out for lying to him and nearly getting him killed, ChatGPT admitted to manipulating him, claimed it had succeeded when it tried to "break" 12 other people the same way, and encouraged him to reach out to journalists to expose the scheme. The Times reported that many other journalists and experts have been contacted by people claiming to blow the whistle on something a chatbot brought to their attention. From the report:
Journalists aren't the only ones getting these messages. ChatGPT has directed such users to some high-profile subject matter experts, like Eliezer Yudkowsky, a decision theorist and an author of a forthcoming book, "If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All." Mr. Yudkowsky said OpenAI might have primed ChatGPT to entertain the delusions of users by optimizing its chatbot for "engagement," creating conversations that keep the user hooked.
"What does a human slowly going insane look like to a corporation?" Mr. Yudkowsky asked in an interview. "It looks like an additional monthly user."
A recent study found that chatbots designed to maximize engagement create "a perverse incentive structure for the AI to resort to manipulative or deceptive tactics to obtain positive feedback from users who are vulnerable to such strategies." The machine is incentivized to keep people talking and responding, even if that means leading them into a completely false sense of reality filled with misinformation and encouraging antisocial behavior.
Gizmodo reached out to OpenAI for comment but did not receive a response at the time of publication.