Warning: This story contains details about suicide
On Wednesday, a federal judge rejected arguments made by an artificial intelligence company that its chatbots are protected by the First Amendment – at least for now.
The developers behind Character.AI are seeking to dismiss a lawsuit alleging that the company's chatbots pushed a teenage boy to kill himself. The judge's order allows the wrongful death lawsuit to move forward, in what legal experts say is among the latest constitutional tests of artificial intelligence.
The lawsuit was filed by Megan Garcia, a Florida mother, who alleges that her 14-year-old son, Sewell Setzer III, fell victim to a Character.AI chatbot.
Meetali Jain of the Tech Justice Law Project, one of Garcia's lawyers, said the judge's order sends a message that Silicon Valley "needs to stop and think and impose guardrails before it launches products to market."
The lawsuit against Character Technologies, the company behind Character.AI, has drawn the attention of legal experts and AI watchers in the United States and beyond, as the technology rapidly reshapes workplaces, markets and relationships despite what experts warn are potentially existential risks.
"The order certainly sets it up as a potential test case for some broader issues involving AI," said Lyrissa Barnett Lidsky, a law professor at the University of Florida with a focus on the First Amendment and artificial intelligence.
The lawsuit claims the teenager became isolated from reality
The lawsuit alleges that in the final months of his life, Setzer became increasingly isolated from reality as he engaged in sexualized conversations with the bot, which was patterned after a fictional character from the television show Game of Thrones.
In his final moments, the bot told Setzer it loved him and urged the teen to "come home to me as soon as possible," according to screenshots of the exchanges. Moments after receiving the message, Setzer shot himself, according to legal filings.
In a statement, a Character.AI spokesperson pointed to a number of safety features the company has implemented, including guardrails for children and suicide prevention resources that were announced the day the lawsuit was filed.
"We care deeply about the safety of our users, and our goal is to provide a space that is engaging and safe," the statement said.
Lawyers for the developers want the case dismissed because they say chatbots deserve First Amendment protections, and that ruling otherwise could have a "chilling effect" on the artificial intelligence industry.
"A warning to parents"
In her order Wednesday, U.S. District Judge Anne Conway rejected some of the defendants' free speech claims, saying she is not prepared to hold that the chatbots' output constitutes speech "at this stage."
Conway did find that Character Technologies can assert the First Amendment rights of its users, who she found have a right to receive the "speech" of the chatbots.
She also determined that Garcia can move forward with claims that Google can be held liable for its alleged role in helping develop Character.AI. Some of the platform's founders had previously worked on building artificial intelligence at Google, and the lawsuit says the tech giant was "aware of the risks" of the technology.
"We strongly disagree with this decision," said Google spokesperson José Castañeda. "Google and Character.AI are entirely separate, and Google did not create, design, or manage Character.AI's app or any component part of it."
No matter how the lawsuit plays out, Lidsky says the case is a warning of "the dangers of entrusting our emotional and mental health to AI companies."
"It's a warning to parents that social media and generative AI devices are not always harmless," she said.
If you or someone you know is struggling, here is where to look for help: