Even if chatbots succeed in passing the Turing test, they will have to give up the game if they operate in California. A new bill proposed by California State Senator Steve Padilla would require chatbots that interact with children to issue occasional reminders that they are, in fact, a machine and not a real person.
The bill, SB 243, was introduced as part of an effort to regulate the safeguards that companies operating chatbots must put in place to protect children. Among the requirements the bill would establish: it would bar companies from "providing rewards" to users to increase engagement or usage, require companies to report to the state's Department of Health Care Services how often minors exhibit signs of suicidal ideation, and mandate periodic reminders that chatbots are AI-generated and not human.
That last requirement is particularly relevant to the current moment, as children have proven vulnerable to these systems. Last year, a 14-year-old tragically took his own life after developing an emotional connection with a chatbot accessible through Character.AI, a service for creating chatbots modeled after various pop culture figures. The child's parents have filed a lawsuit against Character.AI over the death, accusing the platform of being "unreasonably dangerous" and lacking sufficient safety guardrails despite being marketed to children.
Researchers at Cambridge University have found that children are more likely than adults to view AI chatbots as trustworthy, even seeing them as almost human. That can put children at significant risk when chatbots respond to their prompts without any kind of protection in place. It's how, for example, researchers were able to get Snapchat's built-in AI to provide instructions to a hypothetical 13-year-old user on how to lie to her parents to meet up with a 30-year-old man and lose her virginity.
There are potential benefits for children who feel free to share their feelings with a bot if it lets them express themselves somewhere they feel safe. But the risk of isolation is real. A small reminder that there is no person on the other side of the conversation could be useful, and intervening in the addiction cycle, which tech platforms have become very skilled at using to hook children through repeated dopamine hits, is a good starting point. The failure to provide these kinds of interventions when social media began to take hold is part of how we got here in the first place.
But these protections won't address the root problems that lead children to seek out chatbots in the first place. There is a severe shortage of resources that facilitate real-world relationships for children. Classrooms are overcrowded and underfunded, after-school programs are in decline, "third places" continue to disappear, and there is a shortage of child psychologists to help kids process everything they're dealing with. It's good to remind children that chatbots aren't real, but it would be better to put them in situations where they don't feel they need to talk to bots in the first place.