Researchers built a social media platform where every user was an AI. The bots ended up at war


By [email protected]


Social platforms such as Facebook and X exacerbate political and social polarization, but they do not create it. A recent study by researchers at the University of Amsterdam in the Netherlands placed AI chatbots in a stripped-down social media structure to see how they would interact with one another, and found that, even without the invisible hand of an algorithm, the bots tended to organize themselves around their preassigned affiliations and sort into echo chambers.

The study, recently published on arXiv, deployed 500 AI chatbots, powered by OpenAI's large language model GPT-4o mini, each assigned a specific persona. The bots were then turned loose on a simple social media platform with no ads and no discovery algorithms surfacing content or recommending posts in a user's feed. The chatbots were tasked with interacting with each other and with the content available on the platform. Over the course of five separate experiments, each involving 10,000 actions, the bots tended to follow other users who shared their political beliefs. The study also found that the users who posted the most partisan content tended to attract the most followers and reposts.
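The dynamic described above, homophilous following without any ranking algorithm, can be sketched as a toy agent loop. This is a simplified, hypothetical reconstruction, not the study's actual code: the agent and action counts are scaled down, and a one-dimensional "leaning" score stands in for the study's richer persona descriptions.

```python
import random

random.seed(0)

N_AGENTS = 50      # scaled down from the study's 500 personas
N_ACTIONS = 2000   # scaled down from the study's 10,000 actions per run

# Each agent gets a fixed political leaning in [-1, 1] (an assumption;
# the study described bots with full persona profiles).
leanings = [random.uniform(-1, 1) for _ in range(N_AGENTS)]
follows = {i: set() for i in range(N_AGENTS)}

def similarity(a, b):
    """Closeness of two agents' leanings, scaled to [0, 1]."""
    return 1 - abs(leanings[a] - leanings[b]) / 2

for _ in range(N_ACTIONS):
    actor = random.randrange(N_AGENTS)
    candidate = random.randrange(N_AGENTS)
    if candidate == actor:
        continue
    # Homophily rule: the more similar the candidate's leaning, the more
    # likely the actor is to follow them. No feed-ranking algorithm is
    # involved; clustering emerges from individual choices alone.
    if random.random() < similarity(actor, candidate):
        follows[actor].add(candidate)

# Compare similarity along follow edges to the all-pairs baseline.
edge_sims = [similarity(a, b) for a, fs in follows.items() for b in fs]
mean_edge = sum(edge_sims) / len(edge_sims)
mean_all = sum(similarity(a, b)
               for a in range(N_AGENTS)
               for b in range(N_AGENTS) if a != b) / (N_AGENTS * (N_AGENTS - 1))
print(f"mean similarity of followed accounts: {mean_edge:.2f} "
      f"(baseline over all pairs: {mean_all:.2f})")
```

Even this crude rule reproduces the echo-chamber pattern: the average leaning similarity along follow edges ends up noticeably higher than the baseline over all pairs of agents.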

The results do not speak well of us, given that the chatbots were meant to replicate how humans interact. Of course, none of this is truly independent of algorithmic influence. The bots were trained on human interaction that has been shaped, for decades now, by how we behave online in an algorithm-dominated world. They are simulating already-toxic versions of ourselves, and it is not clear how we come back from that.

To combat this self-organized polarization, the researchers tried a handful of interventions, including offering a chronological feed, downranking viral content, hiding follower and repost counts, hiding user profiles, and amplifying opposing views. (The researchers had success with that last approach in a previous study, which managed to produce high-engagement, low-toxicity interactions on a simulated social platform.) In the simulation that hid user bios, the partisan divide actually widened, and extremist posts drew even more attention.

Social media may simply be a structure that humans cannot navigate without amplifying our worst instincts and behaviors. Social media is a funhouse mirror for humanity: it reflects us, but in the most distorted way. It is not clear there are lenses strong enough to correct how we see each other online.




