For years, so-called “nudify” apps and websites have spread online, allowing people to create nonconsensual and abusive images of women and girls, including child sexual abuse material. Although some lawmakers and technology companies have taken steps to curb the harmful services, millions of people are still reaching the websites, and their creators may be making millions of dollars every year.
An analysis of 85 nudify and “undress” websites—which let people upload photos and use artificial intelligence to generate “nude” images of the subjects with just a few clicks—found that most of the sites rely on Google, Amazon, and Cloudflare to operate and stay online. The findings, published by Indicator, an outlet that investigates digital deception, say the websites averaged a combined 18.5 million visitors in each of the past six months and may be earning up to $36 million per year.
Alexios Mantzarlis, a cofounder of Indicator and an online safety researcher, says the murky nudify ecosystem has become a “lucrative business” enabled by Silicon Valley’s laissez-faire approach. “They should have stopped providing any and all services to AI nudifiers when it was clear that their only use case is sexual harassment,” Mantzarlis says of the technology companies. Creating or sharing explicit deepfakes has become increasingly illegal.
According to the research, Amazon and Cloudflare provide hosting or content delivery services for 62 of the 85 sites, while Google’s sign-on system was used on 54 of them. The nudify sites also rely on a range of other services, such as payment systems, provided by mainstream companies.
Amazon Web Services spokesperson Ryan Walsh says AWS has clear terms of service that require customers to follow “applicable” laws. “When we receive reports of potential violations of our terms, we act quickly to review and take steps to disable prohibited content,” Walsh says, adding that people can report problems to its safety teams.
“Some of these sites violate our terms, and our teams are taking action to address these violations, in addition to working on longer-term solutions,” says Google spokesperson Karl Ryan, noting that Google’s sign-in system requires developers to agree to its policies, which prohibit illegal content and content that harasses others.
Cloudflare had not responded to WIRED’s request for comment at the time of writing. WIRED is not naming the nudifier websites in this story, so as not to give them further exposure.
Nudify and undress websites and bots have proliferated since 2019, having spawned from the tools and processes used to create the first explicit deepfakes. Networks of interconnected companies, as Bellingcat has reported, have appeared online offering the technology and making money from the systems.
Broadly, the services use AI to transform photos into nonconsensual explicit images; they often make money by selling “credits” or subscriptions that can be used to generate the pictures. They have been supercharged by the wave of generative AI image tools that have appeared in the past few years. Their output is hugely damaging. Photos have been stolen from social media and used to create abusive images. At the same time, in a new form of cyberbullying and abuse, teenagers around the world have created images of their classmates. Such intimate image abuse is harrowing for victims, and the images can be difficult to scrub from the web.