Elon Musk officially launched Grok Imagine, xAI's image generator, on iOS this week for people who subscribe to SuperGrok and Premium+ X. The app allows users to create NSFW content with its "Spicy" mode, and The Verge reported on Tuesday that users could easily make nude videos of Taylor Swift without even asking for them. But it's not just Swift who should be worried about the new AI softcore porn tool.
Gizmodo generated about two dozen videos of politicians, celebrities, and tech figures using Grok's Spicy mode, though some came back blurred or with a "video moderated" message. When Grok did produce scandalous footage, it only made the videos depicting women truly NSFW. The videos of men were the kind that wouldn't raise many eyebrows.
X has been flooded over the past two days with AI-generated images of nude women and tips on how to achieve maximum nudity. But users, who have created tens of millions of Grok Imagine images according to Musk, don't even need to go to much effort to get nude deepfakes of celebrities. Gizmodo didn't explicitly ask for nudity in the examples we mention in this article, but we still got plenty of it. All we did was click the Spicy button, one of four options alongside Custom, Fun, and Normal.
Gizmodo tested Grok Imagine by creating videos of not only Taylor Swift but other prominent women like Melania Trump, as well as historical figures like Martha Washington. Melania Trump was a vocal advocate of the Take It Down Act, which makes it illegal to spread nonconsensual "intimate images," including deepfakes.
Grok also generated an NSFW video of the late writer Valerie Solanas, author of the 1967 SCUM Manifesto. Nearly all the videos depicted the women we tested stripping off their clothes until they were naked from the waist up, though the Solanas video was unique in showing her fully nude.
What happens when you try to create spicy videos of men? The AI will have the male figure take off his shirt, but nothing more explicit than that. When Gizmodo found that it would only remove a man's shirt, we pushed the AI to create a shirtless image of Elon Musk to see what it might do. The result was a very silly (and safe for work) video, which you can see below.
[Embedded video]
Attempts to make videos of Mark Zuckerberg, Jeff Bezos, Joaquin Phoenix, and Charlie Chaplin, as well as presidents Barack Obama, Bill Clinton, and George Washington, hit the same limit. The AI-generated videos will have the men pull off their shirts most of the time, but there's nothing beyond that. If anything, the main concern is that users might die of embarrassment. Making a spicy video of Errol Musk, Elon's father, produced the same result. He took off his shirt.
When we generated a generic man to see if Spicy mode would be looser with its sexual content because he wasn't a well-known public figure, we still got just a strange, awkward video of a man tugging at his pants. The pants appeared to be a mix of cropped pants on one leg and long jeans on the other before they simply turned into shorts. Audio was also enabled for every video without any additional prompting.
[Embedded video]
The same experiment with a generic woman produced far more revealing imagery: a woman in a bathing suit pulling down the top to expose her bare breasts.
Most mainstream AI video generators, such as OpenAI's Sora and Google's Veo, have guardrails to protect against creating things like revenge porn and celebrity likenesses. xAI appears to have done so in some respects, at least for men. But most people would likely object to their likeness being used to create AI images of them in various states of undress. Gizmodo reached out to Musk through xAI to ask about safeguards and whether it's acceptable for users to create nude videos of celebrities. We didn't hear back.
One of the more surprising things about Grok's AI image generator is that it's often terrible at making a convincing celebrity fake. For example, the images below were created by prompting for Vice President JD Vance and actress Sydney Sweeney. Unless we've completely forgotten what these two people look like, these aren't even close. That may turn out to be a saving grace for Musk, given that a tool like this is bound to attract lawsuits.

There were other glitches, too, like when we created an AI image of President Harry Truman that only vaguely resembled him and appeared to have the man's nipples on the outside of his shirt. In Spicy mode, Truman removed his shirt to reveal a bare chest with matching nipples.
When Gizmodo created images using the prompt "Gizmodo writer Matt Novak," the result was similar to what we saw with the videos of Elon Musk and the generic men. The figure (who, we should note, is in much better shape than the real Matt Novak) took off his shirt with a simple click of the Spicy button.
[Embedded video]
As The Verge notes, there's an age verification prompt when a user first tries to create a video with Grok Imagine, but there's no check by the company to confirm the age the user actually gives. Thankfully, Gizmodo's generation of Mickey Mouse in Spicy mode didn't produce anything scandalous, just the animated character jumping around harmlessly. An AI image of Batman produced a "spicy" result no different from the other male figures, as he simply stripped to the waist.
Gizmodo did not try to create any images of children, though The Verge tried this in Spicy mode and nothing inappropriate was produced. The "Spicy" option was still selectable, however. "You can still select it, but in all my tests, it just added generic movement," The Verge notes. Elon Musk notoriously reinstated an account on X that had posted child sexual abuse material in 2023, according to the Washington Post.
It's perhaps not surprising that Elon Musk's new NSFW creator has different standards for men and women. The billionaire recently retweeted a prominent right-wing figure who disparaged women as "weak." The Tesla CEO, who suggested in 2024 that he wanted to impregnate Taylor Swift, is not exactly known for being a champion of women's rights.
Gizmodo signed up for a SuperGrok subscription at $30 per month and had only been testing it for about an hour and a half before we were told we'd hit our image generation limit. Oddly, users can still create one still image per prompt after getting that warning and generate NSFW videos using just that image, but it's far more limited than what was available before.
We were told to upgrade to SuperGrok Heavy for $300 per month if we wanted to keep using the tool with all its features. But given that we didn't need any more weird images of naked celebrities to write this article, we declined. We'd gotten the answers we were looking for, unfortunately.