OpenAI's new social app is filled with terrifying Sam Altman deepfakes


By [email protected]


In a video on Sora, OpenAI's new TikTok-like social media app, a farm of endless pink pigs snuffle in their pens, each equipped with a feeding trough and a smartphone screen playing vertical videos. A hyperrealistic Sam Altman stares directly into the camera, as if he were speaking straight to the viewer. "Are my piggies enjoying their slop?" the AI-generated Altman asks.

This is what it's like to use Sora, less than 24 hours after its public launch in an invite-only early access period.

In the next video on Sora's For You feed, Altman appears again. This time, he stands in a field of Pokémon, where creatures like Pikachu, Bulbasaur, and some kind of half-baked critter frolic through the grass. The OpenAI CEO looks at the camera and says, "I hope Nintendo doesn't sue us." Then come countless other fantastical yet realistic scenes, often featuring Altman himself.

He serves drinks to Pikachu and Eric Cartman at Starbucks. He shouts at a customer from behind the counter at McDonald's. He steals Nvidia GPUs from a Target and flees, only to get caught and beg the police not to take his precious technology.


Sora users, particularly those generating videos of Altman, seem well aware of how flagrantly the app can violate copyright laws. (Sora reportedly requires copyright holders to opt out of having their content used, unlike the typical approach in which rights holders must explicitly opt in, and the legality of this is up for debate.)

"This content may violate our guardrails concerning third-party likeness," the AI Altman says in one video, before bursting into hysterical laughter, as if he knows what he's saying is nonsense: the app is full of videos of Pikachu doing ASMR, Naruto ordering Krabby Patties, and Mario smoking weed.

None of this would be a problem if Sora 2 weren't so impressive, especially compared with the far less compelling Meta AI app and its new social feed (yes, Meta is also trying to make an AI TikTok, and no one wants this).


OpenAI has fine-tuned its video generator to obey the laws of physics fairly well, making its outputs more realistic. But the more realistic these videos become, the easier it is for this synthetic content to spread across the web, where it can become a vector for misinformation, bullying, and other nefarious uses.

Beyond the algorithmic feed and profiles, Sora's defining feature is that it is essentially a deepfake generator, which is how we got so many Altman videos. In the app, you can create what OpenAI calls a "cameo" of yourself by uploading biometric data. When you first join the app, you are immediately prompted to create your optional cameo through a quick process in which you record yourself reading out some numbers, then turning your head from side to side.

Each Sora user can control who is allowed to make videos using their cameo. You can adjust this setting among four options: "only me," "people I approve," "mutuals," and "everyone."

Altman has made his cameo available to everyone, which is why Sora's feed has been flooded with videos of Pikachu and SpongeBob begging Altman to stop training AI on them.

This appears to be a deliberate move on Altman's part, perhaps as a way of signaling that he doesn't think his product is dangerous. But users are already taking advantage of Altman's cameo to question the ethics of the app itself.


After watching enough videos of Sam Altman ladling GPUs into people's bowls at soup kitchens, I decided to test the cameo feature on myself. It is generally unwise to upload your biometric data to a social app, or to any app for that matter. But I defied my better judgment in the name of journalism and, if I'm honest, a bit of morbid curiosity. Do not follow my lead.

My first attempt at making a cameo didn't work; a pop-up told me that my upload violated the app's guidelines. I thought I had followed the instructions closely, so I tried again, only to hit the same pop-up. Then I realized the problem: I was wearing a tank top, and perhaps my bare shoulders were too risqué for the app's liking. It is actually a reasonable safety feature, designed to prevent inappropriate content, even though I was, in fact, fully clothed. So I changed into a T-shirt, tried again, and, against my better judgment, created my cameo.

For my first deepfake of myself, I decided to make a video of something I would never do in real life. I asked Sora to generate a clip in which I declared my undying love for the New York Mets.

The prompt was rejected, perhaps because I named a specific franchise, so I instead asked Sora to create a video of me talking about baseball.

"I grew up in Philadelphia, so the Phillies are basically my summer soundtrack," my deepfake said, speaking in a voice unlike mine, but in a bedroom that looked exactly like mine.

I never told Sora that I'm a Phillies fan. But the Sora app can use your IP address and your ChatGPT history to tailor its responses, so it made an educated guess, since I recorded the video in Philadelphia. At least OpenAI doesn't know that I'm not actually from the Philadelphia area.

When I shared the video and explained all this on TikTok, one commenter wrote, "Every day I wake up to new horrors beyond my comprehension."

OpenAI already has a safety problem. The company faces concerns that ChatGPT is contributing to mental health crises, and it is facing a lawsuit from a family who claims that ChatGPT gave their now-deceased son instructions on how to kill himself. In its Sora launch post, OpenAI emphasizes its supposed commitment to safety, highlighting parental controls and the way users can control who makes videos with their cameo, as if the company weren't primarily responsible for handing people a free, easy-to-use tool for creating extremely realistic deepfakes of themselves and their friends. When you scroll through the Sora feed, you sometimes see a screen asking, "How does using Sora affect your mood?" This is how OpenAI does "safety."

Indeed, users are already finding their way around Sora's guardrails, which is inevitable for any AI product. The app doesn't let you generate videos of real people without their permission, but when it comes to deceased historical figures, Sora is looser with its rules. No one believes a video of Abraham Lincoln riding a Waymo is real, given that it would be impossible without a time machine; but then you see a realistic John F. Kennedy say, "Ask not what your country can do for you, but how much money your country owes you." It's harmless in a vacuum, but it's a harbinger of what's to come.

Political deepfakes are nothing new. Even President Donald Trump posts deepfakes on social media (just this week, he shared a racist deepfake video of Democratic congressional leaders Chuck Schumer and Hakeem Jeffries). But when a tool like Sora opens to the public, these capabilities land in everyone's hands, and we are primed for disaster.




