This trend makes it even harder to know when a video is AI-generated

It’s scary how realistic AI-generated videos are becoming. What’s even scarier, however, is how accessible the tools to create these videos are. Using something like OpenAI’s Sora app, people can create hyper-realistic short-form videos of just about anything they want, including real people, like celebrities, friends, or even themselves.

OpenAI knows the risks of an application that makes it easy to generate realistic videos. As such, the company places a watermark on any video you create through the Sora app. This way, when you’re scrolling through your social media feeds and you see a little Sora logo – a cute cloud with bouncing eyes – you know the clip is AI-generated.

You Can’t Trust a Sora Watermark

My immediate concern when OpenAI announced this app was that people would find a way to remove the watermark, thus causing confusion on the internet. I wasn’t wrong: there are already plenty of options for interested parties who want to make their AI videos look even more real. But what I didn’t expect was the opposite: people who want to add a Sora watermark to real videos, to make them look like they were created with AI.

I was recently scrolling – or maybe doomscrolling – when I came across a video carrying a Sora watermark. But the video is simply taken from one of Apple’s pre-recorded WWDC events – one where Federighi does parkour around Apple headquarters.

Later I saw this clip, which also uses a Sora watermark. At first glance, you might think this is an OpenAI product. But look closer, and you can see that the clip uses real people: the shots are too perfect, without the blur or glitches you tend to see with AI video generation. This clip simply spoofs the way Sora tends to generate multi-shot clips of people talking. (Astute viewers may also notice that the watermark is a bit larger and more static than the real Sora watermark.)

It turns out that the account that posted this second clip also created a tool to add a Sora watermark to any video. They don’t explain the thinking or purpose behind the tool, but it’s definitely real. And even if this tool didn’t exist, I’m sure it wouldn’t be too difficult to edit a Sora watermark in a video, especially if you weren’t concerned with replicating the movement of the official Sora watermark.
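To illustrate just how low the bar is, here’s a rough sketch of that kind of edit using ffmpeg. Everything here is a placeholder: the “video” is a generated test pattern and the “watermark” is a plain white box, not an actual Sora logo – the point is only that overlaying a static mark takes one filter.

```shell
# Generate a 3-second test clip (stand-in for a "real" video).
ffmpeg -f lavfi -i testsrc=duration=3:size=640x360:rate=30 -y input.mp4

# Generate a small white box as a stand-in "watermark" image.
ffmpeg -f lavfi -i color=c=white:s=120x40 -frames:v 1 -y logo.png

# Overlay the logo in the bottom-right corner, 20 pixels from the edges.
ffmpeg -i input.mp4 -i logo.png \
  -filter_complex "overlay=W-w-20:H-h-20" -y output.mp4
```

A static overlay like this wouldn’t replicate the way the real Sora watermark moves around the frame, which is exactly the kind of detail a careless faker might skip.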

To be clear, people were already posting like this before the watermark tool existed. The joke is to say you made something with Sora, but instead post a popular or infamous video – say, Drake’s Sprite commercial from 15 years ago, Taylor Swift dancing at the Eras Tour, or an entire movie, like the Sonic the Hedgehog movie. It’s a funny meme, especially when it’s obvious that the video was not generated by Sora.

Real or Not Real?

But it’s an important reminder to constantly be vigilant when scrolling through videos on your feeds. You should be on the lookout for clips that aren’t real, as well as clips that are actually real, but are advertised as AI-generated. There are many implications here. Sure, it’s fun to put a Sora watermark on a viral video, but what happens when someone adds the watermark to a real video of illegal activity? “Oh, this video isn’t real. All the videos you see without a watermark have been doctored.”

At the moment, it seems that no one has figured out how to perfectly reproduce the Sora watermark, so there will be signs if someone is actually trying to pass off a real video as AI. But it’s all still a bit worrying, and I don’t know what the solution could be. Perhaps we are heading toward a future in which internet videos are simply considered unreliable across the board. If you can’t tell what’s real or fake, why try?
