I have to tell you about Sora 2. It’s OpenAI’s new video-generating app that’s both mind-blowing and terrifying. 

It’s the first tool from any AI company that lets you type in a prompt and, within seconds, get a full-blown, AI-generated video up to a minute long. The results aren’t perfect, but they’re close. Like Hollywood close.

The lighting, the camera motion, the facial expressions: it’s all shockingly realistic.

Want to see a golden retriever surfing through Times Square in slow motion? Done. A drone shot of a city being built out of clouds? Easy. 

🎭 Dead celebrities

People are using Sora 2 to generate fake videos of dead celebrities doing things they never did. 

JFK is deepfaked into a WWE superstar. 👉 Watch it on TikTok

Tupac appears with Mr. Rogers talking about respect. 👉 Watch it on Instagram

Stephen Hawking is attacked in the UFC. Warning: I knew this was all AI, but it was still upsetting to watch. Weird, right? 👉 Watch it on X

Even Sam Altman, CEO of OpenAI, is shoplifting GPUs from Target. 👉 Watch it on Instagram

Here’s the catch: under U.S. law, defamation only applies to the living, not the dead. That means families and estates have no legal recourse when someone uses AI to humiliate or misrepresent a loved one. It’s a free-for-all right now, and no one’s accountable.

Even creepier? 

Sora is also being used for stalking and impersonation. All it takes is a photo, and you can make a video of anyone doing anything. Fake crimes, revenge content, political lies: it’s all possible. I have a warning about that, and a fix, for you in tomorrow’s newsletter.

😱 Zero guardrails

OpenAI says you need permission to use a person’s face or voice. Yeah, like that’s going to stop anyone.

If the guy who runs OpenAI can’t stop his own face from being misused, what chance do the rest of us have?

Right now, you can only get Sora 2 as an iPhone app. You’ll need an OpenAI account, and it’s still invite-only, so most people don’t have access yet.

Sora 2 is an incredible tool. But it’s being abused, and the guardrails are flimsy at best. So from now on, when a video goes viral, you better assume it’s fake until proven real.

🎏 After that, you need a smile. In every koi pond of four or more, at least one is always fake. You’ve got koi A, B, C, and then the D koi. (lol)
