
Can You Spot the Fake? Most People Can’t — And That’s a Problem

In an age of misinformation, your instincts aren’t enough.

By JPN · Published 8 months ago · 4 min read
Photo by Serhii Tyaglovsky on Unsplash

We’re living in a world where your eyes can’t be trusted. That video of a politician saying something wild? It might be AI. The voice message from your “mum” asking for help? Could be synthetic. Deepfakes aren’t just movie magic anymore — they’re in your social feeds, your inbox, even your memories. And the scariest part? Over half of us can’t tell what’s real from what’s fake.

Let’s talk about why that matters — and what we’re supposed to do about it.

How the Hell Did We Get Here?

Deepfakes started as a research experiment. Now? They’re a cultural earthquake.

Behind the scenes, it’s all powered by something called GANs (Generative Adversarial Networks). Basically, one network, the generator, tries to fake it, while the other, the discriminator, tries to catch it. They keep battling until the fake is so good even machines get confused. Today’s deepfakes can mimic dozens of facial-muscle movements and hundreds of vocal quirks. And with tools like StyleGAN3, they don’t just copy your face; they become your face, lighting and all.
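To make that cat-and-mouse loop concrete, here’s a minimal GAN sketch in PyTorch. It learns a toy one-dimensional distribution rather than faces, and all the sizes and names are our own illustrative choices, not anything from a production deepfake tool:

```python
# Minimal GAN sketch: a generator and a discriminator duelling over a toy
# 1-D distribution. Real deepfake models (StyleGAN etc.) are vastly larger,
# but the adversarial training loop is the same idea.
import torch
import torch.nn as nn

latent_dim = 8

# Generator: turns random noise into a fake sample.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
# Discriminator: outputs a probability that a sample is real.
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 2.0      # "real" data: N(2.0, 0.5)
    fake = G(torch.randn(64, latent_dim))      # generator's current forgeries

    # The discriminator tries to catch the fakes...
    opt_d.zero_grad()
    d_loss = (bce(D(real), torch.ones(64, 1))
              + bce(D(fake.detach()), torch.zeros(64, 1)))
    d_loss.backward()
    opt_d.step()

    # ...while the generator tries to get its fakes labelled "real".
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()
```

Run it long enough and the generator’s samples become statistically indistinguishable from the “real” data, which is exactly the property that makes face-scale versions so hard to catch.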

Worse? You don’t need a supercomputer anymore. Apps like DeepFaceLab and Reface make it drag-and-drop easy. Millions of people are swapping faces for fun, but the line between “funny” and “fraud” is getting paper-thin.

It’s Not All Bad — But It’s Not Good Either

🎬 The Cool Stuff

- Movies use deepfakes to bring back younger versions of actors (The Irishman, anyone?).

- Surgeons train on synthetic patients that bleed, blink, and respond — way more effective than plastic mannequins.

- ALS patients are getting their voices back, using less than 10 minutes of old recordings.

- Brands are seeing better engagement from AI-generated ambassadors who speak 89 languages and don’t ask for days off.

💣 The Scary Shit

- Deepfakes have messed with 38 elections in just the past two years.

- Some scammers create fake nudes of teens and blackmail them for crypto. Yep. That’s real.

- Studios have tried paying background actors a one-off fee to own their likeness forever, at roughly 98% less than what they’d pay a real person.

Your rep? It can be destroyed in one click. Deepfakes cost people billions in lawsuits, lost jobs, and reputational hell.

The Trust Problem

The real damage isn’t just what we see. It’s what we believe.

Nine out of ten people in the UK say they’re worried about deepfakes messing with the truth. Journalists are already under fire — a fake BBC report about the royals wiped £12 billion off the markets before anyone could fact-check it.

Students are using deepfakes to fake evidence in academic disputes. Teachers? Many say they can’t tell if an assignment was written or synthesised. And conspiracy theories? They’re thriving. When people see something fake that confirms their beliefs, they don’t question it — they double down.

So Who’s in Charge Here?

Lawmakers are scrambling to catch up. Here’s the rundown:

Australia: 7 years in prison for non-consensual deepfake porn. About time.

EU: If it’s synthetic, it needs a watermark. Simple, clear, and enforceable, at least on paper (a toy sketch of the watermark idea appears below).

California: Political ads need to disclose AI manipulation… unless they’re hosted overseas, which 83% now are.

Meanwhile, platforms aren’t stepping up. Only 12% use deepfake detection properly. Most just hope we don’t notice.
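Since the EU rule above hinges on what a “watermark” actually is, here’s a toy sketch of one naive approach: hiding a SYNTHETIC flag in an image’s least-significant bits with NumPy. This is purely illustrative; real provenance schemes (C2PA-style signed metadata, robust perceptual watermarks) are built to survive compression and editing, which this would not:

```python
# Toy least-significant-bit (LSB) watermark: hide a text flag in the lowest
# bit of the first pixels. Illustrative only; trivially destroyed by
# re-encoding, which is why real standards use far sturdier schemes.
import numpy as np

def embed_flag(image: np.ndarray, flag: str = "SYNTHETIC") -> np.ndarray:
    bits = np.array([int(b) for byte in flag.encode()
                     for b in format(byte, "08b")], dtype=np.uint8)
    marked = image.copy().ravel()
    marked[:bits.size] = (marked[:bits.size] & 0xFE) | bits  # overwrite LSBs
    return marked.reshape(image.shape)

def read_flag(image: np.ndarray, length: int = 9) -> str:
    bits = image.ravel()[:length * 8] & 1
    return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, length * 8, 8)).decode(errors="replace")

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in image
print(read_flag(embed_flag(img)))   # -> SYNTHETIC
```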

Can We Even Spot a Fake Anymore?

AI detectors are trying. They scan for subtle stuff like how your pupils react to light or how your skin pulses with your heartbeat. Sounds impressive, right? Still, the best detectors only catch around 60% of modern deepfakes. Compress the video for Instagram or TikTok, and that drops to 30%.
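For a flavour of the heartbeat check, here’s a bare-bones sketch of remote photoplethysmography (rPPG) with OpenCV and NumPy. The video path, the fixed face crop, and the decision cutoff are all placeholders we invented; a real detector would track the face and use far more robust signal processing:

```python
# rPPG sketch: real skin shifts colour slightly with each heartbeat, and many
# deepfakes lack a plausible pulse. Average the green channel of a face crop
# per frame, then look for a peak in the human heart-rate band.
import cv2
import numpy as np

cap = cv2.VideoCapture("face_video.mp4")        # placeholder input file
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
samples = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cheek = frame[100:200, 100:200, 1]          # green channel, fixed crop
    samples.append(cheek.mean())
cap.release()

sig = np.asarray(samples) - np.mean(samples)
freqs = np.fft.rfftfreq(sig.size, d=1.0 / fps)
power = np.abs(np.fft.rfft(sig)) ** 2

band = (freqs > 0.7) & (freqs < 3.0)            # ~42-180 beats per minute
pulse_ratio = power[band].max() / (power[1:].mean() + 1e-9)
print("possible fake (no clear pulse)" if pulse_ratio < 5.0  # arbitrary cutoff
      else "plausible pulse found")
```

Compression is exactly what kills cues like this: once Instagram or TikTok re-encodes the video, that faint colour signal is often gone, which is why detection rates collapse.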

People? We’re worse. On average, we catch deepfake videos just 1 in 4 times. And if it confirms something we already think is true, we’re way more likely to believe it — even if it’s fake.

Where’s the Line Between Creativity and Creepy?

This is where things get murky.

On one hand, you’ve got deepfake films winning awards (Synthetic Symphony had Beethoven collabing with Mozart — wild). On the other, artists are losing income because AI clones are undercutting them. And in porn? 92% of deepfakes target women, often using private content without consent. That’s not innovation. That’s abuse.

Consent laws are coming — slowly. The 2025 Digital Likeness Protection Act sounds good, but loopholes let “parodies” slide through. And nobody’s regulating how apps scoop up your biometric data. You clicked “yes” to terms once. Now they own your face.

What’s Coming Next? Buckle Up.

- Holographic deepfakes — they’ll be in your room, not just on your screen.

- Real-time face swaps on video calls — good luck trusting Zoom ever again.

- Decentralised, untraceable deepfakes on the dark web — already 47% of marketplaces offer them.

Experts are calling for global standards, digital watermarks, and even AI education starting in primary school. Some are pushing for ethical boards to review synthetic media, like a kind of AI FDA.

Most of us? We just want someone to fix it without killing creativity or freedom of speech.

So, What Do We Do Now?

We’re staring down the barrel of a world where seeing isn’t believing. The deepfake dilemma isn’t just about tech. It’s about truth. About trust. About whether society can hold together when reality itself is up for grabs.

So here’s the deal:

If you create AI content, own it. Be transparent.

If you see something shocking, question it before you share it.

Push for laws that make sense, and platforms that actually give a damn.

And above all — stay curious, stay critical, and stay human.

Because if we lose our grip on what’s real, we lose a lot more than just credibility.

We lose the truth itself.


About the Creator

JPN

I write what most people think but never say: everything and nothing, life’s weird detours and deep dives. No niche, just raw takes; sometimes real stories, sometimes nothing much at all, but always an honest take.
