Skill Level: Beginner
What This Guide Is About
Deepfakes aren’t science fiction anymore — they’re an active political weapon. AI-generated videos, cloned voices, and fake “leaks” are being used to manipulate elections, discredit journalists, and smear activists. This guide teaches you how to spot deepfakes, verify sources, and protect yourself from disinformation designed to divide and confuse.
Why This Matters
In 2024, AI-driven propaganda spiked across every major platform — especially around elections and war coverage. Now in 2025, deepfake tech is cheaper, faster, and politically weaponized. Governments, extremists, and troll farms all use AI media to create chaos and erode trust in democracy.
If the Resistance can’t tell truth from illusion, we risk wasting our energy fighting ghosts instead of fascists. Staying sharp is part of staying free.
How to Spot Deepfakes
- Look for blinking and breathing: Most deepfakes miss natural human rhythm — irregular blinking, no visible breathing, or mismatched lip movement are giveaways.
- Check the ears and hair: AI often glitches on detailed textures like hairlines or earrings. If it looks painted or floating, it probably is.
- Listen for the missing “AI breath”: Synthetic voices often have overly crisp consonants, unnaturally even pacing, and no audible breaths between sentences.
- Reverse-image search still works: Upload stills from a suspicious video to Google Lens or TinEye. If the image appears nowhere else, be suspicious — and if it surfaces attached to an older, unrelated event, you’re looking at recycled footage.
- Confirm context: Deepfakes often attach to real events. If someone claims “this just happened,” check time stamps, locations, and credible outlets.
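Reverse-image engines can match a still even after re-encoding or screenshotting because they compare perceptual fingerprints rather than exact bytes. As a rough illustration of the idea (not TinEye’s or Google’s actual algorithm), here is a minimal average-hash sketch over toy grayscale frames, using only the Python standard library; real tools decode actual image files, which needs a library like Pillow:

```python
# Minimal average-hash (aHash) sketch: the same family of technique
# perceptual reverse-image tools use to match near-duplicate stills.
# Toy grayscale "frames" here are plain lists of pixel rows (0-199).

def average_hash(pixels, hash_size=8):
    """Downscale to hash_size x hash_size by block averaging,
    then emit a 1 bit for each block brighter than the mean."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // hash_size, w // hash_size
    blocks = []
    for by in range(hash_size):
        for bx in range(hash_size):
            vals = [pixels[by * bh + y][bx * bw + x]
                    for y in range(bh) for x in range(bw)]
            blocks.append(sum(vals) / len(vals))
    mean = sum(blocks) / len(blocks)
    return tuple(int(v > mean) for v in blocks)

def hamming(a, b):
    """Count differing bits; a small distance means 'same image'."""
    return sum(x != y for x, y in zip(a, b))

# Two 16x16 toy frames: the second is the first with a uniform
# brightness shift, as you'd get from re-encoding the same still.
frame_a = [[(x * 13 + y * 7) % 200 for x in range(16)] for y in range(16)]
frame_b = [[p + 4 for p in row] for row in frame_a]

dist = hamming(average_hash(frame_a), average_hash(frame_b))
print(dist)  # → 0: the two frames fingerprint as the same image
```

The point for verification work: an exact-byte hash would treat these two frames as totally different files, but a perceptual hash sees through cosmetic changes — which is why a recycled or lightly edited “leak” still gets caught by reverse-image search.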
How to Defend Yourself
- Slow your share reflex. Even outrage clicks are algorithmic gold. Take one minute to verify before reposting.
- Cross-verify across platforms. Real footage leaves traces. If something only exists on one partisan account, it’s suspect.
- Follow independent verification orgs like Bellingcat, ProPublica, and Reuters Fact Check.
- Lock down your digital identity. Use multi-factor authentication and disable “voice print” features in apps — voice cloning scams are rising fast.
- Educate your circle. The best armor is community awareness. Share fact-checks in your group chats and resistance networks.
Kitty Tip of the Day
Before you believe a “bombshell” video, ask: Who benefits if I’m angry?
If the answer is “someone who profits from chaos,” step back and sharpen your claws of discernment.
Why It’s Important for the Resistance
Authoritarian regimes thrive on confusion — if you can’t tell what’s real, you stop trusting everyone. That’s the psychological goal. Every resister who can verify truth becomes a firewall in the information war. Your skepticism is a patriotic act.
Kitty’s Final Scratch
Democracy doesn’t die in darkness — it dies in distortion. We don’t need new heroes; we need better filters. Trust your gut, trust the verified, and keep your paws off that share button until the truth passes the sniff test.
This kitty fights fascist fakes and naps only after debunking.
Sources
- Washington Post: The potential and pitfalls of political AI
- Reuters Fact Check
- Bellingcat Investigations
- ProPublica