
The Big Lie’s Perpetual Life: Part I of III

Why Massive Falsehoods Still Win—and How to Stop Them

by Dan J. Harkey


Summary

Most people assume the truth has a built-in advantage. If something is false—wildly, obviously false—surely it will collapse under its own weight. And yet the opposite happens with disturbing frequency: the bigger the claim, the faster it travels.

A century ago, Adolf Hitler described this cynical insight in Mein Kampf, arguing that “the great masses… more easily fall a victim to a big lie than to a little one.” That observation is not valuable because of its author—whose ideology produced catastrophic human suffering—but because it reveals a recurring vulnerability in public life: the human mind’s tendency to confuse magnitude, repetition, and emotional resonance with credibility.

The “big lie” is a deliberate propaganda technique, not an innocent mistake: the falsehood is engineered for effect, not stumbled into.

The “big lie” is commonly described as a propaganda device built on an extreme falsehood, repeated until challenging it feels pointless.

It relies on a simple assumption: if a claim is enormous, many people will presume “there must be something to it,” because inventing something so audacious feels socially unimaginable.

The sheer scale of a big lie creates a powerful illusion of evidence, relieving the audience of any felt responsibility to question claims that seem too colossal to be fabricated—even when they lack any basis.

This is why the big lie doesn’t need subtlety.  It needs volume, clarity, and a villain—a target blamed for complex problems the audience is already anxious about.

A big lie doesn’t persuade by proof; it persuades by making disbelief feel naïve.

What Nazi Propaganda Teaches—If We Let It

History shows what happens when the big lie is paired with state power and media control.  The U.S. Holocaust Memorial Museum documents how Nazi propaganda helped the regime win support, then facilitate persecution, war, and ultimately genocide.

It also notes that Nazi messaging often used stereotypes already familiar to the audience—existing prejudices that propaganda didn’t invent so much as weaponize and amplify.

Once in power, the Nazis established a propaganda ministry to shape public opinion through radio, film, education, books, and the press.

And they repeatedly used propaganda to create social permission—an atmosphere where escalating measures against targeted groups could be rationalized or ignored.

Takeaway: Propaganda is most dangerous when it normalizes cruelty and makes dissent feel lonely.

The Museum’s research also describes “deceiving the public” as a recurring tactic—portraying Germany as a victim and disguising political aims to justify violence.

That pattern—victimhood narratives paired with scapegoating—remains a typical architecture of mass persuasion because it converts fear into unity and complexity into certainty.

When a movement convinces people they are under attack, almost any act can be sold as “self-defense.”

Psychology: Why Repetition Beats Reason

Modern cognitive science offers an explanation for why falsehoods persist: the illusory truth effect, in which repeated statements come to seem more accurate simply because they are repeated.

Repetition increases “processing fluency”—the ease of mental processing—which the brain misreads as a signal of accuracy.

Here’s the part that should sober every well-informed person: research indicates that knowledge does not fully protect against this effect.  In controlled studies, people can rate repeated falsehoods as more believable even when the claim contradicts what they know.

Takeaway: Familiarity can masquerade as truth—even in educated minds.

Repetition also influences behavior, not just belief.  Recent experimental work suggests that even a single prior exposure can increase the likelihood that someone will share misinformation, in part because repetition boosts perceived accuracy.

So the modern big lie doesn’t require a dictator’s ministry.  It can thrive inside attention economies where algorithms reward engagement and where “I’ve heard that before” becomes a shortcut for “that must be right.”

The brain’s efficiency trick—using familiarity as a truth cue—becomes a liability in a high-volume media world.

Why “Debunking” Often Fails

If repetition helps falsehoods, why not just repeat corrections?

Because corrections compete with three realities:

·       The first impression advantage.  The initial story is sticky; later updates must contend with an established mental model.

·       Asymmetry of effort.  A lie can be short and emotional; a refutation is often longer, more conditional, and more boring.

·       Identity gravity.  When a claim becomes tribal—“people like us know this”—contradictory facts can feel like social betrayal.

Takeaway: A lie is a story; a correction is often a footnote.