Dan J. Harkey

Master Educator | Business & Finance Consultant | Mentor

The Big Lie’s Perpetual Life: Part II of III

The Point of Studying the Big Lie



Quoting Mein Kampf is uncomfortable—and it should be.  But the goal is not to amplify a propagandist.  The goal is to name a mechanism that still operates whenever fear, identity, and repetition fuse into “common knowledge.”

The big lie succeeds when people think no one would say something that huge unless it were true.

The antidote is a mature civic reflex: Huge claims require huge evidence, especially when they flatter our side and vilify a target.

Truth does have an advantage.  But it is not automatic.  It must be defended by systems, by norms, and by the daily choices of readers who refuse to confuse familiarity with fact.

Illusory Truth Effect (a.k.a. “truth-by-repetition”): what it is and why it happens

The illusory truth effect is a well-replicated cognitive bias where people judge a statement as truer (or more accurate) simply because they’ve encountered it before, even when repetition provides no new evidence.

Familiarity can feel like truth.

Truth is not helpless.  Truth needs a strategy.

Building Resistance to Falsehoods: A Civic Immune System

The best defenses are not purely informational; they are behavioral and structural—habits and systems that reduce exposure, slow reflex sharing, and raise standards for credibility.

1) Slow the spread with friction

If repetition increases perceived truth and sharing, the simplest intervention is to reduce impulsive repetition—by pausing before reposting, checking the source, and seeking corroboration.

2) Prefer primary sources and transparent methods

Propaganda thrives where verification is difficult.  Nazi messaging worked in part through broad control of communication channels and curated narratives.

Today, the parallel is not always censorship; it’s information overload—so credibility must be earned through traceable evidence.

3) Watch for the “villain + urgency” formula

When a message insists you must act now and identifies a group as the hidden cause of everything, treat it as a warning label, not a call to arms.  Nazi propaganda repeatedly defined enemies and encouraged public indifference to their fate.

Takeaway: If a claim demands instant outrage and offers one scapegoat for many problems, it is advertising—whether political or commercial.

4) Inoculate with media literacy, not just fact lists

The Holocaust Memorial Museum’s educational materials emphasize critical thinking about propaganda’s effects—learning how messages work, not merely what is true.
That approach scales: teach pattern recognition—loaded language, false dilemmas, dehumanization, manufactured victimhood—so the technique becomes visible.

The big lie survives when citizens outsource judgment.  It weakens when ordinary people insist on receipts.

The core mechanism: “processing fluency.”

Psychologists explain the effect mainly through processing fluency—the ease with which your brain reads, recognizes, or thinks about something.

When a statement is repeated, it becomes easier to process, and your brain often (mis)uses that ease as a cue that the statement is reliable or correct. 

Why would the brain do that?  Because in everyday life, familiar information often is accurate (you encounter real facts repeatedly), so fluency is usually a decent shortcut—until it’s exploited by misinformation, advertising, or rumor.

Key research findings (what the evidence shows)

1) Repetition reliably boosts “truth” ratings

Across experiments, repeated claims are rated as more truthful than new claims, even when the evidence remains unchanged. 

2) The biggest jump happens fast—often on the second exposure

When researchers increased repetitions well beyond the usual 1–3, they found that truth judgments rose logarithmically: the largest increase occurred with the second exposure, with diminishing returns thereafter.
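The logarithmic pattern above can be sketched as a toy model. The baseline and gain parameters below are purely illustrative, not values from any study; the point is the shape of the curve—each additional exposure adds less than the one before it—not the scale.

```python
import math

def perceived_truth(exposures: int, baseline: float = 0.50, gain: float = 0.10) -> float:
    """Toy model: perceived truth grows with the log of the exposure count.

    `baseline` and `gain` are hypothetical parameters chosen only to
    illustrate the logarithmic shape reported in the research.
    """
    return baseline + gain * math.log(exposures)

# Successive jumps shrink: the gain from the 1st to the 2nd exposure
# is the largest, and later exposures add progressively less.
jumps = [perceived_truth(n + 1) - perceived_truth(n) for n in range(1, 5)]
print([round(j, 3) for j in jumps])  # prints [0.069, 0.041, 0.029, 0.022]
```

Under this sketch, the second exposure contributes more added "truthiness" than the third, fourth, and fifth combined contribute beyond it—consistent with the diminishing-returns finding.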

3) Knowledge doesn’t fully protect you

A striking finding: people can show the illusory truth effect even when the repeated statement contradicts what they know.
In other words, you can “know better” and still feel the repeated claim is more plausible.

4) Repetition can increase willingness to share misinformation

Recent experimental work suggests that even a single prior exposure can make people more likely to share a statement, and that this is partly because repetition boosts perceived accuracy.

What makes the illusory truth effect stronger?

These factors tend to increase the effect:

  • More repetition (especially early repeats): The effect grows with repeated exposures, though gains taper as repetition increases.
  • Familiar formats and easy processing: Because fluency is the pathway, anything that makes a claim feel easy to process can strengthen perceived truth.
  • Short or moderate delays still show the effect: The effect appears across different time intervals; longitudinal work shows it can persist over time (though it may diminish with longer delays).

What can reduce it (or help you resist it)?

1) Add diagnostic friction: “What’s the evidence?”

Because repetition affects perceptions of truth, one of the best counters is to force a shift from fluency to evidence-based evaluation—asking what source, what data, what corroboration.

2) Don’t rely on “I’ve heard this before.”

A practical mental rule: “Familiar isn’t factual.” This directly targets the cue your brain is using (familiarity/fluency) rather than debating the claim’s content first.

3) Be careful when repeating misinformation—even to debunk it

If you restate a false claim without precise framing, you may inadvertently increase its familiarity.  Research on repetition-induced truth emphasizes that repetition itself is potent, so corrections should minimize unnecessary restatement and foreground the correct information.

4) Use “accuracy prompts” before sharing

Since repetition can increase sharing via perceived accuracy, a good habit is: pause, verify, then share—especially for emotionally charged claims.

A concrete example (everyday, non-political)

Imagine you scroll past the claim:

“Cold showers burn significantly more fat than exercise.”

You see it once—sounds dubious.  You see it again on a different account—your brain processes it faster.  By the third time, it feels like “something I’ve heard,” and that feeling can quietly raise the statement’s credibility in your mind—even if no evidence has appeared.  That is the illusory truth effect in action: familiarity inflates plausibility.

The takeaway

  • Illusory truth effect = repetition → familiarity/fluency → higher perceived truth.
  • It’s strongest early (the second exposure matters a lot).
  • It can occur even when you know the statement is false.
  • It can also increase sharing by boosting perceived accuracy.