Teaching Logical Fallacies Isn’t Teaching How to “Think”

There’s no all-purpose substitute for subject knowledge.

Lucia Bevilacqua
Jun 18, 2021

You’re concerned about the troubling precedent a viewpoint would set? There’s a term for that — the slippery slope fallacy.

You feel it would violate basic human sensibilities? Nice appeal to emotion there.

Your skepticism is shaped by your experience? You must think anecdotes count as valid data.

You spot a conflict of interest that could threaten the credibility of my sources? Don’t you know that’s an ad hominem?

You don’t want the population subjected to an unprecedented experiment? Just an appeal to nature fallacy.

If traditional school subjects fail to “teach critical thinking,” as critics say, what should be taught instead? A common answer is how to identify logical fallacies.

Sure, this can come in handy. Surely we don’t want the young generation swayed by sophistry without substance. A lesson in English class as part of the rhetoric unit couldn’t hurt.

But here’s the more radical implication: just by knowing how to identify fallacies, learners will no longer be blind about what to think. Who needs specific knowledge? With this transferable skill, they’ll supposedly be able to discern “the right side” without knowing anything about an issue.

In reality, truth-seeking doesn’t work this way. A fallacy only means an argument’s reasoning is flawed, not that the claim it’s trying to support is false. Even “the right side” can make fallacious arguments, maybe just as often as “the wrong side” does. (See pro-choice fallacies, pro-life fallacies, religious fallacies, atheist fallacies…)

Picking “the right side” in debatable matters doesn’t have such a handy set of shortcuts. Critical thinkers need to evaluate the relative strength of each side’s evidence. They can’t weigh both sides if they don’t know what’s on both sides.

Simply Googling those arguments is no guarantee they’ll understand them. That’s where traditional subject knowledge comes in:

  • Thinkers need enough reading comprehension to figure out what an English-language source is arguing and why it’s arguing it. (If they can’t even do that for a short passage on a standardized reading test, can we trust them to do it for real-world sources?)
  • A good foundation of history and classic literature allows them to draw patterns in the human condition (“This is like the Spanish flu” or “This is like 1984”), or at least get the reference when others do.
  • Knowing scientific terms allows them to grasp more advanced scientific explanations. Understanding important laws, theories, and mechanisms in physics, chemistry, and biology allows them to question anything that seems downright implausible.
  • There will always be new papers getting published, new headlines to share on social media as “evidence” for your side. Evaluating their quality takes knowledge of study design and statistical interpretation. Of course, this involves more than “correlation ≠ causation” and “small sample sizes are unreliable,” but that’s a good place to start.

And if there’s no real debate? If it’s really important to know what’s settled knowledge? It’s fine just to say so. It doesn’t stifle people’s critical thinking ability; it actually helps them think critically in the future. Successful critical thinking depends on having the right set of priors.

That’s why I support a curriculum rich in knowledge, not generic lessons on fallacies intended to teach “how to think.”
