Correlation implies causation (logical fallacy)

From Academic Kids

Correlation implies causation, also known as cum hoc ergo propter hoc (Latin for "with this, therefore because of this") and false cause, is a logical fallacy by which two events that occur together are claimed to be cause and effect.

For example:

Teenage boys eat lots of chocolate.
Teenage boys have acne.
Therefore, chocolate causes acne.

This argument, and any argument of this pattern, is an example of a false categorical syllogism. One objection is that the fallacy ignores the possibility that the correlation is a coincidence. But we can always pick an example where the correlation is as robust as we please: if chocolate-eating and acne were strongly correlated across cultures, and remained strongly correlated for decades or centuries, the correlation would probably not be a coincidence. Even then, the fallacy ignores the possibility that there is a common cause of eating chocolate and having acne. See joint effect.

For example:

Ice-cream sales are strongly (and robustly) correlated with crime rates.
Therefore, ice-cream causes crime.

The above argument commits the cum hoc ergo propter hoc fallacy, because in fact the explanation is that high temperatures increase crime rates (presumably by making people irritable) as well as ice-cream sales.
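The common-cause structure behind this example can be checked numerically. The sketch below is illustrative only: "temperature", "sales", and "crime" are simulated made-up variables, not real data. It lets temperature drive both ice-cream sales and crime, and shows that the two effects come out strongly correlated even though neither causes the other:

```python
import random

random.seed(0)

# Simulate a hidden common cause: daily temperature drives both
# ice-cream sales and crime, each with its own independent noise.
# Neither effect influences the other directly.
n = 10_000
temps = [random.gauss(20, 8) for _ in range(n)]
sales = [t + random.gauss(0, 3) for t in temps]
crime = [t + random.gauss(0, 3) for t in temps]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Strongly positive, despite there being no direct causal link.
print(round(corr(sales, crime), 2))
```

Mistaking this induced correlation for a direct causal link is exactly the cum hoc ergo propter hoc fallacy.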

Another possibility the fallacy ignores is that the causation runs in the opposite direction.

For example:

Gun ownership is correlated with crime.
Therefore, gun ownership leads to crime.

The facts could easily be the other way round: an increase in crime could lead concerned citizens to buy more guns. See: wrong direction.

The statement "correlation does not imply causation" notes that it is dangerous to deduce causation from a statistical correlation. If you only have data on A and B, a correlation between them does not let you infer that A causes B, or vice versa, much less 'deduce' the connection. But if there is a common cause and you have data on it as well, you can often establish the correct structure. Likewise (and perhaps more usefully) if you have a common effect of two independent causes.
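A common effect behaves in a complementary way: two causes that are independent become correlated once you condition on (that is, select by) their joint effect, which is part of what makes common effects useful for inferring structure. The sketch below uses made-up simulated variables to illustrate this:

```python
import random

random.seed(1)

# Two independent causes, A and B, and their common effect E = A + B.
n = 10_000
a = [random.gauss(0, 1) for _ in range(n)]
b = [random.gauss(0, 1) for _ in range(n)]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Unconditionally, the two causes are uncorrelated (near zero).
print(round(corr(a, b), 2))

# Condition on the common effect by selecting cases where E is large.
sel = [(x, y) for x, y in zip(a, b) if x + y > 1]
a_sel = [x for x, _ in sel]
b_sel = [y for _, y in sel]

# Among the selected cases the causes are negatively correlated:
# given a large sum, a small A forces a large B, and vice versa.
print(round(corr(a_sel, b_sel), 2))
```

This induced dependence under conditioning is what distinguishes a common effect from a common cause, and so helps orient the causal arrows.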

While the advice is often ignored, it is also often overstated, as if to say there is no way to infer causal structure from statistical data. Clearly we should not conclude that ice-cream causes criminal tendencies (or that criminals prefer ice-cream to other refreshments), but the story above shows that we can expect the correlation to point us towards the real causal structure. Robust correlations often imply some sort of causal story, whether a common cause or something more complicated. Hans Reichenbach proposed the Principle of the Common Cause, which asserts, roughly, that robust correlations have causal explanations: if there is no causal path from A to B (or vice versa), then there must be a common cause, though possibly a remote one.

Reichenbach's principle is closely tied to the Causal Markov condition used in Bayesian networks. The theory underlying Bayesian networks sets out conditions under which causal structure can be inferred, given not only correlations but also partial correlations. Partial correlations are what make the difference. For example, once temperature is taken into account, the correlation between ice-cream sales and crime rates vanishes, which is consistent with a common cause (though not diagnostic of that alone).
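The vanishing of the ice-cream/crime correlation once temperature is considered can be illustrated with the same toy model. The partial-correlation formula below is the standard one; all numbers are simulated, not real data:

```python
import random

random.seed(0)

# Toy model: temperature is a common cause of ice-cream sales and crime.
n = 10_000
temps = [random.gauss(20, 8) for _ in range(n)]
sales = [t + random.gauss(0, 3) for t in temps]
crime = [t + random.gauss(0, 3) for t in temps]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Partial correlation of sales and crime, controlling for temperature:
# (r_sc - r_st * r_ct) / sqrt((1 - r_st^2) * (1 - r_ct^2))
r_sc = corr(sales, crime)
r_st = corr(sales, temps)
r_ct = corr(crime, temps)
partial = (r_sc - r_st * r_ct) / ((1 - r_st**2) * (1 - r_ct**2)) ** 0.5

print(round(r_sc, 2))     # raw correlation: large
print(round(partial, 2))  # partial correlation: near zero
```

That the dependence disappears once the common cause is conditioned on is exactly the pattern the Causal Markov condition predicts for this structure.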

In statistics literature this issue is often discussed under the headings of spurious correlation and Simpson's paradox.

David Hume argued that causality cannot be perceived (and therefore cannot be known or proven), and that we can only perceive correlation. However, we can use the scientific method to rule out false causes.

Humorous example

An entertaining demonstration of this fallacy once appeared in an episode of The Simpsons (Season 7, "Much Apu about Nothing"):

Homer: Not a bear in sight. The "Bear Patrol" must be working like a charm!
Lisa: That's specious reasoning, Dad.
Homer: Thank you, dear.
Lisa: By your logic I could claim that this rock keeps tigers away.
Homer: Oh, how does it work?
Lisa: It doesn't work.
Homer: Uh-huh.
Lisa: It's just a stupid rock. But I don't see any tigers around, do you?
Homer: Lisa, I want to buy your rock.
