The Knowledge Illusion: We Know Less Than We Think
Individual human knowledge is remarkably shallow. We walk through life believing we understand how things work, but our sense of understanding is an illusion that collapses the moment we try to explain anything in detail.
"Our point is not that people are ignorant. It's that people are more ignorant than they think they are. We all suffer, to a greater or lesser extent, from an illusion of understanding, an illusion that we understand how things work when in fact our understanding is meager."
Sloman and Fernbach call this the "illusion of explanatory depth." Ask someone to rate how well they understand a toilet, a zipper, or a bicycle, and they will report confident understanding. Then ask them to explain, step by step, how it actually works. Their confidence collapses. Before trying to explain something, people feel they have a reasonable level of understanding; after explaining, they don't. This is not an edge case; it is the default state of human cognition. Most of what we call knowledge is little more than a bundle of associations, high-level links between concepts that we have never broken down into real causal stories.
The reason is evolutionary. The mind did not evolve to be a hard drive storing comprehensive models of reality. It evolved to extract only the most useful information for guiding action. Detailed understanding is expensive and usually unnecessary, because we can offload knowledge to the environment and to other people. "Our intelligence resides not in individual brains but in the collective mind." We are like bees in a hive: impressive as a colony, limited as individuals. The knowledge illusion persists because we fail to draw an accurate line between what we know inside our own heads and what we are borrowing from tools, communities, and culture.
This illusion has real consequences for argumentation. People hold strong opinions on complex policy issues (healthcare, climate, economics) while possessing almost no mechanistic understanding of the systems involved. The cure is simple but painful: try to explain your position in causal detail. The act of explaining is the fastest way to discover you do not actually understand.
Takeaway: Before forming a strong opinion, try to explain the mechanism behind your position step by step; the gaps you discover will humble you more than any counterargument.
See also: Reason Evolved for Argumentation Not Truth | Communication Usually Fails Except by Accident | Epistemic Legibility Not Everything Can Be Made Explicit | Wittgenstein's Ruler Measures the Measurer