Imagine that you contract an illness and a doctor prescribes you an antibiotic with directions to take one pill a day for 10 days. You take the pill for the first few days, feel better, and then decide to stop. Here's what happens next: the pills killed the weak bacteria, but the strong ones remain, multiply, and the illness comes back stronger. You're worse off than before. This error occurs frequently, and it is not limited to medicine.
In many domains, humans tend to equate a marginal improvement with long-run security, when such improvements only conceal more consequential harm down the road. For example, research demonstrates that drivers drive more aggressively when they are wearing a seat belt. Financial risk management similarly encourages us to take larger risks. The same thing occurs in education. A degree gives us some knowledge, but it also breeds overconfidence and what psychologists term the illusion of understanding—the tendency to equate familiarity with a topic with full knowledge of it—which can lead to larger errors.
The irony is that as the world becomes more complex, this thinking mistake worsens. As complexity increases, instead of critically examining the reality of a situation, we hire consultants, take more pills, or implement small changes that deliver only marginal gains. The result is often that we end up worse off than our initial position, to which we respond by repeating the same mistake. This vicious cycle is not inevitable—in some domains, like the airline industry, where flying becomes safer after each accident, we see the opposite—but it is an unmistakable trend, visible across many fields.
Just as removing street signage causes drivers to think more carefully (and have fewer accidents), a difficult-to-read font forces us to think more critically (and make fewer errors). See disfluency, page 41 in The Laws of Subtraction.