Does Reading Cognitive Bias Research Distort the Mind?


Over break I read The Invention of Science: A New History of the Scientific Revolution by the British historian David Wootton. Wootton writes that modern science was invented between 1572 (when the Danish astronomer Tycho Brahe saw a nova) and 1704 (when Isaac Newton published Opticks). A big part of the revolution was technological. The telescope, barometer, and printing press allowed people to study the world with more precision and share their findings with scholarly communities, such as the Royal Society. But, more importantly, the scientific revolution involved a new set of conceptual tools.

Take the concept of discovery, for instance. Until around the 16th century, most scholars believed that humanity’s greatest achievements were in the past—Aristotle, the preeminent source for all intellectual inquiry, still towered over European thought like a colossus, despite his deeply flawed ideas about the nature of the universe. When Columbus sailed to America in 1492, he did not use the word “discover” to describe what he had done because he was not familiar with the concept. After Amerigo Vespucci used the new word in 1504, it quickly spread into other European languages. Soon, intellectuals of the era began not only to investigate the world in new ways but to treat the world as something to be investigated.

I liked Wootton’s book because it helped me understand something I’ve noticed ever since the Nobel-Prize winning psychologist Daniel Kahneman published Thinking, Fast and Slow in 2011. Kahneman’s book is about the biases that distort judgment: how to identify them and what we can do to avoid them. In the traditional view, emotion is the enemy, and people are otherwise generally rational, their thinking sound. Nearly four decades of decision-making research reveal a different picture. Systematic biases not only undermine the idea that people are rational; they are also largely invisible to us. We are “blind to our blindness,” Kahneman says.

The deeper lesson to glean from this body of research is not just that our decisions are occasionally flawed—we’ve known about our mental foibles since at least the Greeks—or even that the conscious mind is convinced that it’s not flawed. It’s that if the conscious mind functions like a press secretary, someone who does not seek the truth but justifies intuitive judgments and glosses over its own shortcomings and delusions, then we should be very careful when we read a book like Thinking, Fast and Slow. Although it’s easy to grasp the idea that people deviate from the standards of rationality, it’s much harder to resist the belief that reading about this research will automatically help us think more clearly. Learning about judgment errors elicits a feeling of enlightenment, a sense that knowledge has been gained, and that feeling can further distort how we perceive the world.

We do not, in other words, absorb this body of research the way a scientist studies the physical world. Ironically, we interpret decision-making mistakes in a way that makes us look good, often falling prey to the very biases we are advised to avoid, such as overconfidence and the above-average effect. So while readers and students genuinely understand that they’re “biased” in a nominal sense, they conflate learning about how people make decisions with a real improvement in judgment. This is what happens when the object of inquiry is also the tool of inquiry, and when that tool is specifically designed to generate an unjustified sense of righteousness.

When I first noticed this paradox a few years ago, I had a hard time describing it. I used the clumsy term “confirmation bias bias” to describe how learning about decision-making mistakes engenders a myopic view of the mind, as if our persistent tendency to search for confirming evidence inevitably causes us to only view the mind in those same terms. Reading Wootton’s book helped me understand the broader insight. There is a difference between discovering something new, on the one hand, and that discovery changing the way we think, on the other. Much like Columbus discovered America without grasping the broader concept of “discovery,” the judgment and decision-making community has discovered new insights into how the mind makes decisions without considering how those insights affect the mind. Thinking, Fast and Slow is such an intriguing book because, by learning about the details of our intellectual hardware, we change them.

If learning about how our own mind screws up distorts judgment instead of improving it, then the question we should ask is not how the mind works but what it means to have a mind in the first place. One irony of the scientific revolution is that we began to treat the mind as an object of scientific study, just as Brahe and Newton treated novas and beams of light, even though the mind, unlike the physical world, changes each time we examine and scrutinize it; that is, it changes because we examine and scrutinize it. And while we should rely on the scientific method to interpret everything in the natural world, we need to remember where that method was developed, and how that conflict of interest could lead us astray. As the writer and neurologist Robert Burton says, “Our brains possess involuntary mechanisms that make unbiased thought impossible yet create the illusion that we are rational creatures capable of fully understanding the mind created by these same mechanisms.”

So what is a mind? Nailing down that definition represents, I think, one of the central tasks of modern neuroscience. But, more importantly, it is a task that must inform cognitive psychology—if the goal is in fact to correct outdated assumptions about human nature.

The Invention of Science, a survey of how we made discoveries about the world and how those discoveries replaced the conceptual tools we used to perceive it, is a lesson in intellectual humility. It’s a story about the persistent belief that we see the world as it is, on the one hand, and our willingness to test that belief, on the other. The purpose of this essay is to test the belief that you can use your mind to understand your mind, and I’d proceed cautiously if that test elicits a sense of enlightenment. We should expect nothing less from an organ that evolved to produce exactly that feeling.


Daniel Kahneman often says that despite forty years in academia, he still falls for the very same biases that his research has revealed. He is pessimistic, and it might seem from this article that I am, too.

Wootton’s book explained how we began not only to study the physical world but also to recognize that we don’t see it objectively. That is, we began to study the physical world because we recognized that we don’t see it objectively. The fact that we’re talking about biases here in the 21st century suggests that something has changed in the last few hundred years. We now dedicate large swaths of academic work to researching not only the physical world but also how reason can make us perceive the physical world incorrectly. The judgment and decision-making literature is a direct descendant of Descartes, who emphasized reason and reflection over sense experience. In an era when educated people believed in alchemy and witches but not germs, Descartes wanted to get people to improve their beliefs. We have.

More importantly, we’ve dramatically improved how we think, not just what we think. Consider Superforecasting, by the Wharton psychologist Philip Tetlock. Tetlock is famous for publishing a longitudinal study a decade ago that measured “expert prediction.” He found that the professional pundits invited to take part in his study performed no better than chance when they made long-term forecasts about political events, wars, and the like. Tetlock’s new book documents an elite group of people, superforecasters, who have a remarkable track record of making accurate forecasts. They’re really good at answering questions like, “Will there be a terrorist attack in Europe between August 2016 and December 2016?” When Tetlock investigated what made these people so good, he did not find that they were smarter. He found that they possessed better mental tools, which is to say that they used basic probability and statistics to think about the future, avoided ideology, and welcomed dissonance. In short, they did a good job of correcting their biases.
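To make “basic probability and statistics” concrete: forecasting tournaments like Tetlock’s are typically scored with the Brier score, the mean squared difference between a forecaster’s stated probabilities and what actually happened. The essay doesn’t name the metric, so take this as a minimal illustrative sketch with invented numbers, not Tetlock’s data:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.

    0.0 is a perfect score; always answering 50% earns 0.25; 1.0 is worst.
    Confident *and* well-calibrated forecasters score lowest.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecaster: probabilities assigned to four events,
# each of which either occurred (1) or did not (0).
forecasts = [0.9, 0.2, 0.7, 0.1]
outcomes = [1, 0, 1, 0]

print(brier_score(forecasts, outcomes))  # 0.0375
```

The scoring rule rewards exactly the habits the paragraph above describes: hedging everything at 50% caps you at a mediocre score, while overconfident ideology gets punished heavily whenever an event breaks the other way.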

I’ve become wary of judgment and decision-making research: not the research itself, but the way people talk about it and the way it is reported online and in print. I suppose you could say that even though the JDM community has done a tremendous job explaining how people actually make decisions, it has not fulfilled its promise to explain “how the mind works,” as so many subtitles seem to suggest. My impetus for writing the article was to recommend that the JDM community be mindful of meta-biases and remember its relatively small slice of the broader cognitive science pie, and the fact that cognitive science is so young. We still know so little about the mind and the brain, perhaps about as much as Columbus knew about the New World.

Ironically, it’s those catchy subtitles that got me into this fascinating body of knowledge in the first place.

Image via Flickr/Zoe Rimmer