The Myth of Perfect Information


I want to tell you about some research that will change the way you think about thinking.

Imagine you’re about to interview someone for an important job. Your colleague informs you the candidate is intelligent, industrious, impulsive, critical, stubborn, and envious. You might picture someone who knows what he wants. He might be occasionally impatient and forceful, but he is hardworking and ambitious. He puts his intelligence to good use.

Now imagine you’re about to interview someone else for the same job. This time, your colleague tells you the candidate is envious, stubborn, critical, impulsive, industrious, and intelligent. You might picture someone with a “problem.” Although he is intelligent, the candidate is prone to moments of rage and jealousy. His bad qualities will surely overshadow his lighter side.

In 1946, the American psychologist Solomon Asch gathered 58 participants and split them into two groups. The first group read about a person who was intelligent, industrious, impulsive, critical, stubborn, and envious. The second group read about the same person, except the qualities were listed in reverse order. That small change was enough: participants imagined an entirely different person. Qualities the first group perceived as positive (impulsive and critical) struck the second group as negative.

Asch was not the first person to notice that we make unreliable snap judgments based on limited information. Just about every philosopher and writer has commented on our malleable social intuitions.

Asch was one of the first scientists to empirically show that there is no such thing as neutral information. Even though his experiment revealed a quirk in how we evaluate other people—the study was published in a journal dedicated to social psychology—his findings apply to nearly every aspect of life. How information is ordered and how it is framed will invariably influence our judgment one way or another.

For instance, we tend to judge a bike ride from Maine to Florida as shorter than the same ride from Florida to Maine, as if gravity helps us on the way down. We’re more likely to order the expensive beer when it is placed next to a lite beer, yet we’re more partial to the lite beer when it is placed next to a cheap beer marketed as “premium.” A $60,000 salary feels different in a company where everyone else makes $80,000 than in one where everyone else makes $40,000. And a painkiller that you’re told costs $2.50 will reduce pain more than the same pill you’re told costs $0.10; how effective a medicine turns out to be depends critically on how effective you think it is.

Even when we process a single piece of information (imagine being told only that a candidate is “intelligent,” or a menu that offers just one beer), the information will not be neutral. Without external reference points, we supply our own, and we’ll evaluate the same trait or price differently depending on what we compare it to.

It’s worth pausing to appreciate this insight. In Thinking, Fast and Slow, Daniel Kahneman discusses cognitive biases as they relate to the economic standard of rationality. In this view, a bias is a deviation from the rational ideal. It’s what happens in the checkout lane, on a trading floor, or on fourth down.

The implication of Asch’s study is that the idea of a neutral choice environment, in which the layout of a menu or the font of an email does not sway the reader one way or another, is a myth. If that’s right, then no matter how hard you flex your cognitive muscles, you will never process information without distorting it, not just because the mind is biased, but because the information is biased as well.

The lesson for anybody who depends on customers should be obvious. Be mindful of how you present the facts; they will nudge customers one way or another. Williams-Sonoma once boosted the sales of a $279 breadmaker simply by placing it next to a somewhat bigger model priced at $429. We’re more likely to buy a $200 printer with a $25 rebate than the same printer priced at $175. Despite what you heard in economics class, consumers really don’t know what most goods should cost.

The second lesson is for everyone else. If you’re still wondering whether there is such a thing as neutral information, good. The moral of Thinking, Fast and Slow and every other book in that aisle is not that we occasionally mess up. It’s that a dose of epistemic humility can go a long way.

The sneakiest trick the human brain plays is convincing us that we see the world as it is. It’s almost as if the brain and the mind have a contractual relationship, in which the mind has agreed to believe the worldview the brain creates, and in return the brain has agreed to create a worldview the mind wants. La Rochefoucauld was right: “Nothing can comfort us when we are deceived by our enemies and betrayed by our friends; yet we are often happy to be deceived and betrayed by ourselves.”