Why How We Explain Intelligence Matters


In Divine Fury, Darrin McMahon chronicles the intellectual history of genius, from the ancient Greeks to modernity. In one of the earliest discussions of genius, Socrates argues that the greatest poets and rhapsodists are not uniquely talented but conduits who receive special “in-breathing,” or inspiration, from the Muses. We didn’t know how to explain creative prowess, so we outsourced it to the gods.

As religion waned during the Enlightenment, the idea that genius arrived from an outside source lost its influence. The genius still possessed something special, but it emerged from the mind, not from divine intervention.

Today geniuses are ubiquitous. David Shenk’s The Genius in All of Us and Malcolm Gladwell’s Outliers suggest that eminent success in art, athletics, and business is not reserved for the genetically endowed but correlates with thousands of hours of deliberate practice. “Once genius was born,” McMahon writes. “Now it is (self)-made.”

McMahon’s illustrious history nicely tracks how the perception of genius shifted from the otherworldly to the terrestrial and eventually the cognitive—some neuroscientists are even trying to explain intelligence neuron by neuron—yet it omits one of the most important developments in the science of intelligence.

 

In the last few decades, psychologists have not only begun to reveal how intelligence actually works; they’ve also demonstrated that beliefs about intelligence influence performance. In this view, how well we perform in school or at work is not just a function of how smart we are, but of how smart we think we are.

Take, for example, the story of Kewauna Lerma, which I discovered in Paul Tough’s compelling book on education, How Children Succeed. Kewauna was born into poverty on the South Side of Chicago. Having never learned to read well, she struggled in school.

The family hopscotched around the country, from Mississippi to Minnesota and back to Chicago. Despite the moves—or perhaps because of them—Kewauna fell further behind in school. In sixth grade she collected seventy-two referrals for poor behavior by the middle of the year. She was assigned to the slow class, where she was essentially “warehoused.” A few weeks before the end of the year, she was kicked out for fighting.

Before Tough wrote How Children Succeed, he had been reporting on children growing up in poverty. Sadly, he was familiar with Kewauna’s story because he had heard it so many times. “Every unhappy family may be unhappy in its own way,” he writes, “but in families that stay trapped in poverty for generations, the patterns can become depressingly familiar, a seemingly endless cycle of absent or neglectful parents, malfunctioning schools, and bad decisions.”

Kewauna managed to escape the cycle. Before her sophomore year of high school, her mother intervened. She pleaded with Kewauna not to “end up like me.” She begged her to avoid unplanned pregnancies and to make going to college and having a career a priority. Most of all, she wanted Kewauna to succeed in school.

In the beginning of How Children Succeed, Tough discusses what he terms the cognitive hypothesis—the idea that what matters most for children early on is how much information they can stuff into their brains. The more a child is deprived of intellectual stimulation, the data shows, the less he will want to learn and succeed later in life.

While the stark calculus behind the cognitive hypothesis is valid—more general knowledge equals a smarter brain—cramming the mind with facts is only part of the story. For starters, it doesn’t explain Kewauna’s unexpected turnaround. Her GPA jumped from 1.8 to 3.4 between her first and second years of high school. As a junior, she was a straight-A student enrolled in four honors courses. She had not acquired more knowledge or gained IQ points, yet in barely a year she went from failing to flourishing. What happened?

 

Kewauna possessed two things her mother never had. The first was support. Nearly every student born into rough circumstances who succeeded later in life received assistance with noncognitive traits: avoiding distraction, managing emotions, staying persistent. These self-regulatory skills are hugely important and can be developed with help from a conscientious mentor.

Kewauna also learned how to handle failure. When her mother, Marla McConico, scored poorly on the ACT in the late 1980s, she gave up on her academic career. “After I got those scores back, I felt so like a failure,” she told Tough. Kewauna revived her own academic prospects by learning from her setbacks instead of attributing them to a lack of intelligence and quitting. Unlike her mother, she knew obstacles could be advantages in disguise.

Kewauna’s breakthrough nicely supports research showing that how students perceive intelligence correlates with how well they persevere through adversity. Across a swath of studies, Carol Dweck has demonstrated that students are more persistent when they believe that talent and ability are developed through hard work. Students who believe that intelligence is immutable tend to give up on difficult problems more easily. Numerous twin studies show that intelligence is indeed highly heritable, but the belief itself still shapes how students respond to setbacks. Dweck’s research is a testament to the power of perception.

Noncognitive factors like grit, resilience, hope, support, mindset, and optimism correlate with success. I can’t help but wonder about the millions of people who, by virtue of living in an era that explained performance based on the Muses or a fixed theory of intelligence, never got close to their potential. Pure brain power certainly matters—IQ is a good predictor of success later in life—but when people believe that greatness in the arts, sciences, and business is solely a function of inborn talent, they are less likely to develop their skills. This outdated model was the default way of thinking for millennia. The sad truth is that we’re still living with it.

 


Image via Flickr/Dan Foy