Scivillage.com Casual Discussion Science Forum

Why the smartest people can make the dumbest mistakes
https://www.popsci.com/the-intelligence-trap/

EXCERPT (from The Intelligence Trap, by David Robson): . . . While decades of psychological research have documented humanity’s more irrational tendencies, it is only relatively recently that scientists have started to measure how that irrationality varies between individuals, and whether that variance is related to measures of intelligence. They are finding that the two are far from perfectly correlated: it is possible to have a very high IQ or SAT score, while still performing badly on these new tests of rationality—a mismatch known as “dysrationalia.” Indeed, there are some situations in which intelligence and education may sometimes exaggerate and amplify your mistakes.

A true recognition of dysrationalia—and its potential for harm—has taken decades to blossom, but the roots of the idea can be found in the now legendary work of two Israeli researchers, Daniel Kahneman and Amos Tversky, who identified many cognitive biases and heuristics (quick-and-easy rules of thumb) that can skew our reasoning.

One of their most striking experiments asked participants to spin a “wheel of fortune,” which landed on a number between 1 and 100, before considering general knowledge questions—such as estimating the number of African countries that are represented in the UN. The wheel of fortune should, of course, have had no influence on their answers—but the effect was quite profound. The lower the number on the wheel, the smaller their estimate—the arbitrary value had planted a figure in their mind, “anchoring” their judgment.

You have probably fallen for anchoring yourself many times while shopping during sales. Suppose you are looking for a new TV. You had expected to pay around $150, but then you find a real bargain: a $300 item reduced to $200. Seeing the original price anchors your perception of what is an acceptable price to pay, meaning that you will go above your initial budget.

Other notable biases include framing (the fact that you may change your opinion based on the way information is phrased), the sunk cost fallacy (our reluctance to give up on a failing investment even if we will lose more trying to sustain it), and the gambler’s fallacy—the belief that if the roulette wheel has landed on black, it’s more likely the next time to land on red. The probability, of course, stays exactly the same.
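The gambler’s fallacy is easy to check numerically: because each spin of a fair wheel is independent, the frequency of red after a black result should match the overall frequency of red. The sketch below (a hypothetical simulation, not from the excerpt; it ignores the green zero pockets of a real roulette wheel for simplicity) makes that point:

```python
import random

def estimate_red_after_black(spins, seed=42):
    """Simulate a fair red/black wheel and compare the overall
    frequency of red with the frequency of red immediately after
    a black result. Independence means the two should agree."""
    rng = random.Random(seed)
    outcomes = [rng.choice(["red", "black"]) for _ in range(spins)]
    overall_red = outcomes.count("red") / spins
    # Collect only the spins that directly follow a black result.
    after_black = [b for a, b in zip(outcomes, outcomes[1:]) if a == "black"]
    red_after_black = after_black.count("red") / len(after_black)
    return overall_red, red_after_black

overall, conditional = estimate_red_after_black(100_000)
print(f"P(red) = {overall:.3f}, P(red | previous black) = {conditional:.3f}")
```

With enough spins both estimates settle near 0.5, so a run of black tells you nothing about the next spin.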

Given these findings, many cognitive scientists divide our thinking into two categories: “system 1,” intuitive, automatic, “fast thinking” that may be prey to unconscious biases; and “system 2,” “slow,” more analytical, deliberative thinking. According to this view—called dual-process theory—many of our irrational decisions come when we rely too heavily on system 1, allowing those biases to muddy our judgment.

It is difficult to overestimate the influence of this work, but none of the early studies by Kahneman and Tversky had tested whether our irrationality varies from person to person. Are some people more susceptible to these biases, while others are immune, for instance? And how do those tendencies relate to our general intelligence? Conan Doyle’s story [Houdini, Atlantic City] is surprising because we intuitively expect more intelligent people, with their greater analytical minds, to act more rationally—but as Tversky and Kahneman had shown, our intuitions can be deceptive. If we want to understand why smart people do dumb things, these are vital questions.

During a sabbatical at the University of Cambridge in 1991, a Canadian psychologist called Keith Stanovich decided to address these issues head on. With a wife specializing in learning difficulties, he had long been interested in the ways that some mental abilities may lag behind others, and he suspected that rationality would be no different. The result was an influential paper introducing the idea of dysrationalia as a direct parallel to other disorders like dyslexia and dyscalculia.

It was a provocative concept—aimed as a nudge in the ribs to all the researchers examining bias. “I wanted to jolt the field into realizing that it had been ignoring individual differences,” Stanovich told me. Stanovich emphasizes that dysrationalia is not just limited to system 1 thinking. Even if we are reflective enough to detect when our intuitions are wrong, and override them, we may fail to use the right “mindware”—the knowledge and attitudes that should allow us to reason correctly. If you grow up among people who distrust scientists, for instance, you may develop a tendency to ignore empirical evidence, while putting your faith in unproven theories. Greater intelligence wouldn’t necessarily stop you forming those attitudes in the first place, and it is even possible that your greater capacity for learning might then cause you to accumulate more and more “facts” to support your views.

Stanovich has now spent more than two decades building on the concept of dysrationalia with a series of carefully controlled experiments. To understand his results, we need some basic statistical theory... (MORE - details)