What is Rationality?


The Story of Intel

Semiconductor giant Intel was originally a memory chip manufacturer. But by 1985, memory chips had been losing them money for years. Andy Grove (CEO) and Gordon Moore met to discuss the problem. The mood was grim. At one point, Andy turned to Gordon and asked, “If we get kicked out and the board brings in a new CEO, what do you think he would do?”
Gordon replied without hesitation. “He would get us out of the memory business.”
“Okay,” said Andy. “Then why shouldn’t you and I walk out the door, come back, and do it ourselves?”
That year, Andy and Gordon shifted the focus of the company to microprocessors, and created one of the greatest success stories in American business.
Andy and Gordon had been held hostage by a well-documented cognitive bias known as the “commitment effect” — in this case, an irrational commitment to their past identity as a memory chip manufacturer. They overcame this bias, and made the winning decision, by employing a specific mental habit: that of looking at a problem as if you are an outside party to it.
During the last 40 years, psychologists have discovered that even high-IQ people like Andy Grove and Gordon Moore are prone to certain kinds of errors when thinking and deciding. Fortunately, scientists have also discovered particular mental habits that can help us avoid these mistakes, and thereby make better decisions. The Center for Applied Rationality exists both to teach these “rationality skills” to students and to advance humanity’s scientific understanding of how to improve human judgment and decision-making.

What Rationality Is Not

For some people, the word “rationality” is associated with being judgmental, with not having emotions, with never relying on intuition, or with valuing only quantifiable things like money and ignoring qualitative things like happiness or love — like Mr. Spock from Star Trek.
But that’s not what a cognitive scientist means when he or she uses the word “rationality.” In the scientist’s sense of “rationality,” a person of above-average rationality will still have emotions, will still value happiness and love, will often rely on intuition, and need not be judgmental.
That’s because “rationality” has a technical meaning in cognitive science, one that doesn’t have anything to do with Mr. Spock.

Rationality = Logic + Probability Theory + Rational Choice Theory

For a cognitive scientist, “rationality” is defined in terms of what a perfect reasoner would look like. Humans aren’t perfect reasoners — not even close! — but knowing what a perfect reasoner would look like can help us understand what constitutes an “improvement” to our thinking and decision making.
First, a perfect reasoner wouldn’t believe two contradictory things at the same time. (That is, its beliefs would obey the laws of logic.)
Second, a perfect reasoner wouldn’t have contradictions in its degrees of belief, in how confident it is about various things. (That is, its degrees of belief would obey the laws of probability theory.) For example, humans often think that a story is more plausible as details are added to it, but a perfect reasoner would know that adding details can never make a story more plausible — each added detail is one more thing that could be false.
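This “conjunction rule” — P(A and B) ≤ P(A) — can be seen in a minimal simulation. The population figures below (50% over thirty, 1% bank tellers) are purely illustrative assumptions, not data from any study:

```python
# Sketch of the conjunction rule: adding a detail to a claim can never
# make it more probable, because P(A and B) <= P(A).
import random

random.seed(0)

def draw_person():
    """Draw one person from a toy population with made-up proportions."""
    over_30 = random.random() < 0.5        # assume 50% are over thirty (A)
    bank_teller = random.random() < 0.01   # assume 1% are bank tellers (B)
    return over_30, bank_teller

trials = 100_000
count_a = 0          # how often A holds
count_a_and_b = 0    # how often the more detailed story (A and B) holds

for _ in range(trials):
    a, b = draw_person()
    count_a += a
    count_a_and_b += a and b

p_a = count_a / trials
p_a_and_b = count_a_and_b / trials

# The more detailed story is never the more probable one.
assert p_a_and_b <= p_a
print(f"P(A) = {p_a:.3f}, P(A and B) = {p_a_and_b:.3f}")
```

However the illustrative numbers are chosen, the estimated frequency of the conjunction can only come out at or below the frequency of the single claim.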
Third, a perfect reasoner wouldn’t make choices that are inconsistent with its own beliefs and desires. (That is, its choices would obey the laws of rational choice theory.) Unfortunately, humans often violate the laws of rational choice theory. For example, we may believe that staying on our diet will bring us what we desire (better health, a slimmer body), and yet we still fail to follow our diets!
Applied Rationality is about doing the best we can to solve problems like these, using the best knowledge we can find about how our minds really work and how to change them.

What is “Applied Rationality”?

Humans can’t be perfect reasoners because our brains have limited processing capacity, and because our brains are evolved kluges of “spaghetti code” rather than optimally-designed reasoning machines.

We commit reasoning errors so often that psychologists have given names to many of the most common errors: names like “confirmation bias” and “conjunction fallacy.”

The good news is that discovering how the brain produces these errors has given us some guidance in figuring out how we can make errors less often. We can’t become perfect reasoners, but we can certainly improve. By learning and practicing useful mental habits like “look at the problem as if you are an outside party to it,” we can become better decision-makers, and thereby achieve our goals more often — at school, at work, in our relationships, and in all of life.