Commonplace
Rationality: What It Is, Why It Seems Scarce, Why It Matters
Steven Pinker, 2021
Collected excerpts, snippets, and things of interest from Rationality by Steven Pinker.
Nicholas Molina
May 20, 2025
Why I read this book
I don’t remember exactly what prompted me to pick up Rationality. It’s been a while. Most likely I saw Steven Pinker promoting it on Real Time with Bill Maher. I’d heard good things about his earlier books, and I’m generally drawn to work about thinking—how we reason, where we go wrong, and what it would look like to do better. A book explicitly about rationality sounded like a safe bet.
Should you read this book?
Perhaps.
It’s a good book, but an overly ambitious one. Its main strength—trying to gather the many strands of rationality into a single volume—is also its weakness. Entire disciplines like logic, probability, and game theory are each given a chapter. The explanations are clear, but the compression is unavoidable. Big, difficult subjects end up feeling rushed.
The book is strongest when it’s making the case for rationality rather than explaining it. I most enjoyed the sections at the intersection of rationality and morality, including discussions of taboos like forbidden base rates and heretical counterfactuals. Pinker’s treatment of the reproducibility crisis was also refreshingly candid. If you’re already familiar with the basics and want a broad, thoughtful synthesis, it’s worth reading. If you’re looking for depth in any one area, you may be better served elsewhere.
Commonplace
Confirmation bias: the bad habit of seeking evidence that ratifies a belief and being incurious about evidence that might falsify it.
Intuitive probability is driven by imaginability: the easier something is to visualize, the likelier it seems.
[Tversky and Kahneman argued people’s intuitive sense of probability is] driven by representative stereotypes and available memories rather than by systematic reckoning of possibilities.
Kahneman observed that humans are never so irrational as when protecting their pet ideas.
As excellent as our visual systems are, rational pilots know when to discount them and turn their perception over to instruments. And as excellent as our cognitive systems are, in the modern world we must know when to discount them and turn our reasoning over to instruments—the tools of logic, probability, and critical thinking that extend our powers of reason beyond what nature gave us.
A rational argument… must attain that goal not by doing something that just happens to work there and then, but by using whatever knowledge is applicable to the circumstances.
“Romeo wants Juliet as the filings want the magnet; and if no obstacles intervene he moves toward her by as straight a line as they. But Romeo and Juliet, if a wall be built between them, do not remain idiotically pressing their faces against its opposite sides like the magnet and the filings with the card. Romeo soon finds a circuitous way, by scaling the wall or otherwise, of touching Juliet’s lips directly. With the filings the path is fixed; whether it reaches the end depends on accidents. With the lover it is the end which is fixed; the path may be modified indefinitely.”
— William James
“What the Tortoise Said to Achilles.” Lewis Carroll, 1895
The very fact of interrogating the concept of reason using reason presupposes the validity of reason. Because of this circularity, it’s not quite right to say that we should “believe in” reason or “have faith in” reason. As Nagel points out, that’s “one thought too many.” The masons (and the Masons) got it right: we should follow reason.
Disagreement is necessary in deliberations among mortals. As the saying goes, the more we disagree, the more chance there is that at least one of us is right.
Reason is the means to an end, and cannot tell you what the end should be, or even that you must pursue it.
The evolutionary process selects for genes that lead organisms to have as many surviving offspring as possible in the kinds of environments in which their ancestors lived. They do so by giving us motives like hunger, love, fear, comfort, sex, power, and status. Evolutionary psychologists call these motives “proximate,” meaning that they enter into our conscious experience, and we deliberately try to carry them out. They can be contrasted with the “ultimate” motives of survival and reproduction, which are the figurative goals of our genes—what they would say they wanted if they could talk.
Life is a never-ending gauntlet of marshmallow tests, dilemmas that force us to choose between a sooner small reward and a later large reward.
Philip Tetlock’s three taboos:
- Forbidden base rate – arises from the fact that no two groups of people—men and women, Blacks and whites…—have identical averages on any trait one cares to measure.
- Taboo tradeoff – resources are finite in life and tradeoffs unavoidable. … Putting a dollar value on a human life is repugnant, but it’s also unavoidable.
- Heretical counterfactual – Built into rationality is the ability to ponder what would happen if some circumstances were not true…
“If your belief system contains a contradiction, you can believe anything.”
“A foolish consistency is the hobgoblin of little minds” — Emerson
Informal fallacies:
- Straw man – Replace opponent proposition with one that is easier to attack
- Special pleading – Replace your own proposition with one that is easier to defend
- Motte-and-bailey – Advance a bold, hard-to-defend claim (the bailey), then retreat to a modest, easily defended one (the motte) when challenged
- No true Scotsman – Redefine to exclude counterexamples
Guilt by association (errors of example): … Lifesaving knowledge in public health, including the carcinogenicity of tobacco smoke, was originally discovered by Nazi scientists, and tobacco companies were all too happy to reject the smoking–cancer link because it was “Nazi science.”
In large swaths of academia and journalism the fallacies are applied with gusto, with ideas attacked or suppressed because their proponents, sometimes from centuries past, bear unpleasant odors and stains. The practice reflects a shift in one’s conception of the nature of beliefs: from ideas that may be true or false to expressions of a person’s moral and cultural identity.
With the fallacies, one is surrendering to feelings that have no bearing on the truth of the claim.
One reason logic will never rule the world is the fundamental distinction between logical propositions and empirical ones, which Hume called “relations of ideas” and “matters of fact,” and philosophers call analytic and synthetic.
It’s often said that the Scientific Revolution of the seventeenth century was launched when people first appreciated that statements about the physical world are empirical and can be established only by observation, not scholastic argumentation.
The contrast between the ecological rationality that allows us to thrive in a natural environment and the logical rationality demanded by formal systems is one of the defining features of modernity.
Most of our everyday concepts turn out to be family resemblance categories, not the “classical” or “Aristotelian” categories that are easily stipulated in logic.
Neural networks thus provide clues about the portion of human cognition that is rational but not, technically speaking, logical.
Human rationality is a hybrid system. The brain contains pattern associations that soak up family resemblances and aggregate large numbers of statistical clues. But it also contains a logical symbol manipulator that can assemble concepts into propositions and draw out their implications. Call it System 2, or recursive cognition, or rule-based reasoning.
An essential part of rationality is dealing with randomness in our lives and uncertainty in our knowledge.
Mistaking a random pattern for a nonrandom process is one of the thickest chapters in the annals of human folly, and knowing the difference between them is one of the greatest gifts of rationality that education can confer.
Paul Slovic… showed that people also overestimate the danger from threats that are novel, out of their control, human-made, and inequitable.
Body-counting data scientists are often perplexed at the way that highly publicized but low-casualty killings can lead to epochal societal reactions.
[The George Floyd protests] were driven in part by the impression that African Americans are at serious risk of being killed by the police. Yet as with terrorism and school shootings, the numbers are surprising. A total of 65 unarmed Americans of all races are killed by the police in an average year, 23 of them African American: around three tenths of one percent of the 7,500 African American homicide victims.
A communal outrage inspires what the psychologist Roy Baumeister calls a victim narrative: a moralized allegory in which a harmful act is sanctified, the damage consecrated as irreparable and unforgivable.
The press is an availability machine… Since news is what happens, not what doesn’t happen, the denominator in the fraction corresponding to the true probability of an event—all the opportunities for the event to occur, including those in which it doesn’t—is invisible, leaving us in the dark about how prevalent something really is.
News sites could have run the headline 137,000 People Escaped Poverty Yesterday every day for the past twenty-five years.
Availability-driven ignorance can be corrosive.
Journalists should put lurid events in context. A killing or plane crash or shark attack should be accompanied by the annual rate, which takes into account the denominator of the probability, not just the numerator.
Once we expect a pattern, we seek out examples and ignore the counterexamples.
[Start a mailing list with 100,000 people and send out two opposite predictions of the market, one to each half. Every quarter, cull the people who received the wrong prediction. After a year and a half you’d have 1,562 people amazed at your track record of predicting six quarters in a row.] This scam is illegal if carried out knowingly; when it’s carried out naively it’s the lifeblood of the finance industry.
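The scam’s arithmetic is just repeated halving, which a few lines make concrete (the function name is mine):

```python
# Sketch of the newsletter scam: each quarter, the half of the list that
# received the wrong prediction is culled, so the survivors have seen
# nothing but correct calls.
def survivors(initial, quarters):
    """Recipients left who have seen only correct predictions."""
    n = initial
    for _ in range(quarters):
        n //= 2  # drop the half sent the wrong prediction
    return n

print(survivors(100_000, 6))  # 1562 people who saw six correct calls in a row
```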
[Bayesian reasoning translated to common sense works like this.] Now that you’ve seen the evidence, how much should you believe the idea? First, believe it more if the idea was well supported, credible, or plausible to start with… Second, believe the idea more if the evidence is especially likely to occur when the idea is true… And third, believe it less if the evidence is commonplace.
[The major ineptitude in our Bayesian reasoning: we neglect the base rate, which is usually the best estimate of the prior probability.]
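Those three rules of thumb are Bayes’ rule in words. A minimal numeric sketch, with hypothetical numbers (a 1 percent base rate, 90 percent sensitivity, 9 percent false-positive rate), shows how badly intuition fares when the base rate is neglected:

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: P(idea | evidence)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Hypothetical screening test: 1% base rate, 90% sensitivity, 9% false alarms.
# A positive result still leaves the hypothesis unlikely, because true cases
# are so rare to begin with.
print(round(posterior(0.01, 0.90, 0.09), 3))  # 0.092, not the ~0.9 intuition suggests
```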
[On the replication crisis] If a preposterous claim could get published in a prestigious journal by an eminent psychologist using state-of-the-art methods subjected to rigorous peer review, what does that say about our standards of prestige, eminence, rigor, and the state of the art?
The most notorious replication failures come from studies that attracted attention because their findings were so counterintuitive.
[John Ioannidis, “Why Most Published Research Findings Are False,” 2005] A big problem is that many of the phenomena that biomedical researchers hunt for are interesting and a priori unlikely to be true, requiring highly sensitive methods to avoid false positives, while many true findings, including successful replication attempts and null results, are considered too boring to publish.
The stage for forbidden base rates is set by a law of social science. Measure any socially significant variable: test scores, vocational interests, social trust, income, marriage rates, life habits, rates of different types of violence. Now break down the results by the standard demographic dividers: age, sex, race, religion, ethnicity. The averages for the different subgroups are never the same, and sometimes the differences are large. Whether the differences arise from nature, culture, discrimination, history, or some combination is beside the point: the differences are there.
If an ethnic group or a sex has been disadvantaged by oppression in the past, its members may be saddled with different average traits in the present. If those base rates are fed into predictive formulas that determine their fate going forward, they would lock in those disadvantages forever.
The more specific the reference class, of course, the better—but the more specific the reference class, the smaller the sample on which the estimate is based, and the noisier the estimate.
Another problem with using a base rate as the prior is that base rates can change, and sometimes quickly.
Risk… may be distinguished from uncertainty, where the decider doesn’t even know the probabilities and all bets are off.
Theories of rational choice assume an angelic knower with perfect information and unlimited time and memory. For mortal deciders, uncertainty in the odds and payoffs, and the costs of obtaining and processing the information, have to be factored into the decision… a flesh-and-blood decider rarely has the luxury of optimizing but instead must satisfice.
The emotions triggered by possibility and certainty add an extra ingredient to chance-laden choices like insurance and gambling which cannot be explained by the shapes of the utility curves.
[Prospect theory] a loss is more than twice as painful as the equivalent gain is pleasurable.
For every thousand women who undergo annual ultrasound exams for ovarian cancer, 6 are correctly diagnosed with the disease, compared with 5 in a thousand unscreened women—and the number of deaths in the two groups is the same, 3. What about the costs? Out of the thousand who are screened, another 94 get terrifying false alarms, 31 of whom suffer unnecessary removal of their ovaries, of whom 5 have serious complications to boot. The number of false alarms and unnecessary surgeries among women who are not screened, of course, is zero. It doesn’t take a lot of math to show that the expected utility of ovarian cancer screening is negative. The same is true for men when it comes to screening for prostate cancer with the prostate-specific antigen test.
The signal detection challenge is whether to treat some indicator as a genuine signal from the world or as noise in our imperfect perception of it… This sounds obvious, but confusing response bias with accuracy by looking only at the signals or only at the noise is a surprisingly common fallacy.
There is no “correct” answer to… questions of moral valuation, but we can use signal detection thinking to ascertain whether our practices are consistent with our values.
If less-than-omniscient humans are to have a system of justice at all, they must face up to the grim necessity that some innocents will be punished. But being mindful of the tragic tradeoffs in distinguishing signals from noise can bring greater justice.
Most social scientists are so steeped in the ritual of significance testing, starting so early in their careers, that they have forgotten its actual logic.
The logic of Prisoner’s Dilemmas and Public Goods undermines anarchism and radical libertarianism, despite the eternal appeal of unfettered freedom. The logic makes it rational to say, “There ought to be a law against what I’m doing.”
Regression to the mean is purely a statistical phenomenon, a consequence of the fact that in bell-shaped distributions, the more extreme a value, the less likely it is to turn up. That implies that when a value is really extreme, any other variable that is paired with it (such as the child of an outsize couple) is unlikely to live up to its weirdness, or duplicate its winning streak, or get dealt the same lucky hand, or suffer from the same run of bad luck, or weather the same perfect storm again, and will backslide toward ordinariness.
People’s attention gets drawn to an event because it is unusual and they fail to anticipate that anything associated with that event will probably not be quite as unusual as that event was. Instead, they come up with fallacious causal explanations for what in fact is a statistical inevitability.
A tragic example is the illusion that criticism works better than praise, and punishment better than reward. We criticize students when they perform badly. But whatever bad luck cursed that performance is unlikely to be repeated in the next attempt, so they’re bound to improve, tricking us into thinking that punishment works. We praise them when they do well, but lightning doesn’t strike twice, so they’re unlikely to match that feat the next time, fooling us into thinking that praise is counterproductive.
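A simulation makes the illusion visible. In this toy model (mine, not Pinker’s), every performance is fixed skill plus fresh luck; students whose first score was extreme drift back toward their true skill on the next attempt with no intervention at all:

```python
import random

random.seed(1)

# Performance = fixed skill + fresh luck on every attempt. An extreme first
# score owes much of its extremity to luck, which the independent second
# attempt does not inherit.
skill = 100
first = [skill + random.gauss(0, 15) for _ in range(100_000)]
second = [skill + random.gauss(0, 15) for _ in range(100_000)]

# Second attempts of those whose first attempt was unusually bad (< 80):
after_bad = [s for f, s in zip(first, second) if f < 80]
avg_after_bad = sum(after_bad) / len(after_bad)
print(round(avg_after_bad, 1))  # ≈ 100: they “improve” whether or not we criticize
```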
The Winner’s Curse applies to any unusually successful human venture, and our failure to compensate for singular moments of good fortune may be one of the reasons that life so often brings disappointment.
No event has a single cause. Events are embedded in a network of causes that trigger, enable, inhibit, prevent, and supercharge one another in linked and branching pathways.
When A is correlated with B, it could mean that A causes B, B causes A, or some third factor, C, causes both A and B.
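The third possibility can be demonstrated in a few lines. In this hypothetical setup, a confounder C drives both A and B, which never influence each other, yet A and B come out strongly correlated:

```python
import random

random.seed(0)

# A hypothetical confounder C causes both A and B; there is no causal
# arrow between A and B themselves.
n = 50_000
C = [random.gauss(0, 1) for _ in range(n)]
A = [c + random.gauss(0, 1) for c in C]
B = [c + random.gauss(0, 1) for c in C]

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    sx = (sum((a - mx) ** 2 for a in x) / len(x)) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / len(y)) ** 0.5
    return cov / (sx * sy)

r = corr(A, B)
print(round(r, 2))  # ≈ 0.5, though neither A nor B causes the other
```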
[The human expert] is far too impressed with the eye-catching particulars and too quick to throw the base rates out the window. Indeed, some of the predictors that human experts rely on the most, such as face-to-face interviews, are revealed by regression analysis to be perfectly useless.
The mustering of rhetorical resources to drive an argument toward a favored conclusion is called motivated reasoning… In biased evaluations, we deploy our ingenuity to upvote the arguments that support our position and pick nits in the ones that refute it.
In study after study, liberals and conservatives accept or reject the same scientific conclusion depending on whether or not it supports their talking points, and they endorse or oppose the same policy depending on whether it was proposed by a Democrat or a Republican [this is called myside bias].
A team of social scientists concluded that the sides are less like literal tribes, which are held together by kinship, than religious sects, which are held together by faith in their moral superiority and contempt for opposing sects.
The problem in justifying motivated reasoning with Bayesian priors is the prior often reflects what the reasoner wants to be true rather than what he or she has grounds for believing is true.
The human mind is adapted to understanding remote spheres of existence through a mythology mindset… Submitting all of one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated.
Bona fide science communicators must shoulder some of the blame for failing to equip people with the true understanding that would make pseudoscience incredible on the face of it. Science is often presented in schools and museums as just another form of occult magic.
Perhaps surprisingly, when [rumors] circulate among people with a vested interest in their content, such as within workplaces, they are usually correct… The problem is that social and mass media allow rumors to spread through networks of people who have no stake in their truth… originators and spreaders suffer no reputational damage for being wrong. Without these veracity checks, social media rumors, unlike workplace rumors, are incorrect more often than correct.
Legislatures are largely populated by lawyers, whose professional goal is victory rather than truth.
Each of us has a motive to prefer our truth, but together we’re better off with the truth.
Our powers of reason are guided by our motives and limited by our points of view.
Distrust false dichotomies and single-cause explanations
We discount our future myopically, but it always arrives, minus the large rewards we sacrificed for the quick high. We try to recoup sunk costs, and so stay too long in bad investments, bad movies, and bad relationships. We assess danger by availability, and so avoid safe planes for dangerous cars, which we drive while texting. We misunderstand regression to the mean, and so pursue illusory explanations for successes and failures.
Intelligence is not the same thing as rationality, since being good at computing something is no guarantee that a person will try to compute the right things.
We should care about people’s virtue when considering them as friends, but not when considering the ideas they voice. Ideas are true or false, consistent or contradictory, conducive to human welfare or not, regardless of who thinks of them.