Face it: Human beings are pretty stupid sometimes. We fall for astrology, vote for morons, buy into conspiracy theories, and send our money to obvious scammers. Today, all this seems to be getting worse as "fake news" and "disinformation" take root and the country fragments along partisan lines—not just when it comes to political beliefs but even when it comes to basic facts.
Oddly enough, though, we’re also stunningly brilliant. Not only have the brightest among us produced remarkable feats of science and technology, but even ordinary people, working without modern tools, can use reason to improve their lives. The hunting-and-gathering that humans did long ago required extensive knowledge and careful thought about plants, seasons, animal behavior, and how humans might get what they want from the world around them.
That’s the tension animating Steven Pinker’s new book Rationality: What It Is, Why It Seems Scarce, Why It Matters. The tome celebrates reason, and it provides a digestible primer on the basics of logic, statistics, and other building blocks of rationality, illustrated with clever brain teasers and the occasional comic. Yet it also explains how the human mind can go wrong and asks why rationality seems to be in short supply these days.
This book does an excellent job explaining the basics of good thinking, though Pinker’s final musings about what ails the modern world are a bit unsatisfying.
Pinker’s thinking toolkit is laid out in a series of lucid chapters in the middle of the book. They explain the basics of formal logic: "If it rained, then the streets are wet" doesn’t imply "if the streets are wet, then it rained," for example, because something else might have made the streets wet. Pinker also goes through the core concepts of probability, explaining, for instance, why a 50 percent chance of rain on Saturday and a 50 percent chance of rain on Sunday don’t add up to a 100 percent chance of rain over the weekend.
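The weekend-rain arithmetic is easy to check. A minimal sketch, assuming (as the book's example needn't) that the two days are independent:

```python
# Pinker's point: probabilities of "or" events don't simply add,
# because the naive sum double-counts the case where it rains both days.
p_sat = 0.5  # chance of rain Saturday
p_sun = 0.5  # chance of rain Sunday

# Chance of rain on at least one day = 1 - chance it stays dry both days
p_weekend = 1 - (1 - p_sat) * (1 - p_sun)

print(p_weekend)  # 0.75, not 1.0
```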
He further introduces readers to "signal detection theory," the study of all the weird stuff that happens when we’re forced to grapple with false positives and false negatives. For instance, if a test for a disease is 95 percent accurate, but only 1 percent of people have the disease, a positive test will still suggest you probably don’t have the disease: In every 100 people tested, about one will correctly receive a positive result, but about five will receive false positives. Unless, that is, there are other reasons to think your risk might be higher than a random person’s—in which case we might use "Bayesian" reasoning to integrate the results of the test into everything else we know. Then it’s on to game theory, such as the famous Prisoner’s Dilemma, and social science’s never-ending quest to distinguish causation from mere correlation. In the latter, I was especially impressed by Pinker’s explanation of how regression analysis works, which manages to be simple and easy to follow while capturing some of the more technical details.
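The disease-test numbers follow from Bayes' rule. A rough sketch, assuming that "95 percent accurate" means both the sensitivity and the specificity of the test are 0.95 (the review doesn't spell this out):

```python
# Base-rate calculation behind the review's disease-test example.
prevalence = 0.01   # 1 percent of people have the disease
sensitivity = 0.95  # P(positive | disease)
specificity = 0.95  # P(negative | no disease)

# Overall chance of testing positive: true positives plus false positives
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' rule: P(disease | positive)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 3))  # 0.161
```

Even after a positive result, the chance of actually having the disease is only about 16 percent, matching the review's ratio of roughly one true positive to five false positives per 100 people tested.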
All of these tools are incredibly important for those who evaluate scientific studies, try to convince people with logical arguments and sound evidence, or even just bicker with family over politics. They introduce and enforce a rigor to one’s thinking by highlighting common errors and formalizing the best ways to interpret evidence.
But applying them to actual arguments gets messy in a hurry and is rarely sufficient to resolve a genuine dispute, which is why even experts so often disagree with each other. Pinker discusses numerous reasons for this throughout the book. The categories of things we talk about are "fuzzy" around the edges, for example, making them hard to slot into logical formulas with simple true-or-false binaries. And logical truths are not the same thing as empirical truths anyway—you can’t reason from premises to conclusions without first establishing sound premises by observing the real world. When doing that, we might disagree about how much weight to place on different kinds of evidence. And then, even if we agree on the facts, we might disagree on how much value to place on different risks and benefits—the inescapable moral dimension to all this.
To the extent we do invoke the principles of rationality in everyday debates, we often rely on "informal" logical fallacies—which, as typically used in practice, are far more subjective (and in my experience obnoxious) than Pinker lets on in his discussion of the issue. Have you drawn a "false equivalence," or was that a sound comparison? Have I relied on a "slippery slope fallacy," or could we be talking about a slope that’s actually, you know, slippery?
At any rate, when Pinker gets around to his penultimate chapter, in which he tries to explain "why humanity appears to be losing its mind," he sheepishly admits that "the inventory of logical and statistical fallacies explained in the preceding chapters" probably isn’t what’s driving the problem. "Nothing from the cognitive psychology lab," he adds, "could have predicted QAnon."
In this chapter, though, it’s not entirely clear whether Pinker is trying to explain why humans are irrational in general, or why things seem to have gotten worse. The aforementioned "losing its mind" and "QAnon" lines would suggest the latter, as would numerous other comments Pinker makes: There’s a "pandemic of poppycock," for example, exemplified in part by fake news and COVID conspiracy theories, part of what "some are calling an ‘epistemological crisis.’"
But then Pinker turns around and downplays the importance of social media, writing that while new technology may be accelerating the spread of bad thinking, such thinking has been with us forever, and that by some measures, conspiracy theories haven’t become more popular in recent decades. This leads the reader to scratch his head: If we’re asking why the world is "losing its mind," don’t we need a variable that’s changed recently? And if we don’t think things have actually gotten worse, what’s with all the insinuations to the contrary?
When it comes to other theories that could explain a change, Pinker mostly just lists them without exploring them in detail: the fractionalization of the media, the rise of gerrymandering, the growing self-segregation of urban liberals, the decline of civil society, etc. Beyond that, he largely sticks to common explanations of why humans are irrational in general, such as motivated reasoning and tribalistic "myside" bias. Which is fine but hardly revelatory.
This chapter could have focused more on the longer-term decline of Americans’ trust in each other and their institutions as well as the very real failures of the country’s expert class, including the serial mishandling of COVID-19. (Pinker’s discussion here dwells mainly on academia and the press in particular.) A productive argument requires at least some underlying agreement on the key premises, and such agreement fades when we don’t—or can’t—trust the folks who are supposed to nail down the major facts and run the country in a way that’s in line with our values. Some of Pinker’s proposed solutions, including more scientists in Congress and a media more willing to "fact-check," are likely to fail in this environment.
We should all aspire to think clearly, and most of Rationality is admirably dedicated to showing us how. That makes it worth reading, even if the book stumbles in diagnosing the modern world’s problems.
Rationality: What It Is, Why It Seems Scarce, Why It Matters
by Steven Pinker
Viking, 432 pages, $32
Robert VerBruggen is a fellow at the Manhattan Institute.
Published under: Book reviews