Steven Pinker’s new book on rationality came out today. I figured someone on LessWrong would write a review for it, so I might as well be the one to do it.
Unlike Pinker’s prior books, such as The Blank Slate and The Better Angels of Our Nature, this book lacks a straightforward empirical thesis. Instead, he mirrors the sequences by laying out the science of rationality and then trying to convince the reader that rationality is important, both personally and socially.
Unfortunately, long-time readers of LessWrong are unlikely to learn much from Pinker’s new book; his content is too similar to the content in the sequences. An upside is that Pinker’s treatment is more concise, and his style more closely resembles mainstream thought. Consequently, I’m tempted to recommend this book to people who might otherwise be turned away by Rationality: From A to Z.
He starts by asking a simple question: how come it seems like everyone is so irrational? Pointing to religion, conspiracy theorists, ghost-believers, anti-vaxxers, alternative medicine adherents, and postmodernists, Pinker makes a good case that there’s a lot of irrationality in the world. On the other hand, he continues, shouldn’t humans have evolved to be more rational? How could persistent, widespread irrationality survive in humans, if our survival hinges on our ability to reason?
Pinker provides a simple answer: humans are very rational animals, just not in every domain. In those domains on which our survival depended, such as finding and eating food, humans are much less clueless than you might have been led to believe. Pinker provides the example of the San people of the Kalahari Desert in southern Africa, who, despite their mythological beliefs, are stunningly successful at hunting prey. He cites Louis Liebenberg, who documented how the San people use Bayesian reasoning to hunt, applying it to footprints and animal droppings in order to build an accurate picture of their natural world: a dry desert on which they have subsisted for over a hundred thousand years.
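To make the Bayesian flavor of this kind of tracking concrete, here is a minimal sketch with made-up numbers; the species, priors, and likelihoods below are purely hypothetical and are not drawn from Liebenberg’s data. The idea is simply that a tracker starts with prior beliefs about which animal left a print and updates those beliefs on each new clue.

```python
# Minimal sketch of Bayesian updating, in the spirit of the tracking example.
# All numbers are made up for illustration; they are not from Liebenberg's work.

priors = {"kudu": 0.5, "gemsbok": 0.3, "porcupine": 0.2}  # hypothetical prior beliefs

# Hypothetical likelihoods: P(track shows deep, splayed hooves | animal)
likelihoods = {"kudu": 0.7, "gemsbok": 0.4, "porcupine": 0.01}

# Bayes' rule: posterior is proportional to prior times likelihood, then normalize.
unnormalized = {animal: priors[animal] * likelihoods[animal] for animal in priors}
total = sum(unnormalized.values())
posterior = {animal: weight / total for animal, weight in unnormalized.items()}

print(posterior)  # belief shifts toward "kudu", the animal that best explains the track
```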
It’s not hard to see this dual phenomenon of rationality and irrationality reflected in the modern day: many young Earth creationists believe that the moon’s craters were literally planted by God to give the appearance of old age, yet these same people rarely bring such lax standards of evidence to matters in their ordinary lives.
Yet, as Pinker observes, sometimes even when our lives and money do depend on our rationality, we still fail. For instance, most people consistently fail to save for retirement. Why? The answer here is simple: life today is very different from the lives of our ancestors. What might have been a threat 10,000 years ago—such as a tiger in the bushes—is no longer a major threat; conversely, some threats—like car crashes—are entirely new, and thus, the human brain is ill-equipped to evaluate them rationally.
Pinker’s book proceeds by presenting a textbook view of the science of rationality, including cognitive biases, formal logic, Bayesian inference, correlation and causation, statistical decision theory, and game theory. There isn’t much to complain about here: Pinker is a great writer, and presents these ideas with impressive clarity. However, the content in these chapters rarely departs from the mainstream exposition of these subjects. Given that I already knew most of the details, I was left a tad bored.
To prevent you from being bored as well, I won’t summarize the book’s main contents. (You can go and read his book if you want to know all the details.) Instead, I’ll draw attention to some parts I liked, and some parts I didn’t like as much.
What I liked
First off, Pinker cited the rationalist community as an example of a group of good reasoners,
A heartening exception to the disdain for reason in so much of our online discourse is the rise of a “Rationality Community,” whose members strive to be “less wrong” by compensating for their cognitive biases and embracing standards of critical thinking and epistemic humility.
I was pleasantly surprised to see such a positive review of this community, given his previous assessment of AI risk arguments from LessWrong, which he bizarrely conflated with unfounded fears of the “Robopocalypse”. Perhaps this statement, along with his semi-recent interaction with Stuart Russell, is evidence that he is changing his mind.
In this book, Pinker demonstrates that he can be a good summarizer of other people. While this book contains almost no novel research, he skillfully ties together a ton of experiments in behavioral economics, theoretical models of rationality, and even signal detection theory. At the same time, it seemed like the right length for a book trying to explain the basics of rationality, striking a nice balance between detail and brevity. Gone are the days of needing to send someone a 1000+ page book to get them started on the whole “rationality” thing.
At no point did I feel that he was simply pushing an agenda. To be sure, at points, he drew from politics, religion, and personal experience to illustrate some aspect of irrationality, but these examples were secondary, as a way of making the ideas concrete; they were not his main focus.
Compared to his previous work, this book isn’t likely to get him into hot water. Nearly everything, besides a few of his examples of irrationality, is part of the standard consensus in cognitive science and statistics. The most controversial chapter is probably chapter 10, in which he explains myside bias, building on the recent work of Keith E. Stanovich. That said, given that his examples are broad and varied—criticizing dogmas on both the left and right—it’s not hard to see how some people might feel “the book isn’t for me.”
His book was not a mere recitation of biases or fallacies either: it emphasized what I view as a core principle of rationality, namely actually taking your beliefs seriously and acting on them. He refers to taking your beliefs seriously as “the reality mindset” and contrasts it with the “mythology mindset.” Many of us on LessWrong will know that this psychological dichotomy sometimes goes by other names, such as “near and far view” and “non-overlapping magisteria”. Pinker explains the essence of the mythology mindset,
[It] is the world beyond immediate experience: the distant past, the unknowable future, faraway peoples and places, remote corridors of power, the microscopic, the cosmic, the counterfactual, the metaphysical. People may entertain notions about what happens in these zones, but they have no way of finding out, and anyway it makes no discernible difference to their lives. Beliefs in these zones are narratives, which may be entertaining or inspiring or morally edifying. Whether they are literally “true” or “false” is the wrong question. The function of these beliefs is to construct a social reality that binds the tribe or sect and gives it a moral purpose.
Pinker acknowledges that rationality is not merely a matter of divorcing yourself from mythology. Of course, doing so is necessary if we want to seek truth, but we must also keep in mind the social function that the mythology mindset plays, and the psychological needs it satisfies. He does not commit the classic blunder of assuming that rationality dictates one’s goals, and that all truly rational agents will therefore pursue the same set of actions. Instead, he embraces the view that rationality is a means of achieving whatever goals one happens to have, which is roughly what we would call here the orthogonality thesis.
The book might be called “a version of the sequences that cites more primary sources.” Pinker quotes Hume heavily, which gives you the sense—accurately, I might add—that much of what we call “rationality” was invented centuries ago, not recently. I particularly like the sentiment he expresses towards the Enlightenment in this passage,
Bertrand Russell famously said, “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” The key to understanding rampant irrationality is to recognize that Russell’s statement is not a truism but a revolutionary manifesto. For most of human history and prehistory, there were no grounds for supposing that propositions about remote worlds were true. But beliefs about them could be empowering or inspirational, and that made them desirable enough.
Russell’s maxim is the luxury of a technologically advanced society with science, history, journalism, and their infrastructure of truth-seeking, including archival records, digital datasets, high-tech instruments, and communities of editing, fact-checking, and peer review. We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset.
What I didn’t like as much
Those expecting everything from the sequences to be represented will be let down. For example, he says little about quantum mechanics other than that “most physicists believe there is irreducible randomness in the subatomic realm of quantum mechanics”. Compare that to the sequence on quantum mechanics here, which forcefully argues for the deterministic many-worlds interpretation.
His section on fallacies is, in my opinion, an example of outdated rationality. He presents fallacies as common errors of reasoning that we can point out in other people. As an example, he analyzes a passage from Andrew Yang’s presidential campaign, which claimed, “The smartest people in the world now predict that ⅓ of Americans will lose their job to automation in 12 years.” Pinker labels such reasoning a “mild example of the argument from authority”.
The problem with this analysis is that, from a Bayesian point of view, Yang’s statement is perfectly valid evidence for his thesis. In another section Pinker refers to the sunk cost fallacy, but without mentioning how that fallacy might be perfectly rational too. More generally, many common formal and informal fallacies can be understood as providing weak but genuine Bayesian evidence, a thesis explained in Kaj Sotala’s essay, “Fallacies as weak Bayesian evidence”. Pinker missed some opportunities to make this point clear.
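To spell out the Bayesian point with a toy calculation (all probabilities below are hypothetical, chosen only to illustrate the direction of the update): expert testimony raises the probability of a claim whenever experts are more likely to assert it when it is true than when it is false, which makes the “argument from authority” weak evidence rather than no evidence.

```python
# Toy illustration of "fallacies as weak Bayesian evidence".
# All probabilities are hypothetical; they only show the direction of the update.

prior = 0.2  # prior probability that the automation claim is true

# How likely are prominent experts to assert the claim...
p_assert_given_true = 0.6   # ...if it is actually true
p_assert_given_false = 0.3  # ...if it is actually false

# Bayes' rule: P(claim is true | experts assert it)
evidence = prior * p_assert_given_true + (1 - prior) * p_assert_given_false
posterior = prior * p_assert_given_true / evidence

print(round(posterior, 3))  # 0.333: more plausible than before, but far from certain

# The update is driven by the likelihood ratio (0.6 / 0.3 = 2). If experts were no more
# likely to assert the claim when it is true than when it is false, the ratio would be 1
# and the testimony would carry no evidential weight at all.
```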
For a book whose title is “Rationality: What It Is, Why It Seems Scarce, Why It Matters”, Pinker spends relatively little time on the last part: why it matters. Only his last chapter was about why rationality matters, and in my opinion, it was the weakest part of the book.
In regard to personal life, Pinker documents a dizzying array of errors afflicting us on a daily basis, from hyperbolic discounting and taboo trade-offs to the availability heuristic. Yet rather than show how an understanding of these errors could help us succeed at our goals, he retreats into the abstract, and speaks about how rationality could help us all build a better democracy—an ironic defense, given what he had just told us about how the mythology mindset can interact with our politics.
Just about the only thing Pinker had to say about how rationality could help us personally came from a single study by Wändi Bruine de Bruin and her colleagues, who found that, after holding a few factors constant, such as intelligence and socioeconomic status, competence in reasoning and decision making was correlated with positive life outcomes. Pinker concedes that “all this still falls short of proving causation” but concludes that this research “entitles us to vest some credence in the causal conclusion that competence in reasoning can protect a person from the misfortunes of life.”
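For readers unfamiliar with what “holding factors constant” means in a study like this, here is a minimal sketch of the standard approach on simulated data; it does not reproduce Bruine de Bruin’s dataset or model, only the general technique of regressing an outcome on the variable of interest alongside control variables.

```python
# Sketch of "holding factors constant" via multiple regression, on simulated data.
# This is purely illustrative; it does not reproduce the Bruine de Bruin study.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
intelligence = rng.normal(size=n)
ses = rng.normal(size=n)  # socioeconomic status
decision_skill = 0.5 * intelligence + 0.3 * ses + rng.normal(size=n)
life_outcomes = 0.4 * decision_skill + 0.6 * intelligence + 0.5 * ses + rng.normal(size=n)

# Regress outcomes on decision-making skill plus the controls (and an intercept).
X = np.column_stack([np.ones(n), decision_skill, intelligence, ses])
coefs, *_ = np.linalg.lstsq(X, life_outcomes, rcond=None)

# coefs[1] estimates the association between decision-making competence and outcomes
# after adjusting for intelligence and SES; here it should land near the simulated 0.4.
print(coefs)
```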
Overall, I was not really persuaded that rationality self-help matters, but Pinker did show that rationality as a general cognitive trait is powerful; after all, humans are doing quite well compared to other animals. Whether this makes you less interested in rationality depends on why you’re interested. If you are interested in truth, especially in those abstract mythology-adjacent realms, then rationality is the subject for you. For everyone else, I’m not yet convinced.