I’m not sure how to square “rejecting religion is the preschool entrance exam of rationality” with “people are pretty good at compartmentalizing”. Certainly there are parts of the Sequences that imply the insignificance of compartmentalization.
I personally recall, maybe seven years ago, having to break the news to someone that Aumann is an Orthodox Jew. This was a big deal at the time! We tend to forget how different the rationalist consensus is from the contents of the Sequences.
Every once in a while someone asks me or someone I know about what “postrationality” is, and they’re never happy with the answer—“isn’t that just rationality?” Sure, to an extent; but to the extent that it is, it’s because “postrationality” won. And to tie this into the discussion elsewhere in the thread: postrationality mostly came out of a now-defunct secret society.
Your line of reasoning re: Aumann feels akin to “X billionaire dropped out of high school / college, ergo you can drop out, too”. Sure, perhaps some can get by with shoddy beliefs, but why disadvantage yourself?
I wouldn’t use rejection of religion as a signal
Point of clarification: are you claiming that rejecting religion provides no information about someone’s rationality, or that it provides insignificant information?
If postrationality really did win, I don’t know that it should have. I haven’t been convinced that knowingly holding false beliefs is instrumentally rational in any realistic world, as I outlined below.
Your line of reasoning re: Aumann feels akin to “X billionaire dropped out of high school / college, ergo you can drop out, too”. Sure, perhaps some can get by with shoddy beliefs, but why disadvantage yourself?
If people are pretty good at compartmentalization, it’s at least not immediately clear that there’s a disadvantage here.
It’s also not immediately clear that there’s a general factor of correctness, or, if there is, what the correctness distribution looks like.
It’s at least a defensible position that there is a general factor of correctness, but that it isn’t useful, because it’s just an artifact of most people being pretty dumb, and there’s no general factor within the set of people who aren’t just pretty dumb. I do think there’s a general factor of not being pretty dumb, but I’m not sure about a general factor of correctness beyond that.
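A toy simulation can illustrate the “artifact” possibility. This is a minimal sketch, not a claim about real data: the 70% split, the effect size, and the four domains are all invented. The point is just that a single “pretty dumb” factor can produce cross-domain correlations in the pooled population that vanish once you condition on not being in that group.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Toy model: a "pretty dumb" majority is depressed across every domain;
# within everyone else, domain-specific skill is independent.
dumb = rng.random(n) < 0.7                 # invented 70% split
domains = rng.normal(size=(n, 4))          # independent skill in 4 domains
scores = domains - 2.0 * dumb[:, None]     # one factor drags all domains down

# Pooled population: every pair of domains correlates positively,
# which looks like a general factor of correctness.
print(np.corrcoef(scores.T).round(2))

# Conditioned on not being in the dumb group: the correlations vanish.
print(np.corrcoef(scores[~dumb].T).round(2))
```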
It seems probable that “ignore the people who are obviously pretty dumb” is a novel and worthwhile message for some people, but not for others. I grew up in a bubble where everyone already knew to do that, so it’s not for me, but maybe there are people who draw utility from being informed that they don’t have to take seriously genuine believers in astrology or homeopathy or whatever.
Point of clarification: are you claiming that rejecting religion provides no information about someone’s rationality, or that it provides insignificant information?
In a purely statistical sense, rejecting religion almost certainly provides information about someone’s rationality, because things tend to provide information about other things. Technically, demographics provide information about someone’s rationality. But not information that’s useful for updating about specific people.
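Bayes’ rule in odds form makes that distinction concrete. Here’s a minimal sketch, with the prior and both likelihood ratios invented purely for illustration: a statistically real but weak signal barely moves the posterior about a specific person, while a strong signal actually does.

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

# Invented prior that a given person clears some bar of rationality.
prior = 0.10

# A weak signal is real information but barely moves the estimate...
print(posterior(prior, 1.2))   # ~0.118

# ...while a strong, domain-specific signal actually shifts it.
print(posterior(prior, 10.0))  # ~0.526
```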
Religious affiliation is a useful source of information about domain-specific rationality in areas that don’t lend themselves well to compartmentalization. There was a time when it made sense to discount the opinions of Mormon archaeologists about the New World, although now that they’ve had some time of their religiously motivated claims totally failing to pan out, it probably lends itself to compartmentalization alright.
On the other hand, I wouldn’t discount the opinions of Mormon historical linguists about Proto-Uralic. But I would discount the opinions of astrologers about Proto-Uralic, unless they have other historical-linguistic work whose quality I can judge, and that work seems reasonable.
If postrationality really did win, I don’t know that it should have. I haven’t been convinced that knowingly holding false beliefs is instrumentally rational in any realistic world, as I outlined below.
Postrationality isn’t about knowingly holding false beliefs. Insofar as postrationality has a consensus that can be distilled to one sentence, it’s “you can’t kick everything upstairs to the slow system, so you should train the fast system.” But that’s a simplification.
“you can’t kick everything upstairs to the slow system, so you should train the fast system.”
I know that postrationality can’t be distilled to a single sentence and I’m picking on it a bit unfairly, but “post”-rationality can’t differentiate itself from rationality on that. Eliezer wrote about System 1 and System 2 in 2006:
When people think of “emotion” and “rationality” as opposed, I suspect that they are really thinking of System 1 and System 2—fast perceptual judgments versus slow deliberative judgments. Deliberative judgments aren’t always true, and perceptual judgments aren’t always false; so it is very important to distinguish that dichotomy from “rationality”. Both systems can serve the goal of truth, or defeat it, according to how they are used.
And it’s not like this statement was ever controversial on LW.
You can’t get any more “core LW rationality” than the fricking Sequences. If someone thinks that rationality is about forcing everything into System 2 then, well, they should reread the fricking Sequences.
Minor, but I appreciate you using the word “fricking” instead of the obvious alternative. For me, it gets the emphasis across just as well, without the crudeness.