It has to be correct and useful, and correctness only matters for winning inasmuch as it entails usefulness. Having a lot of correct information about golf is no good if you want to be a great chef.
Having correct object-level information and having a correct epistemological process and belief system are two different things. An incorrect epistemological process is likely to reject information it doesn’t like.
Right, that’s a possible response: the sacrifice of epistemic rationality for instrumental rationality can’t be isolated. If your epistemic process leads to beneficial incorrect conclusions in one area, your epistemic process is broadly incorrect, and will necessarily lead to harmful incorrect conclusions elsewhere.
But people seem to be pretty good at compartmentalizing. Robert Aumann is an Orthodox Jew. (Which is the shoal that some early statements of the general-factor-of-correctness position broke on, IIRC.) And there are plenty of very instrumentally rational Christians in the world.
On the other hand, maybe people who’ve been exposed to all this epistemic talk won’t be so willing to compartmentalize—or at least to compartmentalize the sorts of things early LW used as examples of flaws in reasoning.
I’m not sure how to square “rejecting religion is the preschool entrance exam of rationality” with “people are pretty good at compartmentalizing”. Certainly there are parts of the Sequences that imply the insignificance of compartmentalization.
I personally recall, maybe seven years ago, having to break the news to someone that Aumann is an Orthodox Jew. This was a big deal at the time! We tend to forget how different the rationalist consensus is from the contents of the Sequences.
Every once in a while someone asks me or someone I know about what “postrationality” is, and they’re never happy with the answer—“isn’t that just rationality?” Sure, to an extent; but to the extent that it is, it’s because “postrationality” won. And to tie this into the discussion elsewhere in the thread: postrationality mostly came out of a now-defunct secret society.
Your line of reasoning re: Aumann feels akin to “X billionaire dropped out of high school / college, ergo you can drop out, too”. Sure, perhaps some can get by with shoddy beliefs, but why disadvantage yourself?
I wouldn’t use rejection of religion as a signal.
Point of clarification: are you claiming that rejecting religion provides no information about someone’s rationality, or that it provides insignificant information?
If postrationality really did win, I don’t know that it should have. I haven’t been convinced that knowingly holding false beliefs is instrumentally rational in any realistic world, as I outlined below.
Your line of reasoning re: Aumann feels akin to “X billionaire dropped out of high school / college, ergo you can drop out, too”. Sure, perhaps some can get by with shoddy beliefs, but why disadvantage yourself?
If people are pretty good at compartmentalization, it’s at least not immediately clear that there’s a disadvantage here.
It’s also not immediately clear that there’s a general factor of correctness, or, if there is, what the correctness distribution looks like.
It’s at least a defensible position that there is a general factor of correctness, but that it isn’t useful, because it’s just an artifact of most people being pretty dumb, and there’s no general factor within the set of people who aren’t just pretty dumb. I do think there’s a general factor of not being pretty dumb, but I’m not sure about a general factor of correctness beyond that.
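To make the “artifact of most people being pretty dumb” idea concrete, here is a minimal toy simulation (all parameters and numbers are invented for illustration, not taken from the thread): per-domain correctness scores are independent for most people, but one cluster is shifted down across every domain. The first principal component then looks like a strong general factor in the full sample and mostly disappears once that cluster is excluded.

```python
# Toy sketch: a "general factor of correctness" can appear purely because one
# cluster is low across all domains, even when domains are otherwise independent.
import numpy as np

rng = np.random.default_rng(0)
n, k, dumb_frac = 10_000, 8, 0.3          # made-up sample size, domain count, cluster share

dumb = rng.random(n) < dumb_frac           # membership in the low cluster
scores = rng.normal(0.0, 1.0, size=(n, k))  # independent per-domain correctness scores
scores[dumb] -= 2.0                        # the low cluster does worse in every domain

def first_factor_share(x):
    """Fraction of total variance captured by the first principal component."""
    eigvals = np.linalg.eigvalsh(np.corrcoef(x, rowvar=False))
    return eigvals[-1] / eigvals.sum()

print("full sample:     ", round(first_factor_share(scores), 2))         # large share: looks like one general factor
print("non-dumb subset: ", round(first_factor_share(scores[~dumb]), 2))  # near 1/k: little shared structure left
```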
It seems probable that “ignore the people who are obviously pretty dumb” is a novel and worthwhile message for some people, but not for others. I grew up in a bubble where everyone already knew to do that, so it’s not for me, but maybe there are people who draw utility from being informed that they don’t have to take seriously genuine believers in astrology or homeopathy or whatever.
Point of clarification: are you claiming that rejecting religion provides no information about someone’s rationality, or that it provides insignificant information?
In a purely statistical sense, rejecting religion almost certainly provides information about someone’s rationality, because things tend to provide information about other things. Technically, demographics provide information about someone’s rationality. But not information that’s useful for updating about specific people.
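As a rough illustration of “provides information, but not information that’s useful for updating about specific people”, here is a toy Bayes update with entirely made-up numbers: a weak group-level likelihood ratio barely moves the posterior, whereas direct evidence about the person’s actual reasoning moves it a lot.

```python
# Made-up numbers, purely illustrative: weak demographic evidence vs. direct evidence.
prior = 0.10           # assumed P(person reasons well) before any evidence
lr_demographic = 1.5   # assumed likelihood ratio from a group-level fact
lr_direct = 20.0       # assumed likelihood ratio from reading their actual arguments

def update(prior_p, likelihood_ratio):
    """Posterior probability after applying a likelihood ratio to the prior odds."""
    odds = prior_p / (1 - prior_p) * likelihood_ratio
    return odds / (1 + odds)

print("after group-level fact:", round(update(prior, lr_demographic), 3))  # ~0.14
print("after direct evidence: ", round(update(prior, lr_direct), 3))       # ~0.69
```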
Religious affiliation is a useful source of information about domain-specific rationality in areas that don’t lend themselves well to compartmentalization. There was a time when it made sense to discount the opinions of Mormon archaeologists about the New World, although now that they’ve been through some time of their religiously-motivated claims totally failing to pan out, that area probably lends itself to compartmentalization alright.
On the other hand, I wouldn’t discount the opinions of Mormon historical linguists about Proto-Uralic. But I would discount the opinions of astrologers about Proto-Uralic, unless they have other historical-linguistic work in areas that I can judge the quality of and that work seems reasonable.
If postrationality really did win, I don’t know that it should have. I haven’t been convinced that knowingly holding false beliefs is instrumentally rational in any realistic world, as I outlined below.
Postrationality isn’t about knowingly holding false beliefs. Insofar as postrationality has a consensus that can be distilled to one sentence, it’s “you can’t kick everything upstairs to the slow system, so you should train the fast system.” But that’s a simplification.
“you can’t kick everything upstairs to the slow system, so you should train the fast system.”
I know that postrationality can’t be distilled to a single sentence and I’m picking on it a bit unfairly, but “post”-rationality can’t differentiate itself from rationality on that. Eliezer wrote about System 1 and System 2 in 2006:
When people think of “emotion” and “rationality” as opposed, I suspect that they are really thinking of System 1 and System 2—fast perceptual judgments versus slow deliberative judgments. Deliberative judgments aren’t always true, and perceptual judgments aren’t always false; so it is very important to distinguish that dichotomy from “rationality”. Both systems can serve the goal of truth, or defeat it, according to how they are used.
And it’s not like this statement was ever controversial on LW.
You can’t get any more “core LW rationality” than the fricking Sequences. If someone thinks that rationality is about forcing everything into System 2 then, well, they should reread the fricking Sequences.
Minor: but I appreciate you using the word “fricking”, instead of the obvious alternative. For me, it feels like it gets the emphaticness across just as well, without the crudeness.
Having correct object-level information and having a correct epistemological process and belief system are two different things. An incorrect epistemological process is likely to reject information it doesn’t like.
And having correct and relevant object-level information is a third thing.
Right, that’s a possible response: the sacrifice of epistemic rationality for instrumental rationality can’t be isolated. If your epistemic process leads to beneficial incorrect conclusions in one area, your epistemic process is broadly incorrect, and will necessarily lead to harmful incorrect conclusions elsewhere.
But people seem to be pretty good at compartmentalizing. Robert Aumann is an Orthodox Jew. (Which is the shoal that some early statements of the general-factor-of-correctness position broke on, IIRC.) And there are plenty of very instrumentally rational Christians in the world.
Which is why you shouldn’t have written “necessarily”.