I can see that it would be useful to have a fast filter for rationality, but how feasible is it?
There are some opinions which are irrational (for example, there don’t seem to be any solid arguments for the idea that homosexuality is bad and, if it can’t be eliminated, should at least be kept out of public view), but spotting an irrational opinion isn’t the same thing as having a positive test for rationality.
There comes a point when there’s no substitute for actual knowledge, and in this case, it means looking at people’s thinking rather than their opinions.
I suggest asking people what they’ve changed their mind about, and why. The opinion change could be tribal, too, but at least it’s not a completely static view of the other person’s mind.
One other test—does the person judge the things they like by the most attractive examples, and the things they dislike by the least attractive examples? This test is faster than asking questions.
It seems to me that we could summarize Eliezer’s post, its conclusions, the subsequent discussion, and much previous LW material thus: “there are no reliable epistemic shortcuts”.
I was wondering whether there was a top-level post explicitly about the need for tools to check the territory now and then, because your map is necessarily incomplete.
The messy thing is that you need tools and habits for noticing when reality is tugging on your sleeve (or bashing you about the head) and for trying to find out what important thing you’ve missed; but once you formalize that procedure, you’re back inside a map again.
I disagree. Some surface features do correlate exceedingly well with epistemic rationality; it’s just harder to rule out false positives than false negatives.