This may be connected to a more general problem: one is trying to place people on a continuum of rationality by referencing a single bit. Whether that bit is theism or AGW, it won't be very informative on its own. More bits of data are better.
Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems.
All of these are small problems when they come up only in a narrow context. How often does someone who privileges the hypothesis only do so in a single context?
I think this is a good argument for collecting more points that Less Wrongers can use in real life to gauge someone’s rationality. I like to bring up Newcomb’s problem and ask for reasons for their choice, and if they’re two-boxers I try to persuade them to one-box. One intelligent friend was quickly persuaded to one-box when I outlined the expected results, whereas another person eventually said “I think you should just go with your instincts”. I felt that gave me a lot of information about their thinking, but more points to bring up would be good.
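The expected-value comparison is roughly this (the $1,000,000/$1,000 payoffs and the 99% predictor accuracy are just the usual illustrative assumptions, nothing essential hangs on them):

```python
# Rough expected-value comparison for Newcomb's problem.
# The $1,000,000 / $1,000 payoffs and the 99% predictor accuracy
# are illustrative assumptions, not fixed features of the problem.

ACCURACY = 0.99          # probability the predictor guesses your choice correctly
BIG_BOX = 1_000_000      # opaque box, filled only if you were predicted to one-box
SMALL_BOX = 1_000        # transparent box, always contains $1,000

# If you one-box: you get the big box's contents, which is full
# with probability ACCURACY (the predictor foresaw the one-boxing).
ev_one_box = ACCURACY * BIG_BOX

# If you two-box: you always get the small box, plus the big box's
# contents only if the predictor wrongly expected you to one-box.
ev_two_box = SMALL_BOX + (1 - ACCURACY) * BIG_BOX

print(f"One-box expected value: ${ev_one_box:,.0f}")   # $990,000
print(f"Two-box expected value: ${ev_two_box:,.0f}")   # $11,000
```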
It would be especially good to find contrarian beliefs to ask about for different groups so as to more easily spot people who can think outside their group norm.
Mmmm. Trying to pick out rationality litmus tests seems like the kind of project EY was talking about in “The Correct Contrarian Cluster” and “Undiscriminating Skepticism”.
I don’t know how feasible this is, ultimately. The closest test I can think of is probably the Cognitive Reflection Test. (Which has the advantage of being a trio of little arithmetic brainteasers rather than anything that’ll trigger people’s politics detectors.)
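For a sense of the flavour, the best-known CRT item is the bat-and-ball question, where the intuitive answer (10 cents) differs from the correct one (5 cents). A quick sketch of the arithmetic:

```python
# The classic CRT bat-and-ball item:
# "A bat and a ball cost $1.10 in total. The bat costs $1.00 more
#  than the ball. How much does the ball cost?"
# Intuitive answer: 10 cents. Correct answer: 5 cents.

total = 1.10
difference = 1.00

# Solve: ball + (ball + difference) = total  =>  ball = (total - difference) / 2
ball = (total - difference) / 2
bat = ball + difference

print(f"Ball: ${ball:.2f}, Bat: ${bat:.2f}")  # Ball: $0.05, Bat: $1.05
assert abs((bat + ball) - total) < 1e-9
assert abs((bat - ball) - difference) < 1e-9
```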