I thought that your survey was going to be the same sort of thing, so I didn’t develop mine further. Now that I see your survey is after something different, I’ll probably try mine again.
Oh. Damn. I should have communicated better back when I started toward this survey.
Your survey looked like “expert elicitation”—seeing what LW members think about controversial issues we actually care about, and especially what those most likely to have informed opinions think, as data on what might actually be true about those issues. I’d love to know the results but don’t plan to do it myself.
In this survey, I'm after seeing what kinds of rationality do or don't bear practical fruit in individuals' lives. In the long run I'd also like to investigate (though not with today's survey) the degree to which there is or isn't a single trait "rationality" that predicts accurate belief-formation across domains, how to measure such a trait, and what helps build that trait.