Define proto-rationalists as those respondents to the Less Wrong survey who indicate they have been in the community for less than six months and have zero karma (usually indicative of never having posted a comment). And define experienced rationalists as those respondents to the Less Wrong survey who indicate they have been in the community for over two years and have >1000 karma (usually indicative of having written many well-received posts).
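(In code, those two groups are just two filters on the survey table. A minimal sketch; the file name and the column names are made up stand-ins for whatever the actual survey CSV uses, and time in the community is assumed to be recorded in years:)

```python
import pandas as pd

# Hypothetical file and column names; the real survey data almost certainly
# uses different ones.
survey = pd.read_csv("lw_survey.csv")

# "Proto": less than six months in the community and zero karma.
proto = survey[(survey["YearsInCommunity"] < 0.5) & (survey["Karma"] == 0)]

# "Experienced": more than two years in the community and >1000 karma.
experienced = survey[(survey["YearsInCommunity"] > 2) & (survey["Karma"] > 1000)]

print(len(proto), len(experienced))
```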
I don’t like this appropriation of the term “rational” (even with the “-ist” suffix), and in fact I find it somewhat offensive.
[ Warning: Trolling ahead ]
But since words are arbitrary placeholders, let’s play a little game and replace the word “rationalist” with another randomly generated string, such as “cultist” (which you might possibly find offensive, but remember, it’s just a placeholder).
So what does your data say?
Proto-cultists give a higher average probability of cryonics success than committed cultists.
But this isn’t necessarily very informative, because averaging probabilities from different estimators doesn’t really tell us much (consider scenario A, where half of the respondents say p = 1 and half say p = 0, and scenario B, where all of the respondents say p = 0.5: the arithmetic mean is the same, but the scenarios are completely different). The harmonic mean can be a better way of averaging probabilities.
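(To make the toy example concrete, a minimal sketch; the two lists are just the scenarios above, not survey data:)

```python
from statistics import mean, pstdev

# Scenario A: half the respondents say p = 1, half say p = 0.
# Scenario B: every respondent says p = 0.5.
scenario_a = [1.0, 1.0, 0.0, 0.0]
scenario_b = [0.5, 0.5, 0.5, 0.5]

def harmonic(ps):
    # Conventionally 0 when any estimate is 0, since the reciprocals blow up.
    return 0.0 if 0 in ps else len(ps) / sum(1 / p for p in ps)

# The arithmetic means coincide even though the scenarios are completely
# different; the spread and the harmonic mean immediately tell them apart.
print(mean(scenario_a), mean(scenario_b))          # 0.5  0.5
print(pstdev(scenario_a), pstdev(scenario_b))      # 0.5  0.0
print(harmonic(scenario_a), harmonic(scenario_b))  # 0.0  0.5
```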
But anyway, let’s assume that the distribution of the responses is well-behaved enough that a randomly sampled proto-cultist is more likely to assign a higher probability of cryonics success than a randomly sampled committed cultist (you can test this hypothesis on the data).
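(A sketch of how that hypothesis could be tested, reusing the hypothetical proto / experienced frames from the first sketch and another made-up column name for the probability answer:)

```python
from scipy.stats import mannwhitneyu

# "CryonicsProbability" is a hypothetical column name for the "probability
# that cryonics works" answer.
x = proto["CryonicsProbability"].dropna()
y = experienced["CryonicsProbability"].dropna()

# Direct estimate of P(random proto answer > random experienced answer),
# counting ties as one half.
p_greater = sum((xi > yi) + 0.5 * (xi == yi) for xi in x for yi in y) / (len(x) * len(y))

# One-sided rank test of the same claim (proto answers stochastically greater).
stat, pvalue = mannwhitneyu(x, y, alternative="greater")

print(p_greater, pvalue)
```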
On the other hand, proto-cultists are much less likely to be signed up for cryonics than committed cultists (in fact, none of the proto-cultists are signed up).
What is the correlation between belief in cryonics success and being signed up for cryonics? I don’t know, since it is reported neither here nor in the survey results post (maybe it was computed but found to be not significant, since IIUC there was a significance cutoff for correlations in the survey results post).
Do committed cultists who sign up for cryonics do it because they assign a high probability to its success, or despite assigning it a low probability? I have no way of knowing.
Or, actually, I could look at the data, but I won’t, since you wrote the post trying to make a point from the data, and hence the burden of providing a meaningful statistical analysis was on you.
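(For concreteness, the kind of statistic I have in mind is something like a point-biserial correlation between the signup indicator and the probability estimate; a minimal sketch, again reusing the hypothetical frames and column names from above, and assuming signup is coded as 0/1:)

```python
from scipy.stats import pointbiserialr

# Hypothetical column names; "SignedUp" is assumed to be coded 0/1.
sub = experienced[["SignedUp", "CryonicsProbability"]].dropna()

r, pvalue = pointbiserialr(sub["SignedUp"], sub["CryonicsProbability"])
print(r, pvalue)
```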
Let’s try to interpret this finding:
Cryonics is a weird belief. Proto-cultists haven’t spent much time researching it and thinking about it; given their typical background (mostly computer science students), they find it somewhat plausible, but they don’t really trust their estimate very much.
Committed cultists, on the other hand, have more polarized beliefs. Being in the cult might have actually stripped them of their instrumental rationality (or selected for irrational people), so they act against their explicit beliefs. Or they respond to social pressures, since cryonics is high status in the cult and has been explicitly endorsed by one of the cult elders, author of the Sacred Scrip-...Sequences. Or both.
Oops.
[ End of trolling ]
The bottom line of my deliberately uncharitable post is:
Don’t use words lightly. Words aren’t really just syntactic labels; they convey implicit meaning. Using words with an implicit positive meaning (“experienced rationalist”) to refer to the core members of a community naturally suggests a charitable interpretation (that they are smart). Using words with an implicit negative meaning (“committed cultist”) suggests an uncharitable interpretation (that they are brainwashed, prone to groupthink, too preoccupied with costly status signalling that has no value outside the group).
If you are trying to make a point from data, provide relevant statistics.
Yeah. Suppose we were talking about a new-age-ish cult whose founder has arranged to have himself flown to Tibet for a sky burial when he dies. They could very well have the exact same statistics on their online forum.