Strength of membership in the LW community was related to responses on most of the questions. There were 3 questions related to strength of membership: karma, sequence reading, and time in the community. Since they were all correlated with each other and showed similar patterns, I standardized them and averaged them together into a single measure. Then I checked whether this measure of strength of membership in the LW community was related to answers on each of the other questions, for the 822 respondents (described in this comment) who answered at least one of the probability questions and used percentages rather than decimals (since I didn’t want to take the time to recode the answers that were given as decimals).
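For readers who want the mechanics: a minimal sketch of how such a composite can be built, standardizing each item to a z-score and then averaging. The column names (karma, sequences_read, years_in_community) and the use of pandas are my own assumptions for illustration, not a description of the actual analysis.

```python
import pandas as pd

# Hypothetical column names for the three strength-of-membership items;
# the real survey export may label them differently.
STRENGTH_ITEMS = ["karma", "sequences_read", "years_in_community"]

def membership_strength(df: pd.DataFrame) -> pd.Series:
    """Z-score each item across respondents, then average the three
    standardized items into one composite score per respondent."""
    items = df[STRENGTH_ITEMS]
    z = (items - items.mean()) / items.std()
    return z.mean(axis=1)

# Usage: df["strength"] = membership_strength(df)
```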
All effects described below have p < .01 (I also indicate when there is a nonsignificant trend with p<.2). On questions with categories I wasn’t that rigorous—if there was a significant effect overall I just eyeballed the differences and reported which categories have the clearest difference (and I skipped some of the background questions which had tons of different categories and are hard to interpret).
Compared to those with a less strong membership in the LW community, those with a strong tie to the community are:
Background:
Gender—no difference
Age—no difference
Relationship Status—no difference
Sexual Orientation—no difference
Relationship Style—less likely to prefer monogamous, more likely to prefer polyamorous or to have no preference
Political Views—less likely to be socialist, more likely to be libertarian (but this is driven by the length of time in the community, which may reflect changing demographics—see my reply to this comment)
Religious Views—more likely to be atheist & not spiritual, especially less likely to be agnostic
Family Religion—no difference
Moral Views—more likely to be consequentialist
IQ—higher
Probabilities:
Many Worlds—higher
Aliens in the universe—lower (edited: I had mistakenly reversed the two aliens questions)
Aliens in our galaxy—trend towards lower (p=.04)
Supernatural—lower
God—lower
Religion—trend towards lower (p=.11, and this is statistically significant with a different analysis)
Cryonics—lower
Anti-Agathics—trend towards higher (p=.13) (this was the one question with a significant non-monotonic relationship: those with a moderately strong tie to the community had the highest probability estimate, while those with weak or strong ties had lower estimates)
Simulation—trend towards higher (p=.20)
Global Warming—higher
No Catastrophe—lower (i.e., think it is less likely that we will make it to 2100 without a catastrophe, i.e. think the chances of xrisk are higher)
Other Questions:
Singularity—sooner (this is statistically significant after truncating the outliers), and more likely to give an estimate rather than leave it blank
Type of XRisk—more likely to think that Unfriendly AI is the most likely XRisk
Cryonics Status—More likely to be signed up or to be considering it, less likely to be not planning to or to not have thought about it
Political Views—less likely to be socialist, more likely to be libertarian
I looked at this one a little more closely, and this difference in political views is driven almost entirely by the “time in community” measure of strength of membership in the LW community; it’s not even statistically significant with the other two. I’d guess that is because LW started out on Overcoming Bias, which is a relatively libertarian blog, so the old timers tend to share those views. We’ve also probably added more non-Americans over time, who are more likely to be socialist.
All of the other relationships in the above post hold up when we replace the original measure of membership strength with one that is only based on the two variables of karma & sequence reading, but this one does not.
Cryonics Status—More likely to be signed up or to be considering it, less likely to be not planning to or to not have thought about it
So long-time participants were less likely to believe that cryonics would work for them but more likely to sign up for it? Interesting. This could be driven by any of: fluke, greater rationality, greater age & income, less akrasia, more willingness to take long-shot bets based on shutting up and multiplying.
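For anyone unfamiliar with the "shut up and multiply" shorthand: the idea is a simple expected-value comparison, where even a low credence can justify signing up if the payoff is large relative to the cost. The numbers below are entirely made up for illustration and are not from the survey.

```latex
\mathrm{EV} = p(\text{revival}) \cdot V(\text{revival}) - C(\text{signup})
            \approx 0.05 \times \$10{,}000{,}000 - \$100{,}000 = \$400{,}000 > 0
```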
I looked into this a little more, and it looks like those who are strongly tied to the LW community are less likely to give high answers to p(cryonics) (p>50%), but not any more or less likely to give low answers (p<10%). That reduction in high answers could be a sign of greater rationality—less affect heuristic driven irrational exuberance about the prospects for cryonics—or just more knowledge about the topic. But I’m surprised that there’s no change in the frequency of low answers.
There is a similar pattern in the relationship between cryonics status and p(cryonics). Those who are signed up for cryonics don’t give a higher p(cryonics) on average than those who are not signed up, but they are less likely to give a probability under 10%. The group with the highest average p(cryonics) is those who aren’t signed up but are considering it, and that’s the group that’s most likely to give a probability over 50%.
Here are the results for p(cryonics) broken down by cryonics status, showing what percent of each group gave p(cryonics)<.1, what percent gave p(cryonics)>.5, and what the average p(cryonics) is for each group. (I’m expressing p(cryonics) here as probabilities from 0-1 because I think it’s easier to follow that way, since I’m giving the percent of people in each group.)
Never thought about it / don’t understand (n=26): 58% give p<.1, 8% give p>.5, mean p=.17
No, and not planning to (n=289): 60% give p<.1, 6% give p>.5, mean p=.14
No, but considering it (n=444): 38% give p<.1, 18% give p>.5, mean p=.27
Yes—signed up or just finishing up paperwork (n=36): 39% give p<.1, 8% give p>.5, mean p=.21
Overall: 47% give p<.1, 13% give p>.5, mean p=.22
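A breakdown like this is straightforward to reproduce from the raw responses; here is a rough sketch assuming a pandas DataFrame with hypothetical columns cryonics_status (the status question) and p_cryonics (the probability answer rescaled to 0-1).

```python
import pandas as pd

def cryonics_breakdown(df: pd.DataFrame) -> pd.DataFrame:
    """For each cryonics-status group, report group size, the share of
    respondents giving p(cryonics) < .1, the share giving p(cryonics) > .5,
    and the mean p(cryonics)."""
    return df.groupby("cryonics_status")["p_cryonics"].agg(
        n="count",
        share_low=lambda p: (p < 0.1).mean(),
        share_high=lambda p: (p > 0.5).mean(),
        mean_p="mean",
    )
```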
The existential risk questions are a confounding factor here—if you think p(cryonics works) 80%, but p(xrisk ends civilization) 50%, that pulls down your p(successful revival) considerably.
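To make the arithmetic behind that point explicit (my own decomposition, using the two figures above): if the survey question is read as asking about unconditional success, then

```latex
p(\text{revival}) = p(\text{revival} \mid \text{no catastrophe}) \cdot p(\text{no catastrophe}) = 0.8 \times 0.5 = 0.4
```

so a high conditional confidence in cryonics can coexist with a fairly low reported p(cryonics).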
I wondered about that, but p(cryonics) and p(xrisk) are actually uncorrelated, and the pattern of results for p(cryonics) remains the same when controlling statistically for p(xrisk).
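One common way to "control statistically" for a covariate is to include it as a regressor; a rough sketch along those lines is below, with placeholder file and column names (the exact method used here isn't specified, so treat this as an assumption).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder filename and column names; the actual survey export may differ.
df = pd.read_csv("lw_survey_responses.csv")

# If the strength coefficient stays negative and significant with p_xrisk
# in the model, the strength/p(cryonics) relationship isn't explained by
# differing catastrophe expectations.
model = smf.ols("p_cryonics ~ strength + p_xrisk", data=df).fit()
print(model.summary())
```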
I think the main reason for this is that these people have simply spent more time thinking about cryonics compared to other people. By spending time on this forum they have had a good chance of running into a discussion which has inspired them to read about it and sign up. Or perhaps people who are interested in cryonics are also interested in other topics LW has to offer, and hence stick around. In either case, it follows that they are probably also more knowledgeable about cryonics and hence understand what cryotechnology can realistically offer currently or in the near future.
In addition, these long-time members might be more ethically open to things such as cryonics.
I think the main reason for this is that these people have simply spent more time thinking about cryonics compared to other people.
I don’t think this is obvious at all. If you had asked me in advance which of the following 4 possible sign-pairs would hold with increasing time spent thinking about cryonics:
less credence, less sign-ups
less credence, more sign-ups
more credence, more sign-ups
more credence, less sign-ups
I would have said ‘obviously #3, since everyone starts from “that won’t ever work” and moves up from there, and then one is that much more likely to sign up’
The actual outcome, #2, would be the one I would expect least. (Hence, I am strongly suspicious of anyone claiming to expect or predict it as suffering from hindsight bias.)
It is noted above that those with strong community attachment think that there is more risk of catastrophe. If human civilization collapses or is destroyed, then cryonics patients and facilities will also be destroyed.
I would expect the result to be a more accurate estimate of the probability of success, combined with more sign-ups. #2 is an example of this if, in fact, the more accurate assessment is lower than the assessment of someone with a different level of information.
I don’t think it’s true that everyone starts from “that won’t ever work”—we know some people think it might work, and we may be inclined by wishful thinking or susceptibility to hype to inflate our likelihood above the conclusion we’d reach if we invested the time to consider the issue in more depth. It’s also worth noting that we’re not comparing the general public to those who’ve seriously considered signing up, but the lesswrong population, who are probably a lot more exposed to the idea of cryonics.
I’d agree that it’s not what I would have predicted in advance (having no stronger expectation that the assigned likelihood would go up rather than down with more research), but it would be predictable for someone proceeding from the premise that the lesswrong community overestimates the likelihood of cryonics success compared to those who have done more research.
Yeah, I think you have a point. However, maybe the following explanation would be better:
Currently, cryonics isn’t likely to work. People who sign up for cryonics do research on the subject before or after signing up, and hence become aware that cryonics isn’t likely to work.