I was surprised you’d give the human race so low a chance as a coin-flip.
I wouldn’t even give “wiped out badly enough to set us back half a century” that high a chance. The whatever-it-is would have to hit at least 3 continents with devastating force to cause such a severe civilizational reversal. (Example: global warming won’t do it. People on the rich continents will just up and relocate.)
I think the reason Roko puts it so low—and at the probability one gives when one doesn’t know either way—is that there are so many current existential risks (nuclear warfare, pandemics, asteroids, supervolcanoes), and multiple further existential risks could become a concern between now and when cryo-resurrection is feasible (grey-goo-level nanotech is a plausible prerequisite for resurrection, and an equally plausible candidate for destroying humanity, to say nothing of AI risks).
I’ve actually had people email me and tell me off for putting our chances of survival too high. They cite AGI as contributing most of the risk.
But we can at least defer to the judgment of experts. Those who completed the survey circulated at the Global Catastrophic Risks conference at Oxford gave humanity a mean 79% probability of surviving this century. The most pessimistic estimate I know is that of Sir Martin Rees, who believes we have a fifty-fifty chance of making it through, though I was told that in private conversation Rees gives a much lower figure.
These seem to have been mostly experts on a single risk, not experts on the future in general. There is no field of study called “the future” that you can be an expert in. I’d consider people’s opinion on the future more authoritative the more they 1) were smart, 2) had a track record of dealing rationally with “big questions” (anyone with an anti-silly bias is out), 3) had a lot of knowledge about different proposed technologies, 4) had a background in a lot of different scientific fields as well as in philosophy, 5) had clearly grasped what seem to me to be the important points that people have been making about certain unmentionables.
I’m not one of the people who emailed Roko, but yes, I seriously think 50% is an overestimate. Note that this probability is not only for humans not going extinct, but also for civilization not collapsing. One of the reasons we might survive beyond 2100 is if certain advanced technologies turn out to be much harder than we thought, or even impossible—and in that case cryonicists are probably screwed too.
Yes—it would have been more useful to categorise the risks, get an estimate from experts in each risk of our probability of surviving it, and multiply those survival probabilities together.
But we can’t talk about this at the moment.
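The categorise-and-multiply procedure suggested above can be sketched in a few lines. Everything here—the risk list and the numbers—is hypothetical and purely illustrative, not anyone’s actual estimate:

```python
# Sketch of the "categorise, estimate, multiply" procedure.
# Risk categories echo the thread; the probabilities are made up
# for illustration and are NOT estimates from any survey or expert.
survival_probs = {
    "nuclear warfare": 0.95,    # P(civilization survives this risk this century)
    "pandemics": 0.90,
    "grey-goo nanotech": 0.90,
    "unfriendly AGI": 0.80,
}

# Multiplying per-risk survival probabilities assumes the risks are
# independent; correlated risks (e.g. a war that also triggers nanotech
# misuse) would make the true combined figure differ.
p_survive = 1.0
for p in survival_probs.values():
    p_survive *= p

print(f"combined P(survival) ~ {p_survive:.2f}")
```

With these made-up numbers the combined survival probability comes out around 0.62—notably, well below the most pessimistic individual entry, which is the point of aggregating per-risk estimates rather than eyeballing a single figure. The independence assumption is the weakest link in this method.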