I was surprised you’d give the human race so low a chance as a coin-flip.
I think the reason Roko puts it so low, at the probability one assigns when one doesn't know either way, is that there are so many current existential risks (nuclear warfare, pandemics, asteroids, supervolcanoes), and several more that could become a concern between now and when cryo-resurrection is feasible: grey goo-level nanotech is a plausible prerequisite for resurrection and an equally plausible candidate for destroying humanity, to say nothing of AI risks.