Assuming friendly AI is possible for a civilization like ours to develop, then during every hour such a civilization exists there is some epsilon chance that it will be developed. Compound 3^^^3 or so of those epsilons and the chance that it never happens shrinks toward zero, so eventually you get a pretty good chance.
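To make the compounding step precise, here is a minimal sketch that treats each hour as an independent Bernoulli trial with a constant success probability ε (an assumption for illustration; the real per-hour chance surely varies over time):

$$P(\text{developed within } n \text{ hours}) \;=\; 1 - (1-\varepsilon)^n \;\xrightarrow{\,n \to \infty\,}\; 1 \quad \text{for any fixed } \varepsilon > 0.$$

The probabilities don't literally add; rather, the survival probability $(1-\varepsilon)^n$ decays exponentially, which is why even a tiny epsilon suffices given enough hours.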
BTW, the original post is why Michael Vassar called quantum immortality the most horrifying idea he’s ever had to take seriously. I’m hoping for something like Hanson’s Mangled Worlds.