QI predicts not different variants of the world, but different variants of my future experiences. It says that I will never experience “non-existence”, but will instead experience my most probable path to survival. If I have a 1-in-1000 chance of surviving some situation, QI shifts the probability that I will experience survival up to 1.
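A minimal sketch of this probability shift, framed as conditioning on the existence of a next experience (my formalization, not from the original comment; the events S and E and the assumption that E occurs exactly when S does are mine):

```latex
% Hypothetical formalization of the QI probability shift.
% S = "I survive the situation", E = "I have a next experience".
% QI's assumption: non-existence is never experienced, so E occurs iff S occurs.
\begin{align*}
P(S) &= \tfrac{1}{1000} \quad \text{(objective chance of survival)} \\
P(S \mid E) &= \frac{P(S \cap E)}{P(E)} = \frac{P(S)}{P(S)} = 1
\end{align*}
```

On this reading, QI does not change the objective chance of survival; it only says that the experiences available to be had are all conditioned on E.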
But it could fail in unpredictable ways: if we are in a simulation and my plane crashes, my next experience will probably be a screen with the title “game over”, not the experience of finding myself alive on the ground.
I agree with what you said in brackets about cryonics. I also think that investing in cryonics will help to promote it and all the other good things, so it doesn’t conflict with my regrettable costs. I think one rational course of action is to make a will that leaves all one’s money to a cryocompany. (It also depends on the existence and well-being of one’s children, and on other useful charities that could prevent x-risks, so it may need more complex consideration.)