QI makes cryonics more sensible, because it raises the share of worlds where I am 120 and not terminally ill compared to the share of worlds where I am 120 and terminally ill.
QI doesn’t matter much in “altruistic” decision-making systems, where I care about the measure of other people’s wellbeing (though note that if I inform them about QI, they may become less worried about death, so even here it could play a role).
QI is more important if I value my own future pleasures and sufferings above everything else.
In any case, QI is about facts concerning future observations, not about decision theories. QI does not tell us which decision theory is better. But it could put pressure on the agent to choose a more egoistic decision theory, since that choice will be rewarded more.
I’m not altogether certain that it will make them less worried.