I had to read Yvain and then piece together a bunch of missing parts to figure out exactly what you meant. You’re assuming not just many worlds, but an infinite number of worlds, or at least a large enough number of them that every possible variation allowed by our laws of physics would be included. This, for starters, is a huge assumption.
If I don’t get cryonics in this universe, another me in another universe will. By reducing the number of possible universes I might be in, I increase the probability that I am living in as close to the optimal universe as possible, as long as I don’t eliminate the best possible optimum. If I’m on the show Deal or No Deal, as long as I don’t get rid of the million-dollar case, any case I get rid of improves my odds of getting the million-dollar case.
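To make the Deal or No Deal point concrete, here is a rough Monte Carlo sketch (my own toy illustration, not anything from Yvain; the 26-case setup is the show’s, everything else is arbitrary). Conditional on the million-dollar case surviving a batch of random eliminations, the chance that the case you’re holding is the winner rises from 1/26 to one over the number of cases left:

```python
import random

def deal_or_no_deal_trial(n_cases=26, n_eliminated=10):
    """One trial: hold one case, eliminate others uniformly at random.
    Returns (the $1M case survived the eliminations, we are holding the $1M case)."""
    million = random.randrange(n_cases)   # which case holds the $1M
    held = random.randrange(n_cases)      # the case we keep
    others = [c for c in range(n_cases) if c != held]
    eliminated = random.sample(others, n_eliminated)
    return million not in eliminated, held == million

def estimate(n_trials=200_000):
    survived = held_it = 0
    for _ in range(n_trials):
        s, h = deal_or_no_deal_trial()
        if s:
            survived += 1
            held_it += h
    # Unconditionally, P(holding the $1M) = 1/26 ≈ 0.038.
    # Conditional on the $1M surviving 10 random eliminations, it rises to 1/16 = 0.0625.
    print(f"P(holding $1M | it wasn't eliminated) ≈ {held_it / survived:.4f}")

estimate()
```

Unlike Monty Hall, the eliminations here are random rather than informed, so this is plain conditioning and nothing more.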
However, it’s not clear that helping a bunch of random strangers in my universe is any better than helping a bunch of random strangers in another universe. You’re presuming a set of beliefs in which I am an EA within my universe, but an egoist towards other universes. If I don’t distinguish between a man living in China and a man living in China on Earth 2, I should get cryonics.
You’re assuming not just many worlds, but an infinite number of worlds, or at least a large enough number of them that every possible variation allowed by our laws of physics would be included. This, for starters, is a huge assumption.
If our universe is spatially infinite, that should be enough.
One of my reasons for believing in multiverses is the anthropic argument from fine-tuning, which would seem to offer a large enough range for this to be relevant.
If I don’t get cryonics in this universe, another me in another universe will. By reducing the number of possible universes I might be in, I increase the probability that I am living in as close to the optimal universe as possible, as long as I don’t eliminate the best possible optimum. If I’m on the show Deal or No Deal, as long as I don’t get rid of the million-dollar case, any case I get rid of improves my odds of getting the million-dollar case.
This seems correct.
However, it’s not clear that helping a bunch of random strangers in my universe is any better than helping a bunch of random strangers in another universe. You’re presuming a set of beliefs in which I am an EA within my universe, but an egoist towards other universes. If I don’t distinguish between a man living in China and a man living in China on Earth 2, I should get cryonics.
I did note that it depends on utility functions, so my “presumptions” were explicit. Even my limited argument still implies that selfish people shouldn’t sign up, even though I’m sure some people who have signed up identify as selfish. I also think that altruism towards only your own world is a position many people would agree with. At the least, it’s not obviously wrong.
I’m thinking now that it may not matter after all. My argument is that for any person Y, U(Y | Y gets cryonics) is lower than U(Y | Y doesn’t get cryonics). Their personal utility is higher without cryonics regardless of their utility function, so only outside considerations matter here. But outside considerations may matter only to those who expect to have a high impact on the rest of the world, and even then, it’s hard to see how you could provide enough value in those worlds to make sacrificing your own utility worth it.
As I said before:
Then, it would depend on how much you expect yourself to be worth to others in the worlds where you survive only if you take cryonics, and how exactly you weigh worlds where you don’t exist.
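For concreteness, here’s a toy version of that tradeoff. Everything in it is a placeholder I’m making up (the parameter names, the linear weighting, the single altruism_weight knob); it’s only meant to show where “how much you’re worth to others” and “how you weigh worlds where you don’t exist” would enter, assuming the personal-utility claim above.

```python
def should_sign_up(
    u_self_no_cryo: float,          # personal utility if you don't sign up
    u_self_cryo: float,             # personal utility if you do (assumed lower, per the argument above)
    value_to_others: float,         # value you'd provide to the rest of your world, per world where you exist
    measure_survive_only_via_cryo: float,  # measure of worlds where you exist only because of cryonics
    altruism_weight: float = 1.0,   # 0 = purely selfish, 1 = full weight on outside considerations
) -> bool:
    """Toy decision rule: sign up iff the (weighted) extra value you provide in the
    worlds where cryonics is the only reason you exist outweighs the assumed
    personal-utility cost of signing up."""
    personal_cost = u_self_no_cryo - u_self_cryo             # > 0 by the assumption above
    outside_gain = measure_survive_only_via_cryo * value_to_others
    return altruism_weight * outside_gain > personal_cost

# Example: a modest personal cost, but you only matter to others in 5% of worlds.
print(should_sign_up(u_self_no_cryo=10.0, u_self_cryo=9.0,
                     value_to_others=4.0,
                     measure_survive_only_via_cryo=0.05))    # False: outside gain 0.2 < cost 1.0
```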
This really needs a full theory of anthropics (and metaethics), which this margin is too narrow to contain. Just wanted to get the idea out there.