I think you’re hinting at something like the expanding moral circle, according to which there’s no reason I should care more about people in my universe than about people in other universes. I think that makes sense when asking whether I should care. But the analogy with “caring about people in a third-world country on the other side of the world” breaks down when we consider our means of influencing these other universes. Influencing the Solomonoff prior seems like a very indirect way to alter another universe, about which I have very little information. That’s different from buying malaria nets.
So even if you’re altruistic, I doubt that “other universes” would be high on your priority list.
The best argument I can find for wanting to influence the prior is that it might be a way to influence the simulation of your own universe, à la gradient hacking.
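For concreteness, the definition I have in mind when talking about influencing the prior (standard background, nothing specific to this exchange): the Solomonoff prior weights a string $x$ by

$$M(x) \;=\; \sum_{p \,:\, U(p)\ \text{starts with}\ x} 2^{-|p|},$$

where $U$ is a universal (monotone) Turing machine and the sum ranges over programs $p$ whose output begins with $x$. “Influencing the prior” then amounts to influencing what the short programs that dominate this sum output, e.g. by being part of what one of those programs simulates, which is part of why the channel seems so indirect to me.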
I personally see no fundamental difference between direct and indirect ways of influence, except insofar as they relate to stuff like expected value.
I agree that, given the expected amount of influence, other universes are not high on my priority list, but they are still on my priority list. I expect the same for consequentialists in other universes. I also expect consequentialist beings that control most of their universe to get around to most of the things on their priority list, and hence I expect them to influence the Solomonoff prior.
Okay, it’s probably subtler than that.