Eliezer> Is the value of my existence steadily shrinking as the universe expands and it requires more information to locate me in space?
Yes, but the value of everyone else’s existence is shrinking by the same factor, so it doesn’t disturb the preference ordering among possible courses of action, as far as I can see.
Eliezer> If I make a large uniquely structured arrow pointing at myself from orbit so that a very simple Turing machine can scan the universe and locate me, does the value of my existence go up?
This is a more serious problem for my proposal, but the conspicuous arrow also increases the value of everyone near you by almost the same factor, so again perhaps it doesn’t make as much difference as you expect.
Eliezer> I am skeptical that this solution makes moral sense, however convenient it might be as a patch to this particular problem.
I’m also skeptical, but I’d say it’s more than just a patch to this particular problem. Treating everyone as equals no matter what their measures are, besides leading to counterintuitive results in this “Pascal’s Mugging” thought experiment, is not even mathematically sound, since the sum of the small probabilities multiplied by the vast utilities does not converge to any finite value, no matter which course of action you choose.
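To make the divergence concrete (a rough sketch using a Solomonoff-style prior; the particular formalization is mine and only illustrative): a hypothesis describable in n bits gets prior probability on the order of 2^{-n}, but the number of people an n-bit hypothesis can put at stake is not bounded by any computable function of n (the mugger’s 3^^^^3 comes from a very short description). So for any course of action the expected-utility sum looks like

$$\sum_{n} 2^{-n} \cdot U_{\max}(n), \qquad \text{where } U_{\max}(n) > 2^{n} \text{ for all sufficiently large } n,$$

and the terms don’t even go to zero, let alone sum to a finite value.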
The mathematics says that you have to discount each person’s value by some function, otherwise your expected utilities won’t converge. The only question is which function. Using the inverse of a person’s algorithmic complexity seems to lead to intuitive results in many situations, but not all.
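Here is a minimal sketch of why the complexity discount restores convergence (again my own formalization, so the details are only suggestive): weight each person p by roughly 2^{-K(p)}, where K(p) is the length of the shortest program that locates p. Since shortest programs form a prefix-free set, the Kraft inequality gives

$$\sum_{p} 2^{-K(p)} \le 1, \qquad\text{hence}\qquad \sum_{p} 2^{-K(p)}\, v(p) \;\le\; \sup_{p} v(p) \;<\; \infty$$

whenever the undiscounted per-person value v(p) is bounded. The mugger’s 3^^^^3 victims each require an enormously long program to single out, so their total discounted weight stays small no matter how many of them there are.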
But I’m also open to the possibility that this entire approach is wrong… Are there other proposed solutions that make more sense to you at the moment?