If you believe in the many worlds interpretation of quantum mechanics, you have to discount the utility of each of your future selves by his measure, instead of treating them all equally. The obvious generalization of this idea is for the altruist to discount the utility he assigns to other people by their measures, instead of treating them all equally.
But instead of using the QM measure (which doesn’t make sense “outside the Matrix”), let the measure of each person be inversely related to his algorithmic complexity (his personal algorithmic complexity, which is equal to the algorithmic complexity of his universe plus the amount of information needed to locate him within that universe), and the problem is solved. The utility of a Turing machine can no longer grow much faster than its prior probability shrinks, since the sum of measures of people computed by a Turing machine can’t be larger than its prior probability.
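The bound in that last sentence can be made concrete with Kraft's inequality: if each person in a universe is located by a prefix-free bit string, the locator lengths satisfy sum 2^-len <= 1, so the summed measures can't exceed the universe program's prior. A toy numeric sketch, with all specific numbers and locator strings hypothetical:

```python
# Toy model: a person's measure is 2^-(complexity), where complexity =
# algorithmic complexity of the universe plus the bits needed to locate
# the person within it. All concrete values below are made up.

universe_complexity = 10            # hypothetical K(T) in bits
prior = 2 ** -universe_complexity   # prior probability of the Turing machine T

# Prefix-free locating strings for three people computed by T.
locators = ["0", "10", "110"]

def measure(locator: str) -> float:
    """Personal measure: the universe's prior times 2^-(locator length)."""
    return prior * 2 ** -len(locator)

total = sum(measure(s) for s in locators)

# Kraft's inequality for prefix-free codes gives sum(2^-len) <= 1, so the
# total measure of people computed by T can't exceed T's prior probability.
assert total <= prior
print(total, prior)  # 0.0008544921875 0.0009765625
```

Since measures shrink geometrically with locator length, even a universe computing infinitely many people has total measure bounded by its prior, which is what blocks the utility from outgrowing the prior.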
But there is another puzzle/paradox with Solomonoff induction that I don’t know how to solve. I’ve written about it at http://groups.google.com/group/everything-list/browse_frm/thread/c7442c13ff1396ec/. Eliezer, do you think it would be suitable for a blog post here?