The idea is that they have the same utility function, but the utility function takes values over anthropic states (values of “I”).
U(I am X and X chooses sim) = 1
U(I am Xi and Xi chooses sim) = 0.2 etc.
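To make the structure concrete, here's a minimal sketch (my own illustration, not anything from the original setup): the utility function is keyed on anthropic states, i.e. pairs of (who "I" am, what that agent chooses), and the expected utility of an action averages over a credence about who "I" am. The 1 and 0.2 are the values above; the credences and the `expected_utility` helper are made up for illustration.

```python
# Utility over (identity, action) pairs -- "anthropic states".
utility = {
    ("X",  "sim"): 1.0,   # U(I am X and X chooses sim) = 1
    ("Xi", "sim"): 0.2,   # U(I am Xi and Xi chooses sim) = 0.2
}

# Credence over who "I" am (hypothetical numbers, just for illustration).
credence = {"X": 0.5, "Xi": 0.5}

def expected_utility(action: str) -> float:
    """Expected utility of an action, averaged over anthropic uncertainty."""
    return sum(p * utility.get((identity, action), 0.0)
               for identity, p in credence.items())

print(expected_utility("sim"))  # 0.5 * 1.0 + 0.5 * 0.2 = 0.6
```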
I don’t like it, but I also don’t see an obvious way to reject the idea.