I don’t know, do you like chocolate? If yes, does that fact pay rent? Our preferences about happiness of observers vs. number of observers are part of what needs to be encoded into FAI’s utility function. So we need to figure them out, with thought experiments if we have to.
As to objective vs personal truth, I think anthropic probabilities aren’t much different from regular probabilities in that sense. Seeing a quantum coin come up heads half the time is the same kind of “personal truth” as getting anthropic evidence in the game I described. Either way there will be many copies of you seeing different things and you need to figure out the weighting.