What about anthropics? Should we care more about the worlds where we exist?
EDIT: Wait, that’s nonsense.
I’d say “ridiculous”, not “nonsense”. An agent certainly could care about said worlds and not about others. Yvain has even expressed preferences along these lines himself, and gone as far as to bite several related bullets. Yet while such preferences are logically coherent, I would usually think it more likely that someone professing them is confused about what they want.
I would usually think it is more likely that someone professing them is confused about what they want.
Indeed. I was thinking about subjective probabilities, without noticing that what I expect to observe isn’t what I expect to happen when dealing with anthropics.
I was pretty tired …