To be clear, since humans are specified as valuing all agents (including sims of themselves and others), shouldn't it be equivalent to Alice-who-values-copies-of-herself?
Sure, but the result you describe is equivalent to Alice being an average utilitarian with respect to copies of herself. What if Alice is a total utilitarian with respect to copies of herself?
Actually, she should still make the same choice, although she would choose differently in other scenarios.
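A quick sketch of why, assuming the scenario in question holds the number of copies fixed across the options being compared: if each of $n$ copies receives utility $u_i$, then

$$U_{\text{total}} = \sum_{i=1}^{n} u_i = n \cdot \bar{u} = n \cdot U_{\text{avg}},$$

so total utility is a positive multiple of average utility and the two criteria rank the options identically. They come apart only when the options differ in $n$, e.g. a choice about whether to create additional copies, which is presumably the sort of "other scenario" where she would choose differently.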