The usage in Stuart’s posts on here just meant a certain way of calculating expected utilities: selfish agents used only their own future utility when calculating expected utility, while unselfish agents mixed in other people’s utilities. To make this a bit more robust to redefinitions of what’s in your utility function, we could say that a purely selfish agent’s expected utility doesn’t change if actions stay the same but other people’s utilities change.
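A minimal sketch of that test, with made-up payoff numbers and helper names (nothing here comes from Stuart’s posts): only the unselfish agent’s expected utility moves when other people’s payoffs change while the agent’s own payoffs and actions stay fixed.

```python
# Sketch only: an outcome is a dict mapping each person to the utility they
# get, and an "action" is a list of (outcome, probability) pairs.

def expected_utility(action, utility_of_outcome):
    """Standard expectation: sum of p(outcome) * U(outcome)."""
    return sum(p * utility_of_outcome(outcome) for outcome, p in action)

def selfish_utility(outcome):
    return outcome["me"]                                 # only my own payoff counts

def unselfish_utility(outcome, weight=0.5):
    return outcome["me"] + weight * outcome["other"]     # mix in the other person's payoff

# The same action described twice: my payoffs are identical in both versions,
# only the other person's payoffs differ.
action_v1 = [({"me": 10, "other": 0}, 0.5), ({"me": 4, "other": 0}, 0.5)]
action_v2 = [({"me": 10, "other": 9}, 0.5), ({"me": 4, "other": 9}, 0.5)]

# Purely selfish: expected utility is unchanged when only others' utilities change.
assert expected_utility(action_v1, selfish_utility) == expected_utility(action_v2, selfish_utility)
# Unselfish: the same change does move the expected utility.
assert expected_utility(action_v1, unselfish_utility) != expected_utility(action_v2, unselfish_utility)
```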
But this is all basically within option (2).
No one can mix another person’s actual utility function into their own. You can mix in your estimate of it, or your estimate of what you think it should be. But the actual utility function of another person is in that other person, not in you.
Good point, if not totally right.
In general, you can have anything in your utility function you please. I could care about the number of ducks in the pond near where I grew up, even though I can’t see it. And when I say I care about the number of ducks in the pond, I don’t just mean my perception of it: I don’t want to maximize how many ducks I think are in the pond, or I would just drug myself. However, you’re right that when calculating an “expected utility,” that is, your best guess at the time, you don’t usually have perfect information about other people’s utility functions, just as I wouldn’t have perfect information about the number of ducks in the pond, and so you would have to use an estimate (a toy version of this is sketched below).
The reason it worked without this distinction in Stuart’s articles on the Sleeping Beauty problem was that the “other people” were actually copies of Sleeping Beauty, so you knew that their utility functions were the same.
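A toy version of the duck-pond distinction above (my own construction; the actions, numbers, and function names are all hypothetical): an agent whose utility is over the actual pond prefers the action that changes the real duck count, and only the rejected “maximize my perception” agent prefers the drug.

```python
# Two candidate actions: feeding the ducks changes the real duck count,
# while taking a drug only changes what I *think* the count is.
ACTIONS = {
    # action: (actual duck count afterwards, believed duck count afterwards)
    "feed the ducks": (12, 12),
    "take the drug": (10, 1_000_000),
}

def utility_over_world(actual, believed):
    return actual        # cares about the real ducks

def utility_over_belief(actual, believed):
    return believed      # the "maximize my perception" version rejected above

for name, score in [("world", utility_over_world), ("belief", utility_over_belief)]:
    best = max(ACTIONS, key=lambda a: score(*ACTIONS[a]))
    print(f"utility over {name}: choose {best!r}")
# utility over world: choose 'feed the ducks'
# utility over belief: choose 'take the drug'
```

With imperfect information the first agent would still have to score actions by an expectation over its beliefs about the pond, which is where the estimate comes in; the point is only that the estimate is an input to the calculation, not the thing being maximized.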
You can mix a pointer to another person’s actual utility function into your own, though. To see that this is different from mixing in your estimate, consider what you would do if you found out your estimate was mistaken.
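One way to read that pointer-versus-estimate distinction is sketched below (the agents, “Bob”, and his payoffs are all hypothetical): both agents start with the same mistaken model of what Bob values, and the difference only shows up once that model is corrected.

```python
def bob_actual_utility(outcome):
    return outcome["apples"]      # what Bob really cares about

def mistaken_model_of_bob(outcome):
    return outcome["oranges"]     # a wrong guess about what Bob cares about

class FrozenEstimateAgent:
    """Mixed an *estimate* of Bob's utility into its own, fixed once and for all."""
    def __init__(self, estimate_of_bob):
        self.term = estimate_of_bob         # baked in; later corrections never touch it

    def utility(self, outcome):
        return outcome["me"] + self.term(outcome)

class PointerAgent:
    """Mixed in a *pointer*: 'whatever Bob's utility function actually is'."""
    def __init__(self, current_model_of_bob):
        self.model_of_bob = current_model_of_bob

    def learn(self, better_model_of_bob):
        # Finding out the estimate was mistaken changes how outcomes get scored.
        self.model_of_bob = better_model_of_bob

    def utility(self, outcome):
        return outcome["me"] + self.model_of_bob(outcome)

outcome = {"me": 1, "apples": 5, "oranges": 0}
frozen, pointer = FrozenEstimateAgent(mistaken_model_of_bob), PointerAgent(mistaken_model_of_bob)
print(frozen.utility(outcome), pointer.utility(outcome))  # 1 1: identical while the model is wrong

pointer.learn(bob_actual_utility)  # "you found out your estimate was mistaken"
print(frozen.utility(outcome), pointer.utility(outcome))  # 1 6: only the pointer tracks Bob
```

On this reading, the frozen agent keeps optimizing its old guess even after learning it was wrong, which is what marks it as having mixed in an estimate rather than a pointer.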