have a perception-determined utility function (like AIXI)
OK, so, I read the reference—but I still don’t know what you mean.
All agents have “perception-determined utility functions”—in the sense that I describe here.
The post you linked is not about whether utility functions are “determined by perceptions”, but rather about whether they are based on extrapolated future selves or on extrapolated future environments.
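For concreteness, here is a minimal sketch of the distinction, in AIXI-style notation (the symbols below are my own illustration, not taken from the linked post). Write each percept as $e_k = (o_k, r_k)$, an observation paired with a reward. A perception-determined utility is then a function of the percept history alone:

$$U_{\text{perc}}(e_{1:m}) = \sum_{k=1}^{m} r_k$$

By contrast, an environment-determined utility would be a function of the (possibly unobserved) environment state, $U_{\text{env}} : \mathcal{S} \to \mathbb{R}$. The linked post’s distinction (extrapolated future selves vs. extrapolated future environments) is a different axis from this one.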
There’s no disagreement here; you’re just confused about semantics.
Er, that is unnecessarily hostile and insulting nonsense. Please think before you post public insults.
I genuinely do not understand how you were insulted by my comment. Could you please explain it to me so I can avoid doing so in the future?
Note that I am not purposely insulting you in this comment.