I think that the more general problem is this: if the absolute value of the utility that you attach to a world-state increases faster than its complexity (given the current situation) decreases, then the very possibility of that world-state existing will cause it to hijack the entirety of your utility function.
Yes, that’s exactly the problem.
Of course, utility functions are not constructed to avoid this problem.
Well, they had better be, or they will fall victim to it.
You have to choose one of the following: (1) Pascal’s Mugging; (2) Scope Insensitivity (bounding utility by improbability); or (3) Wishful Thinking (bounding improbability by utility).
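The hijacking condition described above can be sketched numerically. This is a toy model with made-up growth rates, assuming the probability of a world-state falls off as 2^-k in its description complexity k while the utility attached to it grows as 3^k:

```python
# Toy model of the "hijacking" condition: probability of a world-state
# with description complexity k falls off as 2**-k, but the utility
# attached to it grows as 3**k. The expected-utility contribution
# (p * u) then grows without bound, so the most improbable states
# dominate the calculation.

def contribution(k):
    p = 2.0 ** -k   # probability shrinks exponentially in complexity
    u = 3.0 ** k    # utility grows faster than probability shrinks
    return p * u    # = 1.5**k, unbounded as k grows

contribs = [contribution(k) for k in range(10)]
# each term is 1.5x the previous one: the tail dominates the sum
assert all(b > a for a, b in zip(contribs, contribs[1:]))
```

Bounding utility so that it grows no faster than 2^k (option 2 in the trilemma), or refusing to assign probabilities as low as 2^-k to high-utility states (option 3), are the two ways to keep this sum from diverging.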
We often call such things a ‘problem’, yet by definition it is exactly how things should be. If your utility function genuinely represents your preferences (including preferences with respect to risk), then rejoice in the opportunity to devote all your resources to the possibility in question! If it doesn’t, then the only ‘problem’ is that your ‘utility function’, well, isn’t your actual utility function. It’s the same problem you get when you think you like carrots when you really like peaches.
Voluntary dedication is not ‘hijacking’.
(Response primarily directed at the quoted text, and only a response to the parent inasmuch as it follows the problem frame.)
Agreed.
Our heuristics hijack our volition?