My question wasn't meant literally; it was instrumental, meant to highlight the possibility that attempts to maximize utility not only fail but actively minimize it, i.e. increase the amount of suffering. Instrumentally, isn't it better to believe that winning is impossible than that it's likely, if the actual probability is very low?
To decide to lose intentionally, I need to know how much it costs to try to win, what the odds of success are, and what the difference in utility is if I win.
I feel like people weigh those factors unconsciously and automatically (using bounded resources and rarely with perfect knowledge or accuracy).
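To make that weighing explicit, here is a minimal sketch of the comparison I have in mind. All the names and numbers (p_win, cost, u_win, u_lose) are hypothetical, chosen just to illustrate the criterion, not anything established in the discussion:

```python
# Toy expected-utility comparison: try to win iff the expected utility
# of trying exceeds the utility of conceding outright.
# All parameter names and values below are illustrative assumptions.

def should_try(p_win: float, cost: float, u_win: float, u_lose: float) -> bool:
    """Return True iff trying beats conceding in expectation.

    Trying costs `cost` regardless of outcome; conceding yields
    u_lose at no cost.
    """
    eu_try = p_win * u_win + (1 - p_win) * u_lose - cost
    eu_concede = u_lose
    return eu_try > eu_concede

# At very low odds, even a large prize may not justify the cost of trying:
print(should_try(p_win=0.001, cost=5.0, u_win=100.0, u_lose=0.0))  # False
```

The point of the sketch is just that the answer flips on the interplay of all three quantities: with a 0.1% chance of winning, a cost of 5 already outweighs an expected gain of 0.1, so conceding is the better bet in expectation.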