It turns out we can’t. Imagine that, without changing anything else, the utility of X7 is suddenly set to ten trillion rather than 7. The OP of X7 is still 3 bits: it’s still the best option, still with probability 1⁄8.
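A minimal sketch of the point, assuming OP is measured as in the discussion above, i.e. the log (base 2) of one over the fraction of outcomes at least as good as the one achieved (the function name and the example utilities are my own illustration):

```python
import math

def optimization_power(outcome_utility, all_utilities):
    """OP in bits: log2 of (total outcomes / outcomes at least as good)."""
    at_least_as_good = sum(1 for u in all_utilities if u >= outcome_utility)
    return math.log2(len(all_utilities) / at_least_as_good)

# Eight equally likely options; X7 is the best, so 1 of 8 outcomes is >= it.
utilities = [0, 1, 2, 3, 4, 5, 6, 7]
print(optimization_power(7, utilities))  # 3.0

# Rescaling X7's utility to ten trillion leaves the ranking, hence OP, unchanged.
rescaled = [0, 1, 2, 3, 4, 5, 6, 10_000_000_000_000]
print(optimization_power(10_000_000_000_000, rescaled))  # still 3.0
```

Only the ordering of outcomes enters the calculation, which is exactly why cranking up the winner’s utility changes nothing.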
I think a problem with this line of attack is that you are mixing preferences and utilities. You could imagine two types of optimization power, a preference-centric one and a utility-centric one, both of which can be useful depending on what you’re talking about. You can map preferences to utilities and utilities to preferences, but one may be more natural than the other for your purposes.