E(f) needn’t have anything to do with W. But it has to do with the f-values of all the different versions of you that there might be in the future.
I think this is part of where things are going wrong. f-values aren’t things that each future version of me has. In particular, they are not the value a particular future me places on a given outcome, nor the preferences of that future me. f-values are simply mathematical constructs built to formalize the choices that current me happens to make over gambles.
Despite its name, expected utility maximization is not actually averaging over the preferences of future versions of me; it’s just averaging a more-or-less arbitrary function that may or may not have anything to do with those preferences.
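To make that concrete, here is the standard von Neumann–Morgenstern picture, sketched in my own notation (the quoted passage doesn’t spell this out): if my present choices over gambles satisfy the usual axioms, then there exists a function f on outcomes such that, for any two gambles L = (p_1, x_1; …; p_n, x_n) and M = (q_1, y_1; …; q_m, y_m),

$$L \succsim M \quad\Longleftrightarrow\quad \sum_i p_i\, f(x_i) \;\ge\; \sum_j q_j\, f(y_j),$$

and f is unique only up to a positive affine rescaling $f \mapsto a f + b$ with $a > 0$. On this construction, the f-values are nothing more than a bookkeeping device for my present choices over gambles; nothing in the theorem requires them to track what any future version of me will value.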