A fixed-goal AGI is bad… which is indeed correct, but… irrelevant? By its own metric, it still has the best EV.
Nobody knows how to formulate it like that! EV maximization is so entrenched as obviously the thing to do that the "obviously, it's just EV maximization for something else" response is instinctual, but that response doesn't seem to hold up.

And if maximization is always cursed (goals are always proxy goals, even as they become increasingly accurate, particularly around the actual environment), then it's not maximization that decision theory should be concerned with.
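A toy sketch of that "maximization of proxy goals is cursed" point (my own illustration, with made-up numbers, not anything from the original argument): give each action a true value and a noisy proxy of it, then pick the action that maximizes EV under the proxy. The selection itself favors actions whose proxy error happens to be large and positive, so the chosen action's true value systematically falls short of its proxy estimate.

```python
import random

random.seed(0)

# Hypothetical setup: each "action" has a true value the agent cares about,
# but the agent only sees a noisy proxy of it.
actions = range(1000)
true_value = {a: random.gauss(0, 1) for a in actions}
proxy_value = {a: true_value[a] + random.gauss(0, 1) for a in actions}

# EV maximization against the proxy.
best_by_proxy = max(actions, key=proxy_value.get)

print(f"proxy EV of chosen action: {proxy_value[best_by_proxy]:.2f}")
print(f"true value of chosen action: {true_value[best_by_proxy]:.2f}")
# The gap between these two numbers is the curse: the harder you optimize
# the proxy over a larger action space, the more the selected action's
# proxy score overstates what you actually get.
```

Making the proxy more accurate shrinks the gap but doesn't remove it; the argmax step keeps preferentially selecting whatever error remains.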