I think that Robin’s point solves this problem, but it doesn’t solve the more general problem of an AGI’s reaction to low-probability, high-utility possibilities and the attendant problems of non-convergence.
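(To spell out the non-convergence worry with a stock illustration of my own, not anything from the original post: if the agent assigns probability on the order of \(2^{-n}\) to scenarios promising utility on the order of \(2^{n}\), the terms of the expected-utility sum never shrink, so the expectation diverges:

\[
\sum_{n=1}^{\infty} p_n u_n \;=\; \sum_{n=1}^{\infty} 2^{-n} \cdot 2^{n} \;=\; \sum_{n=1}^{\infty} 1 \;=\; \infty ,
\]

and with mugger-style payoffs like 3^^^^3 the divergence is only more severe.)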
The guy with the button could threaten to make an extra-planar factory farm containing 3^^^^^3 pigs instead of killing 3^^^^3 humans. If utilities are additive, that would be worse.
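(My own gloss on the arithmetic, assuming additive utilities with some fixed disutility \(\varepsilon > 0\) per farmed pig and a disutility of 1 per human killed: the extra up-arrow swamps any constant factor,

\[
\varepsilon \cdot 3\uparrow\uparrow\uparrow\uparrow\uparrow 3 \;\gg\; 1 \cdot 3\uparrow\uparrow\uparrow\uparrow 3
\qquad \text{for any } \varepsilon \text{ not itself inverse-tower-sized,}
\]

so the pig threat comes out worse.)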