Maybe you’re hyperbolically discounting that future pleasure and it’s outweighed by the temporary displeasure caused by agreeing to something abhorrent? ;)
I think that if an FAI scanned ArisKatsaris’ brain, extrapolated values from that, and was then instructed to extrapolate what a non-hyperbolically-discounting ArisKatsaris would choose, it would answer that ArisKatsaris would not choose to be rewired to receive pleasure from the end of mankind.
Of course, there’s no way to test such a hypothesis.