In a prisoners’ dilemma, especially one that fails the conditions needed for cooperation to be rational, everyone’s current utility function would be better-fulfilled in the future if everyone’s future self had a different utility function.
You might be able to fulfil your current utility function even better in the future if you fool everyone else into changing their utility functions and don’t change yours. But if this can be detected, and you will thus be excluded from society, it’s better, even by your current utility function, to adopt a new utility function. (You can tell yourself that you’re just precommitting to act differently. But I expect the most effective methods of precommitment will look just like methods for changing your utility function.)
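Here’s a minimal sketch of the first claim, with a standard but purely illustrative payoff matrix: if both players switch from a “my payoff only” utility function to one that weighs the joint payoff, the resulting equilibrium scores higher even when judged by the original utility function. The payoff numbers and the “sum of payoffs” modification are assumptions for illustration, not anything from the story.

```python
# Illustrative prisoners' dilemma: row player's payoffs.
# The game is symmetric, so the column player's payoff for (a, b) is PAYOFF[(b, a)].
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # cooperate against a defector (sucker's payoff)
    ("D", "C"): 5,  # defect against a cooperator (temptation)
    ("D", "D"): 1,  # mutual defection
}

def selfish_utility(own, other):
    """Original utility function: care only about your own payoff."""
    return PAYOFF[(own, other)]

def modified_utility(own, other):
    """Altered utility function (assumed for illustration): care about the joint payoff."""
    return PAYOFF[(own, other)] + PAYOFF[(other, own)]

def best_response(utility, other_action):
    """Action maximizing the given utility, holding the opponent's action fixed."""
    return max(["C", "D"], key=lambda a: utility(a, other_action))

def symmetric_equilibrium(utility):
    """Find an action that is a best response to itself (brute force over C and D)."""
    for a in ["C", "D"]:
        if best_response(utility, a) == a:
            return a

original = symmetric_equilibrium(selfish_utility)   # -> "D": mutual defection
altered = symmetric_equilibrium(modified_utility)   # -> "C": mutual cooperation

# Score each outcome by the *original* utility function:
print("Everyone keeps the original utility:", selfish_utility(original, original))  # 1
print("Everyone's future self is altered:", selfish_utility(altered, altered))      # 3
```

By the original, selfish utility function, the world where everyone has been altered scores 3 instead of 1, which is the sense in which everyone’s current utility function would be better-fulfilled if everyone’s future self had a different one.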
This is in the ‘normal ending’ of the babyeater story I linked to.
I read that story. I guess I was confused by the ending in which an entire populated star system was destroyed to prevent everyone from having their utility function changed.
I guess he might be referring to the Superhappies’ self-modification? I don’t know, I’m still struggling to understand how someone could read “True Ending: Sacrificial Fire” and think Bad End.
That’s what happens when you take “thou shalt not modify thine own utility function” as a religious tenet.
Wrong: the “religious tenet” is not “thou shalt not modify thine own utility function” but “minimize evil”. Of course, being changed to no longer care about minimizing evil is not conducive to minimizing evil.