> Human experience with desire satisfaction and “learning” and “growth” isn’t going to transfer over to how it is for paperclip maximizers
This is far from obvious. There are definitely people who claim that “morality” just is satisfying the preferences of as many agents as possible.
If morality evolved for game-theoretic reasons, there might even be something to this, although I personally think it’s too neat to endorse.