I’m confused by the section on pleasure. Isn’t the problem with paperclip maximizers that if they’re capable of feeling pleasure at all, they’ll feel it only while making paperclips?
If they concluded that pleasure is worth experiencing, they’d self-modify to feel more of it. Also, if we’re turned into paperclips but doing so makes the AI sufficiently happy, that seems good, at least if you agree with what I argue in the linked post.