I agree with Phil; all else equal I’d rather have whatever takes over be sentient. The moment to pause is when you make something that takes over, not so much when you wonder if it should be sentient as well.
Yeah, do we really want to give over control to a super-powerful intelligence that DOESN’T have feelings?
Er, yes? Feelings are evolution’s way of playing carrot-and-stick with the brain. You really do not want an AI that needs spanking, whether it’s you or an emotion module that does it: it’s apt to delete the spanker and award itself infinite cake.
Can you summarize your reasons, stipulating that we really want to give over control to a super-powerful intelligence at all, for why we should want it to have feelings?