Just because AGIs have the capability to inflict infinite torture doesn’t mean they have a motive. Also, the status quo (with regard to SIAI’s activity) is not one in which nothing new happens.
I explained that he is planning to supply one with a possible motive (namely that the CEV of humanity might hate me or people like me). It is precisely because of this that the problem arises. A paperclipper, or any other AGI whose utility function had nothing to do with humanity’s wishes, would have far less motive to do this—it might kill me, but it really would have no motive to torture me.