The argument is that, given an Oracle and an entity of limited intelligence that has goal G, we can construct a superintelligent agent pursuing G by having the limited-intelligence entity ask the Oracle how to achieve G.
But achieving G still might not be possible, in which case the Oracle will be of no help. That scenario only removes difficulties due to limited intelligence on the builders' part.
Negotiating with your copies is a much easier version of negotiating with other people.
I don’t have any copies I can interact with, so how can it be easy?
I still don’t see the big problem with MR. In other conversations, people have put it to me that MR is impossible because it is impossible to completely satisfy everyone’s preferences. It is indeed impossible to completely satisfy everyone’s preferences, but that is not something MR requires. It is fairly obvious that morality in general requires compromises and sacrifices, since we see that happening all the time in the real world.