It can choose not to cooperate, but it will only do so if that's what it wants to do. The genie I have described wants to cooperate. An AGI of any of the forms I have described would want to cooperate. Now, you can claim that I can't build an AGI with any easily controlled utility function at all, but that is a much harder claim.