Humans tend to place a high value on their own personal experiences—getting to do things that feel fun, acquiring personal status, putting intrinsic value on their own survival, and so on. This limits the extent to which they’ll co-operate with a group, since their own interests and the group’s interests only partially overlap. AIs with fewer personal interests would be better incentivized to coordinate: if you only care about the total number of paperclips in the universe, you can further that goal more effectively by working with others than if each AI were instead optimizing for the number of paperclips it personally got to produce.
Some academics argue that religion and similar institutions evolved to suppress individuals’ personal interests in favor of a common impersonal goal, partially getting around this problem. I discussed this and its connection to these matters a bit in my old post Intelligence Explosion vs. Co-operative Explosion.