Agreed on that last point particularly. Especially since, if they want similar enough things, they could easily cooperate without trade.
Like if two AIs supported Alice in her role as Queen of Examplestan, they would probably figure that quibbling with each other over whether Bob the gardener should have one or two buttons undone (just on the basis of fashion, not due to larger consequences) is not a good use of their time.
Also, the utility functions can differ as much as you want on matters that aren't going to come up. Like, Agents A and B might disagree on exactly how awful various bad things are, so long as both agree that they're all really quite bad and that all effort should be put toward preventing them.
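To make that concrete, here's a toy sketch (all the outcomes and numbers are made up, purely for illustration): two utility functions that disagree wildly about outcomes no available action can produce still pick the same action every time.

```python
# Toy illustration: agents A and B disagree about unreachable disasters,
# but rank every actually-available action identically. Hypothetical values.

utility_a = {"garden_tended": 10, "garden_neglected": 0, "plague": -1000, "asteroid": -500}
utility_b = {"garden_tended": 10, "garden_neglected": 0, "plague": -500, "asteroid": -1000}

# Actions on the table right now and the outcome each leads to.
# Note that neither "plague" nor "asteroid" is reachable from here.
actions = {"tend_garden": "garden_tended", "slack_off": "garden_neglected"}

def best_action(utility):
    """Pick the action whose resulting outcome has the highest utility."""
    return max(actions, key=lambda a: utility[actions[a]])

# The disagreement about the unreachable outcomes never matters:
assert best_action(utility_a) == best_action(utility_b) == "tend_garden"
```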