I think this is a confusion. Game theory is only meaningful after you have specified the utility functions of the players. If those utility functions don’t already include caring about other agents, the result is not what I’d call “morality”; it is just cooperation between selfish entities. Sure, the evolutionary reasons for morality have to do with cooperative game theory, but so what? The evolutionary reason for sex is reproduction; that doesn’t mean we shouldn’t have sex with condoms. Morality should not be derived from anything except human brains.
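To make the distinction concrete, here is one standard way of writing other-regarding preferences (the specific functional form and the weight α are illustrative assumptions, not something from the comment above):

$$U_i = u_i(x) + \alpha \sum_{j \neq i} u_j(x), \qquad 0 \le \alpha \le 1.$$

With α = 0 the agent is purely selfish, and any cooperation that emerges from the game is instrumental; with α > 0 caring about other agents is built into the utility function itself, which is the sense of “morality” being pointed at here.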
I think this disagreement is purely a matter of semantics: ‘morality’ is an umbrella term which is often used to cover several distinct concepts, such as empathy, group allegiance and cooperation. In this case, the AI would be moral according to one dimension of morality, but not the others.
Also, cooperation seems to be at least a large component of morality, and some believe morality should be derived entirely from game theory.