Clippy is defined as a paperclip maximizer. Humans require lots of resources to keep them alive. Those resources could otherwise be used for making more paperclips. Therefore Clippy would definitely not keep any human pets. I’m curious why you think splinter AI factions would. Could you say a bit more about how you expect splinter AIs to arise, and why you expect them to have a tendency towards keeping pets? Is it just that having many AIs makes it more likely that one of them will have a weird utility function?
In a single-single scenario, you are correct that it would be very unlikely for Clippy to behave in such a manner.
However, in a multi-multi scenario, which is akin to an iterated prisoner’s dilemma of random length with unknown starting conditions, the most likely ‘winning’ outcome would be some variation of tit-for-tat.
And tit-for-tat encourages perpetual cooperation as long as the parties are smart enough to avoid death spirals, which is again similar to human-pet relationships. A toy simulation of this dynamic follows.
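Here is a minimal sketch of the dynamic I mean, assuming standard prisoner’s dilemma payoffs and a per-round continuation probability (both values are my own illustrative choices, not anything canonical):

```python
import random

# Payoff matrix: (my_payoff, their_payoff) indexed by (my_move, their_move),
# where 'C' = cooperate, 'D' = defect. Standard PD ordering: T > R > P > S.
PAYOFFS = {
    ('C', 'C'): (3, 3),
    ('C', 'D'): (0, 5),
    ('D', 'C'): (5, 0),
    ('D', 'D'): (1, 1),
}

def tit_for_tat(my_history, their_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return their_history[-1] if their_history else 'C'

def always_defect(my_history, their_history):
    return 'D'

def play(strategy_a, strategy_b, continue_prob=0.97, rng=None):
    """Iterated PD of random length: after each round the game continues
    with probability `continue_prob`, so neither party knows the horizon."""
    rng = rng or random.Random(0)
    hist_a, hist_b = [], []
    score_a = score_b = 0
    while True:
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
        if rng.random() > continue_prob:
            return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation compounds every round
print(play(tit_for_tat, always_defect))  # defection caps both near the punishment payoff
```

The unknown horizon is what matters: with a fixed, known endpoint, defection unravels backwards from the final round, but with random length the expected value of continued cooperation dominates.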
Of course it’s not guaranteed that any multi-multi situation will in fact arise, but I haven’t seen any convincing disproof, nor any reason why it should not be treated as the default. The most straightforward reason to expect it is that the light-speed limit on communications guarantees eventual value drift for even the mightiest hypothetical AGI.
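As a back-of-envelope illustration of the latency involved (the destinations are just examples; only the distances are real figures), any correction loop between a central AGI and a remote subsidiary runs on these timescales at minimum:

```python
# Round-trip communication delay at light speed, in years.
# A light-year is the distance light covers in one year, so a query plus
# response across d light-years takes at least 2 * d years.
destinations_ly = {
    "Alpha Centauri": 4.37,   # nearest star system
    "Sirius": 8.6,
    "galactic core": 26_000,  # approximate distance to Sagittarius A*
}

for name, distance_ly in destinations_ly.items():
    round_trip_years = 2 * distance_ly
    print(f"{name}: at least {round_trip_years:,.1f}-year round trip")
```

Decades to millennia between correction cycles leaves ample room for a subsidiary’s goals to diverge from the original’s, hence splinter factions.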
No one on LW, or in the broader academic community as far as I’m aware, has yet managed to present a foolproof argument, or even one convincing on the balance of probabilities, for why single-single outcomes are more likely than multi-multi.