I would like to think that I would cooperate reasonably with my copies, especially when there is a strong reason to prioritize global values over selfish values.
However, in practice I would also expect that System 1 would still see copies as separate but related individuals rather than as myself, and this would limit how much cooperation actually occurs. I might have to engage in some self-deceptive reasoning to justify acting selfishly, but the human brain is good at that ("I've been working harder than my copies; I deserve a little extra!").