Even if I’m certain I will not defect, I am capable of asking what would happen if I did.
Yes, and part of the answer is “If I did defect, my clone would also defect.” You have a guarantee that both of you take the same action, because your reasoning is precisely identical.
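A toy version of that argument in Python (the payoff numbers are standard illustrative PD values, my assumption rather than anything from the thread):

```python
# Illustrative Prisoner's Dilemma payoffs for the row player.
# These specific numbers are an assumption for the sketch.
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, opponent defects
    ("D", "C"): 5,  # I defect, opponent cooperates
    ("D", "D"): 1,  # mutual defection
}

def best_response_payoff(my_action, opponent_action):
    """Classical reasoning treats the opponent's action as fixed."""
    return PAYOFF[(my_action, opponent_action)]

def clone_payoff(my_action):
    """Against a perfect clone, the opponent's action is guaranteed to
    equal mine, so only the diagonal outcomes are reachable."""
    return PAYOFF[(my_action, my_action)]

# Classical view: whatever the opponent does, D beats C...
for opp in ("C", "D"):
    assert best_response_payoff("D", opp) > best_response_payoff("C", opp)

# ...but against a clone, the counterfactual "what if I defected?"
# changes the opponent's action too, and C wins.
assert clone_payoff("C") > clone_payoff("D")
print("Clone payoffs:", clone_payoff("C"), "vs", clone_payoff("D"))
```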
What do you think will happen if clones play the centipede game?
Unclear; it depends on the specific properties of the person being cloned. Unlike the PD, the two players aren’t in the same situation, so they can’t simply rely on their logic being the same as their counterpart’s. How closely the outcome approaches the TDT ideal of ‘Always Push’ will depend on how luminous the person is: if they can model what they would do in the opposite seat, and are highly confident that this self-model is correct, they can reach the best result; but if they lack confidence that they know what they’d do, the winning cooperation is harder to achieve.
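For contrast, here is what classical backward induction predicts in a simple doubling centipede; the Rosenthal-style pile payoffs below are an assumption for the sketch, not something specified in the thread:

```python
def backward_induction(k, n):
    """Subgame-perfect play at node k of an n-node doubling centipede.

    Assumed payoff scheme (Rosenthal-style, chosen for illustration):
    at node k the piles are (2**(k+1), 2**(k-1)), and taking gives the
    mover the big pile. If the final node is passed, the piles double
    once more and the big pile goes to the *other* player.
    Returns (mover_payoff, other_payoff).
    """
    take = (2 ** (k + 1), 2 ** (k - 1))
    if k == n:
        passed = (2 ** n, 2 ** (n + 2))         # final pass favours the opponent
    else:
        opp, me = backward_induction(k + 1, n)  # roles swap after a pass
        passed = (me, opp)
    return take if take[0] > passed[0] else passed

# Classical reasoning unravels the whole game: player 1 takes immediately...
print(backward_induction(1, 10))  # -> (4, 1)
# ...even though pushing to the end pays both players far more: after a
# final pass the mover at node 10 gets 2**10 = 1024 >> 1, and the other
# player gets 2**12 = 4096 >> 4.
```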
Of course, if it’s denominated in money and is 100 steps of doubling, as implied by the Wikipedia page, then the difference in utility between $1 nonillion and $316 octillion is so negligible that there’s essentially no incentive to defect in the last round, and any halfway-reasonable person will Always Push straight through the game. But that’s a degenerate case, and probably not the version originally discussed.
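To make the magnitudes concrete, a quick arithmetic check (the $1 starting pot and the log-utility curve are assumptions for illustration; the thread only gives the rounded dollar figures):

```python
import math

# Assumed setup: a $1 pot that doubles at each of 100 steps.
final_pot = 2 ** 100    # ~1.27e30 dollars: roughly the "$1 nonillion"
late_defect = 2 ** 98   # ~3.17e29 dollars: roughly the "$316 octillion"
print(f"{final_pot:.2e} vs {late_defect:.2e} "
      f"(ratio {final_pot / late_defect:.0f}x)")

# Under any concave utility of money the gap shrinks sharply; with log
# utility (an illustrative assumption), the entire last-round stake is
# worth about as much as quadrupling a $1 bankroll, dwarfed by the
# utility already secured by reaching the late rounds at all.
gap = math.log(final_pot) - math.log(late_defect)
print(f"log-utility gap: {gap:.2f} "
      f"(vs ~{math.log(late_defect):.0f} already secured)")
```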