This is problematic, in that it depends a lot on what I know about the person I’m playing with. If it’s a total stranger, I’ll probably defect; if it’s a copy of me, or someone I think is committed to superrationality, I’ll probably cooperate.
Also, just “True Prisoner’s Dilemma” is pretty vague—the actual rewards and penalties matter. I’m a lot more inclined to cooperate if the cost of my opponent defecting is “lose this game of Diplomacy” rather than “be tortured for 50 years”.
I had these same objections, but I assumed he was referencing this particular formalization.
Ah, I’d forgotten that one. That one’s problematic as well, though, given that different people will value “2 billion lives saved” differently. Even ignoring issues of scope insensitivity when we’re talking about numbers on this scale, for sufficiently selfish people saving other people’s lives is less of a dilemma than it would be if there were a personal cost to themselves. Or negative utilitarians might consider it a good thing if there were fewer humans on Earth, though you could possibly fix this by specifying that the disease kills slowly and painfully and causes more suffering than if the people lived normal lives, or something.
Make the 2 billion lives saved into 2000 lives saved plus two kicks in the groin.
Yes, it depends entirely on who you’re playing against. If it’s a rock you obviously defect; if it’s a copy of yourself you obviously cooperate; and somewhere between those two it switches from one to the other. The True Prisoner’s Dilemma is underspecified.
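To put a rough number on where that switch happens, here’s a minimal sketch (my own illustration, with assumed payoff values, not anything from the exchange above): model the opponent as mirroring your move with probability p and otherwise playing a fixed defect, like the rock, with conventional payoffs R = 3 for mutual cooperation, P = 1 for mutual defection, and S = 0 for cooperating against a defector. Under those assumptions, cooperating becomes the better bet once p exceeds (P - S) / (R - S), which is 1/3 for these numbers.

```python
def expected_payoffs(p, R=3, P=1, S=0):
    """Expected payoff of each move against an opponent that mirrors my
    move with probability p and otherwise defects like a rock.
    Payoff values here are illustrative assumptions, not from the thread."""
    cooperate = p * R + (1 - p) * S  # mirrored -> (C, C) = R; otherwise (C, D) = S
    defect = P                       # mirrored or not, the outcome is (D, D) = P
    return cooperate, defect

# Cooperation wins once p > (P - S) / (R - S); with these payoffs, p > 1/3.
for p in (0.0, 0.25, 1 / 3, 0.5, 1.0):
    c, d = expected_payoffs(p)
    print(f"p={p:.2f}  E[cooperate]={c:.2f}  E[defect]={d:.2f}")
```

In these terms, a rock sits at p = 0 (defect), a copy of you sits at p = 1 (cooperate), and the disagreement in the thread is really about where on that axis a stranger, or someone committed to superrationality, actually falls.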