Maybe I don’t have to know that other skewed dilemmas are in fact happening. Maybe I just have to know that they could be happening. Or that they could have happened. Maybe it’s enough to know a coin was flipped to determine in whose favor the dilemma is skewed, for example.
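A minimal sketch of that ex-ante symmetry point, with made-up payoff numbers (nothing here is from the original discussion): if a fair coin decides which side the skew favors, then before the flip both players face the same expected payoff, even though every realized dilemma is lopsided.

```python
import random

# Hypothetical skewed payoffs for mutual cooperation (illustrative numbers only).
# In the "A-favored" version cooperation pays A much more than B;
# the "B-favored" version is the mirror image.
A_FAVORED = {"A": 10, "B": 2}
B_FAVORED = {"A": 2, "B": 10}

def sample_dilemma():
    """Flip a fair coin to decide in whose favor the dilemma is skewed."""
    return A_FAVORED if random.random() < 0.5 else B_FAVORED

# Ex ante, before the coin is flipped, both players expect the same payoff,
# even though any single dilemma they actually face is lopsided.
trials = 100_000
totals = {"A": 0.0, "B": 0.0}
for _ in range(trials):
    d = sample_dilemma()
    totals["A"] += d["A"]
    totals["B"] += d["B"]
print(f"expected payoff to A: {totals['A'] / trials:.2f}, "
      f"to B: {totals['B'] / trials:.2f}")  # both come out near 6
```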
What evidence do you have to believe things are balanced? All you know is that one skewed situation exists. What evidence leads you to believe that other situations exist that are skewed roughly equally in the opposite direction? It’s irrational to end up with the worst possible outcome in a PD because there might, in theory, be other PDs in which, if your opponent did what you did, you would benefit.
For what I think is a completely unexaggerated analogy: it is theoretically possible that every time I eat a banana, some entity horribly tortures an innocent person. It could happen. Absent any actual evidence that it does, my banana consumption will not change. You should not change your behaviour in a PD because it’s theoretically possible that other PDs exist with oppositely skewed outcomes.
As for the counterfactual mugging, Yvain will never do it, unless he’s an eccentric millionaire, because he’d lose a fortune. For any other individual, you would need substantial evidence before you would trust them.
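A rough back-of-the-envelope calculation of why the would-be mugger loses money in expectation, assuming the commonly quoted $100 / $10,000 stakes (the amounts here are an assumption, not something stated above):

```python
# Counterfactual mugging from the mugger's side: a fair coin is flipped;
# on tails the mugger asks for $100, on heads the mugger pays out $10,000
# to an agent who would have paid on tails.
ask_on_tails = 100       # what the mugger collects when the coin lands tails
pay_on_heads = 10_000    # what the mugger hands over when it lands heads

expected_take = 0.5 * ask_on_tails - 0.5 * pay_on_heads
print(expected_take)  # -4950.0: the mugger loses about $4,950 per encounter
```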
As for precommitment, the lack of an ability to credibly precommit is one of the essential elements of a prisoner’s dilemma. If the prisoners could make an enforceable contract not to snitch, it’d be easy to end up at the optimal outcome.
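A minimal illustration of that last point, using standard textbook-style payoff numbers (illustrative only): without an enforceable contract, defection is each prisoner's best reply to anything the other does, so the outcome lands at mutual defection; a binding contract not to snitch removes that option and leaves the mutually preferred outcome.

```python
# A standard prisoner's-dilemma payoff table (illustrative numbers):
# each entry is (row player's payoff, column player's payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def best_reply(opponent_move):
    """Without a binding contract, each prisoner simply maximizes
    their own payoff against the opponent's move."""
    return max(["cooperate", "defect"],
               key=lambda m: PAYOFFS[(m, opponent_move)][0])

# Defection is a dominant strategy: it is the best reply to either move,
# so without credible precommitment both prisoners end up at (1, 1).
print(best_reply("cooperate"), best_reply("defect"))  # defect defect

# An enforceable contract not to snitch removes "defect" from the strategy
# set, leaving the mutually preferred (3, 3) outcome.
```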
What evidence do you have to believe things are balanced?
What evidence do you have to believe that things are 1) unbalanced 2) in your favor?
You don’t know what kinds of PDs you’re going to encounter, so you prepare for all of them by setting up the appropriate precommitments, if your decision theory requires precommitments. If it doesn’t, you’ll just figure out, on the fly, what you would have wanted to precommit to doing, and do it.
Credibility is indeed assumed in these problems. If you can’t verify that the other player really has made the precommitment or really is a UDT kind of guy, you can’t take advantage of this kind of coordination.