Assume I am the type of person who would always cooperate with my clone. If I asked myself the question "If I defected, would my payoff be higher or lower than if I cooperated, even though I know I will always cooperate?", what would the answer be?
The answer would be 'Mu'. Or 'MOO', or 'moot'; they're equivalent here. The question amounts to "In this impossible counterfactual where I am self-contradictory, what would happen?"
Yes, it makes a little sense to counterfactually reason that you would get $1000 more by defecting, but that reasoning is predicated on the assumption that you always cooperate, and hence that your clone cooperates too. You cannot actually collect that "free" $1000, because actually defecting would violate the very assumption the counterfactual rests on.
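A minimal sketch can make the point concrete. The payoff numbers below are illustrative assumptions (not given in the original), chosen only so that defecting pays $1000 more than cooperating when the other player's move is held fixed. The key structural fact is that your clone runs the same decision procedure you do, so the mixed outcomes (C, D) and (D, C) are simply unreachable for you:

```python
# Illustrative payoffs (assumed): defecting beats cooperating by $1000
# when the other player's move is held fixed.
PAYOFF = {
    ("C", "C"): 3000,  # both cooperate
    ("C", "D"): 0,     # you cooperate, clone defects (unreachable vs. a clone)
    ("D", "C"): 4000,  # the tempting "free $1000" over (C, C) (also unreachable)
    ("D", "D"): 1000,  # both defect
}

def clone_game(policy):
    """Play against a clone: the clone runs the same policy, so its move
    is always identical to yours. Holding the clone's move fixed while
    changing your own is the self-contradictory counterfactual."""
    my_move = policy()
    clone_move = policy()  # same decision procedure, same output
    return PAYOFF[(my_move, clone_move)]

always_cooperate = lambda: "C"
always_defect = lambda: "D"

print(clone_game(always_cooperate))  # 3000: the only outcome a cooperator can reach
print(clone_game(always_defect))     # 1000: "grabbing" the $1000 flips both moves
```

The point the sketch makes: changing your own move necessarily changes the clone's move as well, so the $4000 cell is never actually on the table.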