Many of the examples are not like Newcomb’s problem, but are just rephrased to use timeless-decision-theory-esque language. To be like Newcomb’s problem, you don’t just want TDT to work; you want causal decision theory to fail, i.e. you want the problem to be decision-determined but not action-determined. This is more exotic, but may be approximated by situations where other people are trying to figure out your real feelings.
CDT doesn’t statistically win in these examples: in all cases, if you reason only from what your actions cause, you will be in a world where communication is statistically harder, transaction costs are statistically higher, etc. Newcomb’s problem only differs in the certainty of this relationship.
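To make the “only differs in certainty” point concrete, here is a rough expected-value sketch (my own toy setup, not from the examples above): a predictor that is right with probability p, and the standard $1,000,000 / $1,000 boxes. One-boxing already beats two-boxing in expectation once p clears roughly 0.5005, and the classic problem is just the p → 1 limit.

```python
# A toy sketch, assuming the standard Newcomb payoffs and a predictor of
# accuracy p. This is the evidential expected-value calculation; the point
# is only that one-boxing wins statistically for any p above a modest
# threshold, and full certainty is just the limiting case.

def expected_payoff(one_box: bool, p: float) -> float:
    """Expected dollars, given the predictor is right with probability p."""
    big, small = 1_000_000, 1_000
    if one_box:
        # The opaque box is filled iff the predictor (correctly) foresaw one-boxing.
        return p * big
    # A two-boxer always gets the small box; the big box is filled only
    # if the predictor (incorrectly) foresaw one-boxing.
    return small + (1 - p) * big

for p in (0.5, 0.55, 0.9, 0.99, 1.0):
    print(f"p={p:.2f}  one-box: {expected_payoff(True, p):>12,.0f}"
          f"  two-box: {expected_payoff(False, p):>12,.0f}")
```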