a prisoner's dilemma between identical instances of your decision procedure (that's what Newcomb's problem is)
I’m not so sure. The output of your decision procedure is the same as the output of Omega’s prediction procedure, but that doesn’t tell you how algorithmically similar they are.
Well, if you are to use causal decision theory, you must also be in a causal world (or at least assume you are in one), and in a causal world, correlation of Omega's decisions with yours implies either coincidence or some causation: either Omega's choices cause your choices, your choices cause Omega's choices, or there is a common cause of both. The common cause could be the decision procedure itself, or the childhood event that made you adopt that decision procedure, etc. In the latter case, it's not even a question of decision theory. The choice of box has already been made: by chance, by your parents, or by whoever convinced you to one-box or two-box. From that point on, it has been mechanistically propagating by the laws of physics, affecting both Omega and you (and even before that point, it had been mechanistically propagating ever since the Big Bang).
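The common-cause case can be sketched in a few lines of Python. This is an illustrative toy, not anyone's actual model of Omega: the decision procedure (here just a seeded function; the seed stands in for whatever fixed it, genes, upbringing, persuasion) is the common cause, Omega predicts by running a copy of it, and the two outputs correlate perfectly even though neither causes the other.

```python
import random

def decision_procedure(seed):
    # The "common cause": a procedure fixed before the game begins.
    # The seed stands in for whatever determined it (upbringing, etc.).
    rng = random.Random(seed)
    return "one-box" if rng.random() < 0.5 else "two-box"

def omega_predicts(seed):
    # Omega predicts by running a copy of the agent's procedure.
    return decision_procedure(seed)

def agent_chooses(seed):
    # The agent "decides" by running the very same procedure.
    return decision_procedure(seed)

# Over many hypothetical agents, prediction matches choice every time:
matches = sum(omega_predicts(s) == agent_chooses(s) for s in range(1000))
print(matches)  # 1000: perfect correlation, with no causal arrow
                # from the choice to the prediction or vice versa
```

The point of the toy is that conditioning on the seed screens off the correlation, which is exactly the common-cause structure described above.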
The huge problem with applying decision theory is the implicit idea of an immaterial soul that does the deciding however it wishes. That's not how things are: decisions have causes. Using causal decision theory together with the idea of an immaterial soul that decides from outside the causal universe leads to a fairly inconsistent world.