It’s only a loop in imaginary Platonia. In the real world, the laws of physics don’t notice that there’s a “loop”. One way to see the problem: it is a situation in which the semantics we usually employ to think about the real world fail to adequately account for it.
Too opaque.
Alas, yes. I’m working on that.
If it’s a loop in Platonia, then all causation happens in Platonia. If any causation can be said to happen in the real world, then real causation is happening backwards in time in the Newcomb scenario.
But I, for one, have no problem with that. All causal processes observed so far have run in the same temporal direction, but there’s no reason to rule out a priori the possibility of exceptions.
ETA: Nor to rule out loops.
I don’t see why Newcomb’s paradox breaks causality. It seems more accurate to say that both events are caused by an earlier common cause: your predisposition to choose a particular way. Both Omega’s prediction and your action are caused by this predisposition, so Omega’s prediction is merely correlated with, not a cause of, your choice.
It’s commonplace for an event A to cause an event B, with both sharing a third antecedent cause C. (The bullet’s firing causes the prisoner to die, but the finger’s pulling of the trigger causes both.) Newcomb’s scenario has the added wrinkle that event B also causes event A. Nonetheless, both still have the antecedent cause C that you describe.
All of this only makes sense under the right analysis of causation. In this case, the right analysis is a manipulationist one, such as that given by Judea Pearl.
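A minimal Python sketch (all names hypothetical) of this common-cause structure, read manipulationistically: intervening on the predisposition C changes both Omega’s prediction A and your choice B, which is what makes C a cause of both in Pearl’s sense.

```python
import random

# Toy structural model (hypothetical): c is your predisposition,
# a is Omega's prediction, b is your actual choice. Both a and b
# are functions of c alone.
def sample(do_c=None):
    # do_c implements a Pearl-style intervention on C.
    c = do_c if do_c is not None else random.choice(["one-box", "two-box"])
    a = c  # the prediction reads off the predisposition
    b = c  # the choice follows the same predisposition
    return a, b

# Setting C moves both downstream events, so C causes both.
assert sample(do_c="one-box") == ("one-box", "one-box")
assert sample(do_c="two-box") == ("two-box", "two-box")
```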
Newcomb’s scenario has the added wrinkle that event B also causes event A

I don’t see how. Omega doesn’t make the prediction because you made the action; he makes it because he can predict that a person of a particular mental configuration at time T will make decision A at time T+1. If I were to play the part of Omega, I couldn’t achieve perfect prediction, but I might achieve, say, 90% by studying what people say they will do on blogs about Newcomb’s paradox and then observing what such people actually do (so long as my decision criteria weren’t known to the person I was testing).
Am I violating causality by doing this? Clearly not: my prediction is caused by the blog post and my observations, not by the action. The same thing that causes you to say you’d decide one way also causes you to act that way. As I get better and better, nothing changes, and I don’t see why anything would change if I could simulate you perfectly and achieve 100% accuracy (some degree of determinism is assumed there, but it’s already built into the original thought experiment if we take the predictor to be literally 100% accurate).
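The imperfect-predictor version can be made concrete with a toy simulation (the numbers and names here are mine, not part of the thought experiment): the predictor sees only a noisy signal of each agent’s disposition, and its accuracy is set by the noise in that signal, not by any influence flowing backward from the action.

```python
import random

random.seed(0)

def trial(noise=0.1):
    # The agent's disposition causes both what the predictor observes
    # and what the agent eventually does.
    disposition = random.choice(["one-box", "two-box"])
    other = "two-box" if disposition == "one-box" else "one-box"
    # Predictor's evidence: the disposition, misread with probability `noise`.
    signal = disposition if random.random() > noise else other
    prediction = signal        # caused by the observation
    action = disposition       # caused by the same disposition
    return prediction == action

# Accuracy tracks (1 - noise); driving the noise toward zero recovers
# the "perfect predictor" limit without any backward causation.
accuracy = sum(trial(noise=0.1) for _ in range(10_000)) / 10_000
```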
Assuming I’m understanding it correctly, the same holds under a manipulationist definition. If we manipulate your mental state, we change both the prediction (assuming Omega factors in this manipulation) and the decision, so your mental state is a cause of both. However, if we could manipulate your action directly, without changing the mental state that Omega reads, our manipulation would not change the prediction. In practice this may be impossible (it requires Omega not to factor in our manipulation, which contradicts the assumption that he is a perfect predictor), but in principle it seems valid.
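That manipulationist test can be sketched directly (a hypothetical model, not a claim about the real scenario): surgically forcing the action while holding the mental state fixed leaves the prediction alone, while changing the mental state moves both.

```python
# Hypothetical three-variable model: c is the mental state, the
# prediction reads c, and the action defaults to c unless we
# intervene on it directly via do_b.
def model(c, do_b=None):
    prediction = c                            # Omega reads the mental state only
    action = do_b if do_b is not None else c  # forced action, if intervened on
    return prediction, action

# Forcing the action without touching c does not move the prediction:
# under this reading, the action is not a cause of the prediction.
assert model("one-box", do_b="two-box") == ("one-box", "two-box")
# Changing the mental state moves both.
assert model("two-box") == ("two-box", "two-box")
```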