In the case of probability of ordinary observations, I think you can assign probabilities if your preferences over possible strategies satisfy some conditions, the major one being that what you prefer to happen in one branch has to be independent of what you prefer to happen in another branch, i.e., the Axiom of Independence. If we ignore counterfactual-mugging type considerations, do you see any problems with this? If so, can you give an example?
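For reference, this is the standard von Neumann–Morgenstern form of Independence, with A, B, C ranging over lotteries; it is my gloss, not necessarily the exact condition you have in mind:

$$A \succeq B \iff pA + (1-p)\,C \succeq pB + (1-p)\,C \quad \text{for all } C \text{ and all } p \in (0,1].$$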
This is exactly the difference that allows you to have the 6-element state space, as in the example with indexical uncertainty above, instead of the more general 9-element state space. You place the possibilities in one branch side by side with the possibilities in the other branch, instead of considering all possible combinations of possibilities. It’s easy to represent various situations to which you assign probability as alternatives lying side by side in the state space: since alternatives in different possible worlds, or counterfactuals, never “interact”, it seems right to model them simply as options, independently. The same goes for two physical systems that don’t interact with each other: what’s the difference between that and being in different possible worlds? And indexical uncertainty is a special case of this situation. One condition for doing this without problems is independence. But independence isn’t really true; it’s an approximation.
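A toy sketch of the counting point, assuming (as the 6 and 9 suggest) that each of the two branches has three possibilities; the branch and outcome labels are made up for illustration:

```python
from itertools import product

# Hypothetical possibilities in each of two non-interacting branches.
branch_a = ["a1", "a2", "a3"]
branch_b = ["b1", "b2", "b3"]

# Side-by-side state space: a state says which branch you are in and what
# happens there -- 3 + 3 = 6 states.
side_by_side = [("A", a) for a in branch_a] + [("B", b) for b in branch_b]

# General state space: a state fixes what happens in *both* branches at
# once -- 3 * 3 = 9 states.
combined = list(product(branch_a, branch_b))

print(len(side_by_side), len(combined))  # 6 9
```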
It’s trivial to set up situations equivalent to counterfactual mugging if the participants are computer programs that don’t run very far. It’s possible to prove things about where such a program can go, and to perform actions depending on the conclusion. What do you do then? I don’t know yet; your comment only brought up the idea of the meaninglessness of probability of ordinary observations yesterday, and before that I didn’t notice this issue. Maybe I’ll finally find a situation where prior+utility isn’t an adequate way of representing preference, or maybe there is a good way of lifting probability of observations to probability of strategies.
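A minimal sketch of what such a program-level setup could look like, with “proving what the agent does” replaced by simply running its source, and with the usual counterfactual-mugging payoffs filled in as placeholders:

```python
def agent(coin):
    """The participant as a short, inspectable program.
    Returns 'pay' or 'refuse' when asked for $100 on tails."""
    return "refuse"

def omega(coin, agent_program):
    """Counterfactual mugging: on tails, ask the agent for $100; on heads,
    pay $10000 only if the agent *would have* paid on tails.  Here the
    'proof' about the agent's behaviour is just a direct simulation."""
    if coin == "tails":
        return -100 if agent_program("tails") == "pay" else 0
    would_have_paid = agent_program("tails") == "pay"
    return 10000 if would_have_paid else 0

# Expected payoff over a fair coin for this particular (non-paying) agent:
print(sum(omega(c, agent) for c in ("heads", "tails")) / 2)  # 0.0
```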