As for decision theory, I think that the “logical” counterfactuals should supplement, not supplant, the physical counterfactuals.
I understand that many people share this opinion, but I don’t see much in the way of justification. What arguments do you have, besides intuition? I see no reason to expect the “correct” decision theory to be intuitive. Game theory, for example, isn’t.
“Logical” counterfactuals either mean those whose antecedents are logically impossible, or those whose consequents follow from the antecedents by logical necessity, or both. In your Newcomb’s-problem-solving algorithm example, both features apply.
But there are many decisions (or sub-problems within decisions) where neither feature applies. Where the agent is a human being without access to its own “source code”, the agent commits no contradiction in supposing “if I did A… but if I did B…” even though at most one of these lies in the future that causality is heading toward. Additionally, one may be unable to prove that the expected utility of A would be U1 and the expected utility of B would be U2, even though one may reasonably believe both of these. Even when we know most of the relevant physical laws, we lack knowledge of the relevant initial or boundary conditions, and even if we had those, we would still lack the time to do the calculations.
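The epistemic situation described here can be made concrete with a small sketch. The probabilities and utilities below are purely illustrative assumptions: the agent assigns credences to outcomes under each candidate action without deriving them from physical law, and compares expected utilities on that basis.

```python
# A minimal sketch of the epistemic situation described above: an agent
# holds (non-deduced) probabilistic beliefs about the outcomes of each
# candidate action and compares expected utilities. All numbers here are
# hypothetical assumptions, not derived from any physical model.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# Hypothetical beliefs about actions A and B.
beliefs = {
    "A": [(0.7, 10.0), (0.3, -5.0)],
    "B": [(0.5, 8.0), (0.5, 1.0)],
}

eu = {action: expected_utility(outs) for action, outs in beliefs.items()}
best = max(eu, key=eu.get)
print(eu, best)  # A has expected utility 5.5, B has 4.5, so A is chosen
```

The point is not that the agent can prove EU(A) = 5.5; it is that the agent can reasonably believe the inputs, and so reasonably believe the conclusion, without anything resembling a deduction from physics.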
Our knowledge of physical counterfactuals isn’t deductive, usually, but it’s still knowledge. And we still need to know the physical consequences of various actions—so we’ll have to go on using these counterfactuals. Of course, I just used another intuition there! I’m not concerned, though—decision theory and game theory will both be evaluated on a balance of intuitions. That, of course, does not mean that every intuitively appealing norm will be accepted. But those that are rejected will lose out to a more intuitively appealing (combination of) norm(s).
In a previous comment, I said that physical counterfactuals follow trivially from physical laws, giving an ideal-gas example. But now I just said that our knowledge of physical counterfactuals is usually non-deductive. Problem? No: in the previous comment I remarked on the relation between truths in attempting to show that counterfactuals needn’t be metaphysical monsters. In this comment I have been focusing on the agent’s epistemic situation. We can know to a moral certainty that a causal relationship holds without being able to state the laws involved and without being able to make any inference that strictly deserves the label “deduction”.
Benja’s got it: I’m interested in physical counterfactuals. They are the type that is involved in the everyday notion of what a person “could” do.