it means it is far too strict to require that our decisions all cause a future benefit; we need to count acausal “consequences” (SAMELs) on par with causal ones (CaMELs)
OK, so this may be a completely stupid question, as I’m a total newbie to decision-theory-ish issues… but couldn’t you work a non-zero weighting of SAMELs into a decision theory, without abandoning consequentialism, by reformulating “causality” in an MWI-ish, anthropic kind of way, where an action counts as causally linked to a consequence if it increases the number of worlds in which that consequence exists? Then SAMELs become not-really-acausal, and you could view winning at PH (Parfit’s Hitchhiker) as simply maximising the number of worlds in which your utility is maximised.
I’m probably making some basic errors of terminology (and reasoning) here, as I’m operating well outside of my comfort zone on this subject matter, so if I’m wrong, or not making any sense, please be gentle in explaining why.
I think my (at the time misdirected) comment here is most responsive to your question. In short, causality has a narrow, technical definition here, one which corresponds with wide (but not universal) usage. I see nothing wrong with regarding SAMELs as consequences, or with saying that e.g. one-boxing causes the sealed box to be filled, but that is incorrect under the standard game-theoretic usage of the terms.
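For concreteness, here is a minimal sketch of the “count the worlds” rule in a Newcomb-style problem, versus a strictly causal rule that only counts what the choice physically brings about. The predictor accuracy and payoff numbers are my own illustrative assumptions, not anything specified in this thread; the point is just that weighting the predictor’s correlated behaviour (a SAMEL) recommends one-boxing, while the strictly causal count recommends two-boxing.

```python
# Minimal sketch. Assumptions (not from the thread): Newcomb-style payoffs
# and a predictor whose prediction matches your choice with probability p.

PREDICTOR_ACCURACY = 0.99   # assumed accuracy p of the predictor
BIG = 1_000_000             # contents of the sealed (opaque) box if filled
SMALL = 1_000               # contents of the transparent box

def world_counting_value(action: str) -> float:
    """Weight each 'world' by how likely it is given your action,
    i.e. count the predictor's correlated behaviour (a SAMEL) as a
    consequence on par with causal ones."""
    p = PREDICTOR_ACCURACY
    if action == "one-box":
        return p * BIG + (1 - p) * 0
    return p * SMALL + (1 - p) * (BIG + SMALL)

def strictly_causal_value(action: str, box_already_filled: bool) -> float:
    """Count only what the action physically causes: the sealed box's
    contents were fixed before the choice, so they are not its doing."""
    filled = BIG if box_already_filled else 0
    return filled + (SMALL if action == "two-box" else 0)

if __name__ == "__main__":
    for a in ("one-box", "two-box"):
        print(a, "world-counting value:", world_counting_value(a))
    # Under the strictly causal rule, two-boxing dominates whether or not
    # the (already fixed) sealed box happens to be full:
    for filled in (True, False):
        print("filled" if filled else "empty",
              {a: strictly_causal_value(a, filled)
               for a in ("one-box", "two-box")})
```

Running it, the world-counting rule assigns the higher value to one-boxing, while the causal tally makes two-boxing dominant in both fixed states of the box, which is exactly the gap between counting and not counting SAMELs.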