Everything is possible, but not everything has the same measure (is equally likely). Killing someone in 10% of “worlds” is worse than killing them in 1% of “worlds”.
In the end, believing in many worlds will give you the same results as believing in collapse. It’s just that, epistemologically, the believer in collapse needs to deal with the problem of moral luck. Does “having a 10% probability of killing someone, and actually killing them” make you a worse person than “having a 10% probability of killing someone, but not killing them”?
(From many-worlds perspective, it’s the same. You simply shouldn’t do things that have 10% risk of killing someone, unless it is to avoid even worse things.)
(And yes, there is the technical problem of how exactly you determine that the probability was exactly 10%, considering that you don’t see the parallel “worlds”.)
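The measure-weighted view above amounts to ordinary expected-value reasoning: an action’s disvalue is the harm weighted by the fraction of “worlds” (the measure) in which the harm occurs. A minimal sketch, with purely illustrative numbers and a hypothetical `expected_disvalue` helper:

```python
def expected_disvalue(p_harm: float, harm: float) -> float:
    """Disvalue of an action = measure (probability) of the bad branch
    times how bad that branch is. Identical arithmetic whether p_harm
    is read as branch measure (MWI) or as ordinary chance (collapse)."""
    return p_harm * harm

HARM_OF_DEATH = 100.0  # arbitrary units, purely illustrative

risky = expected_disvalue(0.10, HARM_OF_DEATH)  # kills in 10% of worlds
safer = expected_disvalue(0.01, HARM_OF_DEATH)  # kills in 1% of worlds

# The 10%-risk action carries ten times the expected disvalue of the
# 1%-risk action, so it takes a tenfold-larger benefit to justify it.
print(risky, safer)
```

This is not a claim about how to assign the numbers, only that once the measures are fixed, both interpretations rank the actions the same way.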
> Everything is possible, but not everything has the same measure (is equally likely). Killing someone in 10% of “worlds” is worse than killing them in 1% of “worlds”.
Apart from the other problem: MWI is deterministic, so you can’t alter the percentages by any kind of free will, despite what people keep asserting.
> Does “having a 10% probability of killing someone, and actually killing them” make you a worse person than “having a 10% probability of killing someone, but not killing them”?
Actually killing them is certainly worse. We place moral weight on actions as well as character.
> MWI is deterministic, so you can’t alter the percentages by any kind of free will, despite what people keep asserting.
Neither most collapse-theories nor MWI allow for super-physical free will, so that doesn’t seem relevant to this question. Since the question concerns what one should do, it seems reasonable to assume that some notion of choice is possible.
(FWIW, I’d guess compatibilism is the most popular take on free will on LW.)
> (FWIW, I’d guess compatibilism is the most popular take on free will on LW.)
Yes, but compatibilism doesn’t suggest that you choose between different actions or between different decision theories.
Wait, what? If compatibilism doesn’t suggest that I’m choosing between actions, what am I choosing between?
Between theories: imaginary ideas.