> This has filled me with the distressing thought that the badness of death might somehow be diminished because of this.
I find this statement disturbing. It reads as if you’d really like X to be as horrible as possible in order to justify your preexisting decision to fight X.
You’re right, I phrased that badly. Let me restate it in a more rigorous fashion:
1. I find certain arguments regarding the morality of making multiple versions of people to be plausible.
2. However, when I apply those arguments to scenarios involving MWI, they seem to generate results that contradict basic, common-sense morality (such as “don’t kill people”).
3. Either the moral arguments are wrong, common-sense morality is wrong, or I am applying the arguments to MWI scenarios incorrectly.
4. Regardless of which of the above is correct, it means that at the moment I am extremely uncertain about very basic moral principles.
5. I am the sort of person who feels extreme distress at the prospect of uncertainty, especially moral uncertainty.
6. Therefore, thinking about these things makes me feel extreme distress.
So, to put it in a more succinct fashion: I don’t think my distress is caused by the sunk cost fallacy. Rather, it is caused by an extreme dislike of moral and philosophical uncertainty. I can see how you could interpret my initial post otherwise, however.
In general, I tend to get very worried any time my explicitly reasoned moral beliefs contradict common-sense morality, for reasons PhilGoetz explains better than I can. For every one of these contradictions that is genuine moral progress, there are thousands of crazy errors. In this case I suspect I’m just applying MWI improperly, but the uncertainty is very distressing to me.
> I find this statement disturbing. It reads as if you’d really like X to be as horrible as possible in order to justify your preexisting decision to fight X.
status quo bias
nod
Are you nodding to “MWI meaning death isn’t bad is a warning sign” being a result of status quo bias?