Do you have any particular reason for expecting it to be?
Or is this a more general "what if"? For example, if you were contemplating moving to a foreign country, would you ask yourself, "what if my internal safety net is founded solely on living in the country I live in now?"
I’m not Multiheaded, but it feels as if the part of my brain that does math has no problem at all with personally slaughtering a million people if it saves one million and ten (1); the ethical injunction against that, which is useful, feels as if it comes from “avoid the unpleasant (read: evil) thing”. (Weak evidence based on introspection, obviously.)
(1) Killing a million people is really unpleasant, but the net ten lives saved should easily overcome that, even if I care more about myself than about others.
Roughly that; I’ve thought about it in plenty more detail, but everything beyond this summary feels vague, and I’m currently too lazy to make it coherent enough to post.
It feels like I do, but it’ll take a bit of very thoughtful writing to explicate why. So maybe I’ll explain it here later.