I’m not Multiheaded, but it feels-as-if the part of my brain that does math has no problem at all with personally slaughtering a million people if it saves a million and ten (1); the ethical injunction against that, which is useful, feels-as-if it comes from “avoid the unpleasant (i.e., evil) thing”. (Weak evidence based on introspection, obviously.)
(1) Killing a million people is really unpleasant, but the net gain of ten lives should easily outweigh that, even if I care more about myself than about others.
Roughly that; I’ve thought about it in considerably more detail, but everything beyond this summary feels vague, and I’m currently too lazy to make it coherent enough to post.