No, if a random person wants to sacrifice their life for the greater good, then I have no objection.
I would, however, suggest that they are lacking somewhat in humanity. There is such a thing as being altruistic beyond the human norm, and this is an example of it.
If I knew someone was capable of this, I wouldn’t want them as a friend or partner. Who knows when they might make one utilitarian calculation too many and kill us both?
Perhaps I am paranoid about this because… I used to be like that.
What if the friend shared the same core values as you? If my friend had the same core values as me (e.g. that it is worth killing two people to save a billion people from eternal torture), and were a utilitarian, then perhaps I’d be “ok”[1] with my friend making “one utilitarian calculation too many” and killing both of us.
1: By “ok”, I guess I mean I’d probably be very upset during those final moments when I’m dying, and then my consciousness would cease, my final thoughts damning my friend. But if I allow myself to imagine an afterlife, I could see myself eventually (weeks after my death? months?) grudgingly coming to accept that his/her choice was probably the rational one, and agreeing that (s)he “did the right thing”.
Reminds me of one of the 101 Zen Stories http://www.101zenstories.com/index.php?story=13 :