If I knew someone was capable of this, I wouldn’t want them as a friend or partner. Who knows when they might make one utilitarian calculation too many and kill us both?
What if the friend shared the same core values as you? If my friend held the same core value as me (e.g., that it is worth killing two people to save a billion people from eternal torture) and were utilitarian, then perhaps I'd be "ok"[1] with my friend making "one utilitarian calculation too many" and killing us both.
1: By "ok", I guess I mean I'd probably be very upset during those final moments as I'm dying, and then my consciousness would cease, my final thoughts damning my friend. But if I allow myself to imagine an afterlife, I could see myself eventually (weeks after my death? months?) grudgingly coming to accept that their choice was probably the rational one, and agreeing that they "did the right thing".