I’m confused. If I’m not the one flipping the switch, what’s the question you’re asking?
Would you still want them to flip the switch, even though it would result in your death?
Oh. This seems to be unnecessarily treading over previously covered ground. My short answer is “no”.
My long answer would probably be some sort of formalization of “no, but I understand why they’d do it”. I’d be happy with the cognitive algorithm that would lead the other person to flip the switch. But my feeling is that when you do the calculations, and the calculations say I should die, demanding that I die is one thing… demanding that I be happy about it is asking a bit much.