Imagine Omega said, “The person behind you will live for 30 seconds if you don’t kill her. If you kill her, you will continue leading a long and healthy life. If you don’t, you will also die in 30 seconds.”
Do you say the same thing to Omega and continue enjoying your 30 seconds of life?
No difference. I won’t buy my life with murder at any price. (Weighing one-against-many innocents is a different problem.)
And I’d be calling Omega a bastard because, as an excellent predictor, he’d know that, but decided to ruin my day by telling me anyway.
Can you explain, then, how this is different from suicide, since your theft of her life is minimal, yet your sacrifice of your own life is large?
It’s not suicide, I’m just bumping into a moral absolute—I won’t murder under those circumstances, so outcomes conditional on “I commit murder” are pruned from the search tree. If the only remaining outcome is “I die”, then drat.
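In decision-procedure terms, that pruning might look something like the minimal Python sketch below; the action names and utility numbers are illustrative, not from the comment:

```python
# Hypothetical sketch of deontological pruning: branches conditional on a
# forbidden action are removed before any utility comparison happens.
FORBIDDEN = {"commit murder"}

def choose(options):
    """Return the highest-utility option whose action isn't forbidden."""
    permitted = [o for o in options if o["action"] not in FORBIDDEN]
    if not permitted:
        return None  # every branch pruned; "then drat"
    return max(permitted, key=lambda o: o["utility"])

options = [
    {"action": "commit murder", "utility": 1_000_000},  # long healthy life
    {"action": "refuse",        "utility": 30},         # 30 more seconds
]
print(choose(options)["action"])  # -> "refuse", despite the lower utility
```

The point of the sketch is that the constraint acts before the utility comparison, so no payoff attached to the forbidden branch can ever change the answer.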
If she’d only live another 30 seconds, I kill her. If she’d live another hour, we both die. I think my indecision point is around 15 minutes.
...thank you for your honest self-reporting, but I do feel obliged to point out that this does not make any sense.
I didn’t think it through for any kind of logical consistency—it’s pure snap judgment. I think my instinct when presented with this kind of ethical dilemma is to treat my own QALYs (well, QALY-seconds) as far less valuable than those of another person. Or possibly I’m just paying an emotional surcharge for actually taking the action of ending another person’s life. There was some sense of “having enough time to do something of import (e.g., call loved ones)” in there too.
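For concreteness, here is a back-of-envelope version of that weighting, under the purely hypothetical assumption that the “long and healthy life” gained is 50 years, with the emotional surcharge folded into a single discount factor:

```python
# Back-of-envelope check of the ~15-minute indecision point above.
# Hypothetical assumption: the "long and healthy life" gained is 50 years.
my_gain_secs = 50 * 365.25 * 24 * 3600   # ~1.58e9 seconds gained by killing
her_loss_secs = 15 * 60                  # her loss at the stated indifference point

# At indifference, my discounted gain equals her loss, so the implied
# self-discount (surcharge folded in) is:
implied_discount = her_loss_secs / my_gain_secs
print(f"{implied_discount:.1e}")  # ~5.7e-07: my seconds at ~1/1,750,000 of hers
```

Under those assumed numbers, the snap judgment is consistent, but only if one’s own time is weighted at well under a millionth of the other person’s.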
But isn’t this time relative to lifespan? What if your entire lifespan were only 30 minutes?
I think my reaction would be “fuck you Omega”. If an omniscient entity decides to be this much of a douchebag then dying giving them the finger seems the only decent thing to do.
My implied assumption was Omega was an excellent predictor, not an actor—I thought this was a standard assumption, but maybe it isn’t.
Showing that the original question had little to do with cryonics...
This question is a highly exaggerated example, meant to illustrate the incentives, but cryonics subscribers will face choices of this kind with much subtler probabilities and payoffs.
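A toy expected-value sketch of what “subtler probabilities and payoffs” could mean here; every probability and payoff below is made up for illustration:

```python
# Toy expected-QALY comparison; all numbers are hypothetical.
def expected_qalys(branches):
    """Expected QALYs over (probability, payoff) branches."""
    return sum(p * payoff for p, payoff in branches)

gamble = expected_qalys([(0.02, 500.0), (0.98, 0.0)])  # long shot at revival
certain = expected_qalys([(1.0, 9.5)])                 # sure remaining years
print(gamble, certain)  # 10.0 9.5 -- close enough that the ethics dominates
```

Unlike Omega’s stark offer, choices like these sit near the indifference point, which is exactly where the moral weighting in the thread above starts to matter.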