I think that I’d easily accept a year of torture in order to produce ten planets worth of thriving civilizations. (Or, if I lack the resolve to follow through on a sacrifice like that, I still think I’d have the resolve to take a pill that causes me to have this resolve.)
I think that “resolve” is often a lie we tell ourselves to explain the discrepancies between stated and revealed preferences. I concede that if you took that pill, it would be evidence against my position (but, I believe you probably would not).
A nuance to keep in mind is that reciprocity can be a rational motivation to behave more altruistically than you otherwise would. This can come about from tit-for-tat / reputation systems, or even from some kind of acausal cooperation. Reciprocity effectively moves us closer to utilitarianism, but certainly not all the way there.
So, if I’m weighing the life of my son or daughter against an intergalactic network of civilizations, which I had never heard of before and will never hear about after, and which wouldn’t even reciprocate in a symmetric scenario, I’m choosing my child for sure.
If I knew with certainty that I could not do nearly as much good some other way, and I was certain that taking the pill would cause that much good, I’d take the pill, even if I die after the torture and no one will ever know I sacrificed myself for others.
I admit those are quite unusual values for a human, and I’m not arguing that it would be rational because of utilitarianism or the like, just that I would do it. (It’s possible that I’m wrong, but I think it’s very likely I’m not.) Also, I see that, the way my brain is wired, outer optimization pushes against that policy, and I think I probably wouldn’t be able to take the pill a second time under the same conditions (given that I don’t die after the torture), or at least not often.
I don’t think those are unusual values for a human. Many humans have sacrificed their lives (and endured great hardship and pain, etc.) to help others. And many more would take a pill to gain that quality, seeing it as a more courageous and high-integrity expression of their values.