Oh, ’ello. Glad to see somebody still remembers the proximity argument. But it’s adapted to our world where you generally cannot kill a million distant people to make one close relative happy. If we move to a world where Omegas regularly ask people difficult questions, a lot of people adopting proximity reasoning will cause a huge tragedy of the commons.
About Eliezer’s question, I’d exchange my life for a reliable 0.001 chance of healing reality, because I can’t imagine living meaningfully after being offered such a wager and refusing it. Can’t imagine how I’d look other LW users in the eye, that’s for sure.
I publicly rejected the offer, and don’t feel like a pariah here. I wonder what the actual degree of altruism among LW users is. Should we set up a poll and gather some evidence?
Cooperation is a different consideration from preference. You can prefer only to keep your own “body” in certain dynamics, no matter what happens to the rest of the world, and still benefit the most from, roughly speaking, helping other agents. Which can include occasional self-sacrifice à la counterfactual mugging.