There are various counter-arguments, such as that if there are too few egoists and too many altruists, then the Overton Window will shift to the point that egoism becomes socially disapproved of; or that altruism isn’t even necessary for reasonably rational egoists to engage in positive-sum interactions which are nearly indistinguishable from altruistic behaviour, as has been explored in some depth by libertarian philosophers; or that any one egoist is unlikely to be able to persuade any significant number of altruists to become egoists, so the optimal egoist approach is more likely to focus attention on one’s own actions rather than on persuading others to become egoists; and so on.
I guess. I feel if your egoism is more complicated than “Do whatever you want.”, then you’ve kind of lost sight of the main thing. But obviously this comment is vulnerable to the same objection, so I guess I’ll just close by saying that calling egoism where you end up caring about Overton Windows “effective egoism” seems pretty exactly wrong. There’s a whole Fake Selfishness link on LW, yeah? That seems like what this is.
I /want/ to go camping on Phobos. There are certain practical problems in accomplishing that. Likewise, there are a great many practical problems in accomplishing many other, more ordinary things that I want to do. Some of those problems are soluble, depending on the resources I choose to throw at them; but with only a finite amount of resources, I have to make choices about /which/ of my wants to try to fulfil.
Fake Selfishness
choose between dying immediately to save the Earth, or living in comfort for one more year and then dying along with Earth.
I’m aiming for not dying at all. (At least, not permanently.) Which leads, in this case, to not considering there to be much difference between a few more seconds of life and one more year of life, if those are the only two options; and as long as humanity survives, there’s a small but reasonable chance of an approximation of my mind being reconstructed, which, while not as good as a full continuation of all my memories, is still better than nothing. So I would selfishly choose to save the Earth.
On the other hand, if I consider the original question...
“Would you sacrifice your own life to save the entire human species?”
… without assuming that I’m a member of humanity doomed to die anyway (such as if I’m an upload): I’m currently drafting a novel in which precisely that question is a significant plot point, and it is not a One-Sided Policy Debate.
“If you had to choose one event or the other, would you rather that you stubbed your toe, or that the stranger standing near the wall there gets horribly tortured for fifty years?”
If I live in a world where someone in physical proximity to me is likely to be horribly tortured for fifty years, then I very likely live in a world where /I/ have a moderately high chance of being horribly tortured for fifty years. If I balance the odds, then a certainty of minor pain from a stubbed toe seems a small price to pay to not live in a world with even a moderate chance of me experiencing fifty years of horrible torture.
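A minimal sketch of that “balance the odds” comparison, with every probability and disutility figure invented purely for illustration (none of these numbers come from the comment above):

```python
# Illustrative expected-disutility comparison for the stubbed-toe question.
# Every number below is a placeholder chosen only to make the structure clear.

toe_stub_cost = 1.0            # certain, minor pain (arbitrary disutility units)
torture_cost = 10_000_000.0    # fifty years of horrible torture, same units

# Assumed chance that, in a world where nearby strangers can be horribly
# tortured for fifty years, the same eventually happens to me.
p_torture_me = 0.01

expected_cost_stub = 1.0 * toe_stub_cost                  # the toe stub is certain
expected_cost_torture_world = p_torture_me * torture_cost

print(expected_cost_stub)            # 1.0
print(expected_cost_torture_world)   # 100000.0
# Even a modest probability of the torture outcome swamps the certain toe stub.
```

The exact figures don’t matter; the point is only that a certain small cost is easily dominated by even a modest probability of an enormous one.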
“Would you steal a thousand dollars from Bill Gates if you could be guaranteed that neither he nor anyone else would ever find out about it?”
Mu; I do not think that such a guarantee is feasible.