I don’t want to become a “cleaning up this world”-bot. I have my own goals and aims in life, and they are distinct from the goal of “producing as much positive utility for humanity as possible”. I’d rather spend £99 out of every £100 on myself than give it to a random poor person in the third world, because I am more important than s/he is (more important in the subjective, antirealist sense). If anyone here really is a totally dedicated altruist (in the sense of weighing the welfare of the other 6*10^9 people on the planet equally to your own), then I pity you, but I’m glad you exist.
In general, this problem is not soluble, i.e. you can’t get a pound’s worth of altruism out of a penny’s worth of desire to help strangers, at least not without the kind of mind-control strategies that religion employs. But we’ve already decided we don’t want to do that.
However, in the special case of accelerating technology and the singularity, the problem is soluble, because even 1% of the optimizing ability of an FAI is enough to lift the third world from poverty to paradise.
Apologies for going off topic—but I couldn’t really avoid it…
I don’t want to become a “cleaning up this world”-bot. I have my own goals and aims in life, and they are distinct from the goal of “producing as much positive utility for humanity as possible”. I’d rather spend £99 out of every £100 on myself than give it to a random poor person in the third world, because I am more important than s/he is (more important in the subjective, antirealist sense).
Hey, that’s fine. You certainly don’t have to try to justify your basic utility function. But for those of us who want to do more to help the rest of the world (even if we prioritize ourselves first), it can be hard just to get ourselves to act rationally in pursuit of that goal. That’s the issue at hand.
Right, so everyone has their own level, their own line of “this much is for me, this much is for everyone else”, and each of us has to make that choice and live with it. My choice is biased towards (a) my culture, (b) my friends and family, and (c) myself.
Other people may put “the rest of the world” higher than I do, though I expect that most people in this country and the US put it lower.
It would seem that Greene has deconverted you away from objective morality along different lines than I was trying for myself.
Anyway, your comment suggests that FAI should take its funding primarily from the most selfish of rationalists who still have a trace of altruism in them, since FAI would be the only project where expected utilons can be purchased so cheaply as to move them; and leave more altruistic funding to more mundane projects.
Now, what are the odds that would work in real life? I would think very low. FAI is likely to actually need those rare folk who can continue supporting it without a lot of in-person support and encouragement and immediately visible concrete results, leaving the others to those projects which are more intuitively encouraging to a human brain.
It seems to me that no matter what people claim about their selfishness or altruism, the real line is between those who can bring themselves to do something about it under conditions X and those who can’t—and that the actual payoff in expected utilons matters little, but the reinforcing conditions matter a lot. But perhaps I am mistaken.

Shhh.
No saying the F-acronym yet.
It would seem that Greene has deconverted you away from objective morality along different lines than I was trying for myself.
I think that this is because, in the world of antirealism, there is a lot of room for arbitrary disagreement based on unjustifiable personal preferences. I think that you’re a more altruistic person than me, and it shows, but the difference is perhaps smaller than you think. It would be easier to iron this out in person than by text. Most people don’t even give 1% of their income to charity… that would be about £500 per year for most people in the UK, and most people don’t give anything like that much. Speaking of which, I think I have a donation to make…
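(Rough arithmetic behind that £500 figure, assuming, purely for illustration, a typical UK household income of around £50,000 a year: 1% of £50,000 = 0.01 * £50,000 = £500.)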
It seems to me that no matter what people claim about their selfishness or altruism, the real line is between those who can bring themselves to do something about it under conditions X and those who can’t—and that the actual payoff in expected utilons matters little, but the reinforcing conditions matter a lot.
Yep. I agree completely. If I were surrounded by people who were all giving 50% of their wealth to the third world, I’d probably want to do the same. But we now know that different circumstances bring out different aspects of people.
I would think very low. FAI is likely to actually need those rare folk who can continue supporting it without a lot of in-person support and encouragement and immediately visible concrete results, leaving the others to those projects which are more intuitively encouraging to a human brain.
I agree with this too—I think that you have to have a very specific self-image to want to work on FAI: the self-image of being either a very good person, or a very important person, or a combination of both. I think that I’m a combination of the two. I do feel a certain amount of duty towards others, especially those who are close to me, and the people in this world whom I admire. I am more motivated by the thought of the 0.001% of people on the planet who I think are really awesome getting turned into paperclips than by the thought of the same happening to everyone else.
Anyway, your comment suggests that FAI should take its funding primarily from the most selfish of rationalists who still have a trace of altruism in them, since FAI would be the only project where expected utilons can be purchased so cheaply as to move them; and leave more altruistic funding to more mundane projects.
No, I’d say that at the moment all money one has power over should go to FAI, until it is funded at a level of about a billion a year, which is only around 0.002% of world GDP. That includes the funding of true altruists, and of “trace” altruists who are (say) prepared to donate 1% of their effort to help others. At that point, you’d have to do a more careful analysis, which I don’t have the time or knowledge for.
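(Rough arithmetic, assuming world GDP of roughly $60 trillion: a billion a year is 10^9 / (6*10^13) ≈ 0.0017% of world GDP, so call it 0.002%; funding at a full 0.1% of world GDP would instead be about $60 billion a year.)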