You shouldn’t take it as an axiom that the SIAI is the most-beneficial charity in the world. You imply that anyone who thinks otherwise is irrational.
...was questioning XiXiDu’s:
If everyone was to take Landsburg’s argument seriously, which would imply that all humans were rational, then everyone would solely donate to the SIAI.
...but it isn’t clear that the SIAI is the best charity in the world! They are working in an interesting space—but maybe they are attacking the problem the wrong way, lack the required skills, occupy a niche better served by other players—or are failing in other ways.
XiXiDu justified making this highly dubious claim by saying he was trying to avoid being downvoted—and so wrote something that made his post “sound more agreeable”.
The SIAI would probably be at least in competition for best charity in the world even if their chance of direct success were zero and their only actual success were raising awareness of the problem.
I did a wild back-of-the-envelope calculation on that a while ago. Even with very conservative estimates of the chance of a negative singularity, and completely discounting any effect on the far future as well as any possibility of a positive singularity, the SIAI scored about one saved life per $1000.
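The shape of that back-of-the-envelope estimate can be sketched as follows. Every input here (the prior probability of a negative singularity, the marginal risk reduction, the budget) is an illustrative assumption chosen only to show how such a calculation fits together, not a figure from the original comment:

```python
# Toy expected-value estimate of lives saved per dollar donated.
# All numbers below are illustrative assumptions, not actual data.

P_NEGATIVE_SINGULARITY = 0.01   # assumed prior chance of a negative singularity
MARGINAL_RISK_REDUCTION = 1e-4  # assumed fraction of that risk the work averts
LIVES_AT_STAKE = 7e9            # roughly the current world population;
                                # far-future lives deliberately discounted
TOTAL_BUDGET = 7e6              # assumed total cost of the work, in dollars

def expected_lives_saved_per_dollar() -> float:
    """Expected lives saved per dollar under the toy assumptions above."""
    expected_lives = P_NEGATIVE_SINGULARITY * MARGINAL_RISK_REDUCTION * LIVES_AT_STAKE
    return expected_lives / TOTAL_BUDGET

print(expected_lives_saved_per_dollar())  # ≈ 0.001, i.e. about one life per $1000
```

With these particular (made-up) inputs the estimate lands on roughly one life per $1000; the point is only that even very pessimistic probabilities multiplied by very large stakes can yield a competitive cost-per-life figure.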
Accepting the logical validity of an argument while flatly denying its soundness is not an interesting, worthwhile, or even acceptable contribution.
What? Where are you suggesting that someone is doing that?
If you are talking about me and your logical argument, that is just not what was being discussed.
The correctness of the axiom concerning charity quality was what was in dispute from the beginning—not any associated logical reasoning.