You do have over 2000 karma.
At this point, I figure you have earned the right to say more-or-less whatever you like, for quite a while, without bothering too much about keeping score.
When I’m reading comments, I often skip over the ones that have low or negative score. I imagine other people do the same thing. So if you think your point is important enough to be read by more than a few people, you do want to try to have it voted up (but of course you shouldn’t significantly compromise your other values/interests to do so).
I’m curious why Tim’s comment got downvoted 3 times.
Karma isn’t a license to act like a dick, make bad arguments, be sloppy, or commit sins of laziness.
/checks karma; ~3469, good.
Which should be obvious, you purblind bescumbered fen-sucked measle.
Right—but the context was “the Karma system and general attitude here makes me dishonest”.
If you are not short of Karma, sugar-coating for the audience at the expense of the truth seems to be largely unnecessary.
I looked at the context, but it seemed to me that Xi was just being sloppy. (Of course Landsburg’s argument implies rational agents should donate solely to SIAI, if SIAI offers the greatest marginal return. A→B, A, therefore B. Q.E.D.)
If Xi is being sloppy or stupid, then he should pay attention to what his karma is saying. That’s what it’s for! If you want to burn karma, it ought to be for something difficult that you’re very sure about, where the community is wrong and you’re right.
Phil’s:
You shouldn’t take it as an axiom that the SIAI is the most-beneficial charity in the world. You imply that anyone who thinks otherwise is irrational.
...was questioning XiXiDu’s:
If everyone was to take Landsburg’s argument seriously, which would imply that all humans were rational, then everyone would solely donate to the SIAI.
...but it isn’t clear that the SIAI is the best charity in the world!!! They are in an interesting space—but maybe they are attacking the problem all wrong, lacking in the required skills, occupying the niche of better players—or failing in other ways.
XiXiDu justified making this highly-dubious claim by saying he was trying to avoid getting down-voted—and so wrote something which made his post “sound more agreeable”.
SIAI would probably be at least in competition for best charity in the world even if their chance of direct success were zero and their only actual success were raising awareness of the problem.
I did a wild-guess, back-of-the-envelope calculation on that a while ago. Even with very conservative estimates of the chance of a negative singularity, and completely discounting any effect on the far future as well as any possibility of a positive singularity, SIAI scored about 1 saved life per $1,000.
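For what it’s worth, here is a minimal sketch of how a back-of-the-envelope figure like that could be computed. All of the numbers below (budget, probabilities, population) are hypothetical placeholders chosen for illustration, not the commenter’s actual estimates:

```python
# Hypothetical back-of-the-envelope estimate of lives saved per dollar by
# existential-risk awareness work. Every number here is an illustrative
# placeholder, not taken from the comment above.

donation = 1_000                 # dollars donated
annual_budget = 500_000          # assumed yearly budget of the charity
p_xrisk = 0.01                   # assumed chance of a negative singularity
p_averted_by_awareness = 1e-5    # assumed chance the awareness work averts it
lives_at_stake = 7_000_000_000   # roughly current world population; far future ignored

# Fraction of the year's work attributable to this donation.
share_of_budget = donation / annual_budget

# Expected lives saved credited to this donation.
expected_lives_saved = (share_of_budget * p_xrisk
                        * p_averted_by_awareness * lives_at_stake)

print(f"Expected lives saved per ${donation}: {expected_lives_saved:.2f}")
```

The point of the sketch is only that a small probability of averting a catastrophe, multiplied by a very large number of lives at stake, can plausibly land in the neighborhood of one life saved per $1,000 even under conservative assumptions.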
Accepting the logical validity of an argument while flatly denying its soundness is not an interesting, worthwhile, or even good contribution.
What? Where are you suggesting that someone is doing that?
If you are talking about me and your logical argument, that is just not what was being discussed.
The correctness of the axiom concerning charity quality was what was in dispute from the beginning—not any associated logical reasoning.