So it’s just an awfully convenient coincidence that the charity whose support best displays tribal affiliation to the LessWrong crowd and the charity whose support best saves the world just happen to be the same one? What a one-in-a-billion chance!
No, that’s not it at all. If, as people here like to believe (and which may or may not be true), LWers are very rational and good at picking out things with very high expected value to start or donate to, then it makes sense that one of them (Eliezer) would create an organization whose existence has very high expected value (SIAI) and that the rest of the people here would donate to it. If that is the case, i.e. if SIAI is the best charity to donate to in terms of expected value (which it may or may not be), then it is also the best charity to donate to in order to display tribal affiliations (which it definitely is). So if you accept that people on LW are more rational than average, their donating so much to SIAI should be taken as weak evidence that SIAI is a really good charity to donate to.
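Just to make the “expected value” framing concrete, here’s a minimal sketch in Python with entirely made-up probabilities and impact figures (they aren’t anyone’s actual estimates for any real charity); the point is only the shape of the comparison, expected impact = probability the donation matters × value if it does:

```python
# Illustrative only: the probabilities and impact figures below are invented
# placeholders, not actual estimates for any real charity.

def expected_value(p_success: float, impact_if_success: float) -> float:
    """Expected impact of a donation: chance it matters times value if it does."""
    return p_success * impact_if_success

charities = {
    # name: (hypothetical probability the marginal donation helps, hypothetical impact)
    "conventional_charity": (0.9, 1_000),      # very likely to help, modest impact
    "long_shot_xrisk_charity": (1e-6, 1e10),   # tiny chance of mattering, huge impact
}

for name, (p, impact) in charities.items():
    print(f"{name}: EV = {expected_value(p, impact):,.0f}")
# Under these made-up numbers the long shot dominates; the whole argument
# turns on whether such probability and impact estimates can be trusted.
```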
you can make some real difference by supporting asteroid tracking programs.
I was under the impression that those already had sufficient resources? Could you link to some more information on this subject, please? I agree that asteroids are a more obviously important issue than the Singularity.
If, as people here like to believe (and which may or may not be true), LWers are very rational and good at picking out things with very high expected value to start or donate to [...]
I didn’t downvote you, but what you’re saying is essentially “if you accept our tribe is the most awesome and smartest, then it makes sense to donate to our tribal charity”. Which is something every single group would say, with slight variations.
I was under the impression that those already had sufficient resources? Could you link to some more information on this subject, please? I agree that asteroids are a more obviously important issue than the Singularity.
Here’s a results chart for various asteroid tracking efforts. The Catalina Sky Survey seems to be doing most of the work these days, and you can probably donate to the University of Arizona and have that money go to CSS somehow. I’m not really following this too closely; I’m mostly glad that some people are doing something here.
but what you’re saying is essentially “if you accept our tribe is the most awesome and smartest, then it makes sense to donate to our tribal charity”. Which is something every single group would say, with slight variations.
Well yeah; that’s why you should examine the evidence and not just do what everyone else does. So let’s look at the beliefs of all the Singularitarians on LW as evidence. What would we expect to see if LW were just an arbitrary tribe that picked a random cause to glom onto? I suspect we would see that not many people in the world, and particularly not high-status people and organizations, would pay attention to the Singularity. I would predict that everyone on LW would donate money to SIAI and shun people who don’t donate or who belittle SIAI.
Now what would we see if LW is in fact a group of high-quality rationalists and the world in general is too blinded by various biases to think rationally about low-probability, high-impact events? Well, most people, including high-status people (though perhaps not some academics), wouldn’t talk about it. People on LW would donate money to SIAI because they did the calculation and decided it had the highest expected value. And they would probably still shun the people who disagree, because they’re still human.
Those two situations look awfully similar to me. My point is, I certainly don’t think you can use LW’s enthusiasm for SIAI, relative to the general public’s, as a strike against LW or SIAI.
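In Bayesian terms, “those two situations look awfully similar” means the likelihood ratio is close to 1: if both hypotheses predict roughly the same observable behavior, observing that behavior barely moves you either way. A minimal sketch, with invented likelihoods purely for illustration:

```python
# Illustrative Bayes-factor calculation with invented numbers.
# H1: LW is an arbitrary tribe that glommed onto a random cause.
# H2: LW is a group of unusually good reasoners about low-probability risks.
# E:  LWers donate to SIAI and are dismissive of people who belittle it.

p_e_given_h1 = 0.80   # hypothetical: tribes usually support their tribal charity
p_e_given_h2 = 0.85   # hypothetical: good reasoners who buy the argument also donate

bayes_factor = p_e_given_h2 / p_e_given_h1
print(f"Bayes factor in favor of H2: {bayes_factor:.2f}")
# A factor close to 1 means the observation is only weak evidence either way,
# which is the point above: the donations can't be used as a strike against
# LW or SIAI, but they aren't strong evidence in its favor either.
```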
Here’s a results chart for various asteroid tracking efforts. The Catalina Sky Survey seems to be doing most of the work these days, and you can probably donate to the University of Arizona and have that money go to CSS somehow. I’m not really following this too closely; I’m mostly glad that some people are doing something here.
I’m not finding anything there indicating that they’re hurting for funding, but perhaps I’m missing it.
I honestly believe that the Singularity is a greater threat to the human race than asteroids. Either an asteroid will be small enough that we can destroy it, or it’s too big to stop. Once an asteroid is big enough to pose a risk to humanity, it’s also a lot easier to find and destroy. However, unlike asteroid deflection efforts, a positive Singularity isn’t valued enough and a negative Singularity isn’t feared enough by humanity, and that’s why I focus on SIAI.
You actually need to detect these asteroids decades in advance for our current technology to stand any chance, and we currently don’t do that. More detection effort means tracking smaller asteroids than we otherwise would, but more importantly it means finding the big asteroids sooner.
An arbitrarily massive asteroid can be moved off course very easily given enough lead time. That’s the plan, not “destroying” it.
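A back-of-the-envelope sketch of why lead time is the whole game (the 1 cm/s figure and everything else here is illustrative, not taken from any mission study):

```python
# Back-of-the-envelope deflection estimate with purely illustrative numbers.
# A small along-track velocity change dv, applied t seconds before the
# predicted impact, shifts the asteroid's position by at least dv * t.
# (Real orbital mechanics amplify the along-track drift a few times beyond
# this, so the crude product below understates the effect.)

SECONDS_PER_YEAR = 365.25 * 24 * 3600
EARTH_RADIUS_KM = 6371.0  # we need to shift the impact point by roughly this much

def miss_distance_km(dv_cm_per_s: float, lead_time_years: float) -> float:
    dv_km_per_s = dv_cm_per_s * 1e-5
    return dv_km_per_s * lead_time_years * SECONDS_PER_YEAR

for years in (1, 10, 30):
    d = miss_distance_km(dv_cm_per_s=1.0, lead_time_years=years)
    print(f"1 cm/s applied {years:>2} years out -> ~{d:,.0f} km "
          f"({d / EARTH_RADIUS_KM:.2f} Earth radii)")
# With decades of warning even a gentle nudge adds up; with a year or less it
# doesn't -- hence the emphasis on early detection rather than "destroying" anything.
```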
Still, there’s a very low chance of a large asteroid strike, and the most-quoted figure I’ve heard is that more than 75% of dangerously sized NEOs are already being tracked. I think a negative Singularity is more likely to happen in the next 200 years than an asteroid strike.
However, it’s a good point that donating money to NEO tracking could be a good charitable donation as well; I just don’t think the risk is on the same order of magnitude as the danger of a uFAI.
With asteroid strikes, everybody agrees on the risk to within an order of magnitude or two. We have a lot of historical data about asteroid strikes of various sizes, can fit a power-law distribution to smooth it a bit, etc. (see the sketch below).
With uFAI, people’s estimates are about as divergent as estimates of the Second Coming of Jesus Christ, ranging from “impossible even in theory” through “essentially impossible” all the way to “almost certain”.
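On the power-law smoothing mentioned above, a minimal sketch of the technique: model the rate of impacts above a given size as a power law, i.e. fit a straight line to log(rate) vs. log(diameter). The counts below are invented placeholders, not real survey data (Python with NumPy assumed):

```python
import numpy as np

# Invented, illustrative counts of impacts per million years exceeding a given
# diameter -- NOT real survey data. The point is only to show the technique:
# fit log(rate) vs log(diameter) with a straight line, i.e. a power law.

diameters_km = np.array([0.05, 0.14, 0.3, 1.0, 5.0])
rates_per_myr = np.array([1500.0, 160.0, 35.0, 1.2, 0.02])  # noisy "observations"

slope, intercept = np.polyfit(np.log10(diameters_km), np.log10(rates_per_myr), deg=1)

def smoothed_rate(diameter_km: float) -> float:
    """Power-law smoothed rate of impacts >= diameter_km, per million years."""
    return 10 ** (intercept + slope * np.log10(diameter_km))

for d in (0.1, 1.0, 10.0):
    print(f">{d} km impactors: ~{smoothed_rate(d):.3g} per million years (fitted)")
```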
Here’s a results chart for various asteroid tracking efforts. The Catalina Sky Survey seems to be doing most of the work these days, and you can probably donate to the University of Arizona and have that money go to CSS somehow. I’m not really following this too closely; I’m mostly glad that some people are doing something here.
Thanks! I upvoted you.
Money spent on mind uploading is a better defense against asteroids than asteroid detection. At least for me.