Try peak oil/anti-nuclear/global warming/etc. activists then? They tend to claim their movement saves the world, not themselves personally, but I’m sure I could find a sufficient number of them who also had some personality cult thrown in.
Sure, but that would reduce your 1/100000 figure, especially if you take only the leaders of those movements. And I would not find claims of saving the world by anti-nuke scientists in, say, the 1960s preposterous.
I think that if you accept that AGI is “near”, that FAI is important to try in order to prevent it, and that EY was at the very least the person who brought the spotlight to the problem (which is a fact), you can end up thinking that he might actually make a difference.
Yeah, I’m tickled by the estimate that so far 0 people have saved the world. How do we know that? The world is still here, after all.
Eliezer has already placed a Go stone on that intersection, it turns out.
As the comments discuss, that was not an extinction event, barring further burdensome assumptions about nuclear winter or positive feedbacks of social collapse.
In any case, Wikipedia disagrees with this story.
No, the Permanent Mission of the Russian Federation to the United Nations disagrees with this story, and Wikipedia quotes that disagreement. The very next section explains why that disagreement may be incorrect.
Do you have any candidates in mind, or some plausible scenario for how the world might have been saved by a single person without achieving due prominence?
reduce your 1/100000 figure, especially if you take only the leaders of those movements

I already did; there was a huge number of such movements, most of them highly obscure (not unlike Eliezer). I’d expect some power-law distribution in prominence, so for every one we’ve heard of there’d be far more we haven’t.
I think that if you accept that AGI is “near”, that FAI is important to try in order to prevent it

I don’t, and the link from AGI to FAI is as weak as the link from oil production statistics to the civilizational collapse the peakoilers promised.
Ok, how close you think we are to AGI is a prior I do not care to argue about, but don’t you think AGI is a concern? What do you mean by a weak link?
The part where the development of AGI fooms immediately into superintelligence and destroys the world. Evidence for it is not even circumstantial; it is fictional.
Ok, of course it’s fictional—hasn’t happened yet!
Still, when I imagine something that is smarter than the man who created it, it seems it would be able to improve itself. I would bet on that; I do not see a strong reason why this would not happen. What about you? Are you with Hanson on this one?