reduce your 1/100000 figure, esp. if you take only the leaders of the said movement
I already did; there was a huge number of such movements, most of them highly obscure (not unlike Eliezer). I’d expect some power-law distribution in prominence, so for every one we’ve heard about there’d be far more we haven’t.
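For concreteness, a minimal sketch of that power-law claim in Python, assuming prominence follows a Pareto distribution; the shape parameter and the visibility threshold are illustrative assumptions, not numbers from this exchange.

```python
import numpy as np

# Sketch only: if movement "prominence" is Pareto-distributed, almost all
# movements fall far below any visibility threshold, so obscure movements
# vastly outnumber the ones we have heard of.
alpha = 1.2        # assumed power-law shape (heavier tail as alpha -> 1)
threshold = 100.0  # assumed "prominent enough to have heard of it",
                   # in multiples of the minimum prominence

# For a Pareto(alpha) with minimum 1, the survival function is x**(-alpha).
visible_fraction = threshold ** (-alpha)
print(f"fraction of movements we'd have heard of: {visible_fraction:.4%}")
print(f"unheard-of movements per visible one: roughly {1 / visible_fraction:.0f}")

# Same point by simulation.
rng = np.random.default_rng(0)
samples = rng.pareto(alpha, size=1_000_000) + 1.0  # classical Pareto, x_min = 1
print(f"simulated visible fraction: {np.mean(samples >= threshold):.4%}")
```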
I think that if you accept that AGI is “near”, then FAI is important to try in order to prevent it
I don’t, and the link from AGI to FAI is as weak as the link peak-oilers promised from oil production statistics to civilizational collapse.
Ok, how close we are to AGI is a prior I do not care to argue about, but don’t you think AGI is a concern? What do you mean by a weak link?
The part where development of AGI fooms immediately into superintelligence and destroys the world. Evidence for it is not even circumstantial; it is fictional.
Ok, of course it’s fictional—hasn’t happened yet!
Still, when I imagine something that is smarter than the man who created it, it seems it would be able to improve itself. I would bet on that; I do not see a strong reason why this would not happen. What about you? Are you with Hanson on this one?