Most people very seriously studying AI probably have high SATs too. High IQs. High lots of things. And some likely have other unique qualities and advantages that Eliezer doesn’t.
Being unique in some qualities doesn't mean being uniquely capable of the task within some timeline.
My main objection is that until it's done, people aren't justified in claiming to know what it will take to get it done, and therefore aren't justified in claiming that some particular person is best able to do it, even if he is best suited to pursue one particular approach to the problem.
Hence, I conclude he is overestimating his importance, per the definition. Not that I see it as some heinous crime. He's overconfident. So what? It seems to be an ingredient of high achievement. Better to be epistemically overconfident than instrumentally underconfident.
Private overconfidence is harmless. Public overconfidence is how cults start.
I’d say that’s, at the very least, an oversimplification. When you look at the architecture of organizations generally recognized as cults, you find they share a fairly specific cluster of cultural characteristics, one that has more to do with internal organization than with claims of certainty. My favorite framework for this is the amusingly named ABCDEF (Isaac Bonewits's Advanced Bonewits' Cult Danger Evaluation Frame): though aimed at new religious movements in the neopagan space, it's general enough to apply outside them.
(Eliezer, of course, would say that every cause wants to be a cult. I think he’s being too free with the word, myself.)