I speculate that Yudkowsky has narcissistic tendencies. Call it armchair psychoanalysis if you like, but I think there is enough evidence to warrant such speculations.
I call it an ignoble personal attack which has no place on this forum.
My initial reply was based on the following comment by Yudkowsky:
I’m really impressed by Facebook’s lovely user experience—when I get a troll comment I just click the x, block the user and it’s gone without a trace and never recurs.
And regarding narcissism, the definition is: “an inflated sense of one’s own importance and a deep need for admiration.”
See e.g. this conversation between Ben Goertzel and Eliezer Yudkowsky (note that MIRI was formerly known as SIAI):
Striving toward total rationality and total altruism comes easily to me. […] I’ll try not to be an arrogant bastard, but I’m definitely arrogant. I’m incredibly brilliant and yes, I’m proud of it, and what’s more, I enjoy showing off and bragging about it. I don’t know if that’s who I aspire to be, but it’s surely who I am. I don’t demand that everyone acknowledge my incredible brilliance, but I’m not going to cut against the grain of my nature, either. The next time someone incredulously asks, “You think you’re so smart, huh?” I’m going to answer, “Hell yes, and I am pursuing a task appropriate to my talents.” If anyone thinks that a Friendly AI can be created by a moderately bright researcher, they have rocks in their head. This is a job for what I can only call Eliezer-class intelligence.
Also see e.g. this comment by Yudkowsky:
Unfortunately for my peace of mind and ego, people who say to me “You’re the brightest person I know” are noticeably more common than people who say to me “You’re the brightest person I know, and I know John Conway”. Maybe someday I’ll hit that level. Maybe not.
Until then… I do thank you, because when people tell me that sort of thing, it gives me the courage to keep going and keep trying to reach that higher level.
...and from his post...
When Marcello Herreshoff had known me for long enough, I asked him if he knew of anyone who struck him as substantially more natively intelligent than myself. Marcello thought for a moment and said “John Conway—I met him at a summer math camp.” Darn, I thought, he thought of someone, and worse, it’s some ultra-famous old guy I can’t grab. I inquired how Marcello had arrived at the judgment. Marcello said, “He just struck me as having a tremendous amount of mental horsepower,” and started to explain a math problem he’d had a chance to work on with Conway.
Not what I wanted to hear.
And this kind of attitude started early. See for example what he wrote in his early “biography”:
I think my efforts could spell the difference between life and death for most of humanity, or even the difference between a Singularity and a lifeless, sterilized planet [...] I think that I can save the world, not just because I’m the one who happens to be making the effort, but because I’m the only one who can make the effort.
Also see this video:
So if I got hit by a meteor right now, what would happen is that Michael Vassar would take over responsibility for seeing the planet through to safety, and say ‘Yeah I’m personally just going to get this done, not going to rely on anyone else to do it for me, this is my problem, I have to handle it.’ And Marcello Herreshoff would be the one who would be tasked with recognizing another Eliezer Yudkowsky if one showed up and could take over the project, but at present I don’t know of any other person who could do that, or I’d be working with them.
regarding narcissism, the definition is: “an inflated sense of one’s own importance and a deep need for admiration.”
That’s the dictionary definition. When throwing around accusations of mental pathology, though, it behooves one not to rely on pattern-matching to one-sentence definitions; it overestimates the prevalence of problems, suggests the wrong approaches to them, and tends to be considered rude.
Having a lot of ambition and an overly optimistic view of intelligence in general and one’s own intelligence in particular doesn’t make you a narcissist, or every fifteen-year-old nerd in the world would be a narcissist.
(That said, I’m not too impressed with Eliezer’s reasons for moving to Facebook.)
I feel that a similar accusation could be used against anyone who feels that more is possible and, instead of whining, tries to win.
I am not an expert on narcissism (though I could be an expert at it, heh), but it seems to me that a typical narcissistic person would feel they deserve admiration without doing anything awesome. They probably wouldn’t be able to work hard for years. (But as I said, I am not an expert; there could be multiple types of narcissism.)
I think that I can save the world, not just because I’m the one who happens to be making the effort, but because I’m the only one who can make the effort.
Thinking that one person is going to save the world, and you’re him, qualifies as “an inflated sense of one’s own importance”, IMO.
First mistake: believing that one person will be saving the world. Second mistake: believing that there is likely only one person who can do it, and that he’s that person.
“You think that you are potentially the greatest who has yet lived, the strongest servant of the Light, that no other is likely to take up your wand if you lay it down.”
To put the first quotation into some context: Eliezer argued that the combination of his high SAT scores and the effort he had put into studying AI placed him in a unique position, one that could make a “difference between cracking the problem of intelligence in five years and cracking it in twenty-five”. (Which could make a huge difference, if it saves Earth from destruction by nanotechnology, presumably coming during that interval...)
Of course, given that it was written in 2000, the five-year estimate was obviously wrong. And there is a Sequence about it, which explains that Friendly AI is more complicated than just any AI. (Which doesn’t prove that the five-year estimate would have been correct for just any AI.)
Most people very seriously studying AI probably have high SATs too. High IQs. High lots of things. And some likely have other unique qualities and advantages that Eliezer doesn’t.
Being unique in some qualities doesn’t mean being uniquely capable of the task within some timeline.
My main objection is that until it’s done, I don’t think people are justified in claiming to know what it will take to get it done, and therefore they aren’t justified in claiming that some particular person is best able to do it, even if he is best suited to pursue one particular approach to the problem.
Hence, I conclude he is overestimating his importance, per the definition. Not that I see it as some heinous crime. He’s overconfident. So what? It seems to be an ingredient of high achievement. Better to be overconfident epistemologically than underconfident instrumentally.
Private overconfidence is harmless. Public overconfidence is how cults start.
I’d say that’s, at the very least, an oversimplification; when you look at the architecture of organizations generally recognized as cults, you end up finding that they share a fairly specific cluster of cultural characteristics, one that has more to do with internal organization than with claims of certainty. My favorite framework for this is the amusingly named ABCDEF: though aimed at new religions in the neopagan space, it’s general enough to be applied outside that context.
(Eliezer, of course, would say that every cause wants to be a cult. I think he’s being too free with the word, myself.)
Sorry. It wasn’t meant as an attack, just something that came to my mind reading the comment by Chris Hallquist.
Well, I’m sorry, but when you dig up quotes of your opponent to demonstrate purported flaws in his character, it is a personal attack. I didn’t expect to encounter this sort of thing on LessWrong. Given the number of upvotes your comment received, I can understand why Eliezer prefers Facebook.
Yudkowsky tells other people to get laid. He asks the community to downvote certain people. He calls people permanent idiots.
He is a forum moderator. He asks people for money. He wants to create the core of the future machine dictator that is supposed to rule the universe.
Given the above, I believe that remarks about his personality are warranted, and not attacks, if they are backed up by evidence (which I provided in other comments above).
But note that in my initial comment, which got this discussion started, I merely offered a guess as to why Yudkowsky might now prefer Facebook over LessWrong. Then a comment forced me to escalate by providing further justification for that guess, and your comments forced me to explain myself further, which resulted in a whole thread about Yudkowsky’s personality.