Is it at all meaningful to you that EY writes this on his homepage?
You should regard anything from 2001 or earlier as having been written by a different person who also happens to be named “Eliezer Yudkowsky”. I do not share his opinions.
It is true that EY has a big ego, but he also has the ability to renounce past opinions and admit his mistakes.
In the meantime he wrote the Sequences and HPMoR, and founded MIRI and CFAR. So maybe the distance between his ego and his real output got smaller.
Also, as Eliezer mentions in the Sequences, he used to have an “affective death spiral” about “intelligence”, which is probably visible in his old writings, and contributes to the reader’s perception of “big ego”.
I don’t really mind big egos as long as they drive people to produce something useful. (Yeah, we could have a separate debate about how much MIRI or HPMoR are really useful. But the old writings would be irrelevant for that debate.)
Is it at all meaningful to you that EY writes this on his homepage?
It is true that EY has a big ego, but he also has the ability to renounce past opinions and admit his mistakes.
Absolutely, it is meaningful.
I can hardly wait to look back on his ‘shameless blegging’ post in a few years and compare it to reality. Pretty sure I know what the result will be.
In the meantime he wrote the Sequences and HPMoR, and founded MIRI and CFAR. So maybe the distance between his ego and his real output got smaller.
Also, as Eliezer mentions in the Sequences, he used to have an “affective death spiral” about “intelligence”, which is probably visible in his old writings, and contributes to the reader’s perception of “big ego”.
I don’t really mind big egos as long as they drive people to produce something useful. (Yeah, we could have a separate debate about how much MIRI or HPMoR are really useful. But the old writings would be irrelevant for that debate.)
Here is what you sound like:
“But look at all this awesome fan fiction, and furthermore this ‘big ego’ is all your perception anyways, and furthermore I don’t even mind it.”
Why so defensive about EY’s very common character flaws (which don’t really require any exotic explanation, btw: think horses, not zebras)? They don’t reflect poorly on you.
EY’s past stuff is evidence.
I’m defensive about digging into people’s pasts only to laugh that as teenagers they had the usual teenage hubris, and maybe as highly intelligent people they kept it for a few more years… and then using it to hint that even today, ‘deep inside’, they are ‘essentially the same’, i.e. not worth taking seriously.
What exactly are we punishing here; what exactly are we rewarding?
Ten or more years ago I also had a few weird ideas. My advantage is that I didn’t publish them in visible places in English, and that I didn’t become famous enough that people would now spend their time digging into my past. Also, I kept most of my ideas to myself, because I didn’t try to organize people into anything. I didn’t keep a regular diary, and when I find some old notes, I usually just cringe and quickly destroy them.
(So no, I don’t care about any of Eliezer’s flaws reflecting on me, or anything like that. Instead I imagine myself in a parallel universe where I was more agenty and perhaps less introverted, so I started spreading my ideas sooner and wider, had the courage to try changing the world, and now people are digging up similar writings of mine. Generally, this is a mechanism for ruining sincere people’s reputations: find something they wrote when they were just as sincere as they are now, only less smart, and make people focus on that instead of what they are saying today.)
I guess I am oversensitive about this, because “pointing out that I failed at something a few years ago, therefore I shouldn’t be trusted to do it, ever” was something my mother often did to me while I was a teenager. People grow up, damn it! It’s not like once a baby, always a baby.
Everyone was a baby once. The difference is that for some people you have the records and for others you don’t, so you can imagine that the former are still ‘deep inside’ baby-like and the latter are not. But that’s confusing the map with the territory. As the saying goes, “an expert is a person who came from another city” (so you have never seen their younger self). As the fictional evidence proves, you could have literally godlike powers, and people would still diss you if they knew you as a kid. But today, on the internet, everything is one big city, and anything you say can stay documented forever. (Knowing this, I will forbid my children from using their real names online. Which probably will not help enough, because twenty years later there will be other methods for easily digging into people’s pasts.)
Ah, whatever. It’s already linked here anyway. So if it makes you feel better about yourself (returning the courtesy of online psychoanalysis) to read stupid stuff Eliezer wrote in the past, go ahead!
EDIT: I also see this as a part of a larger trend of intelligent people focusing too much on attacking each other instead of doing something meaningful. I understand the game-theoretical reasons for that (often it is easier to get status by attacking other people’s work than presenting your own), but I don’t want to support that trend.
EY is not a baby, and was not a baby in the time period under discussion. He is in his mid thirties today.
I have zero interest in gaining status in the LW/rationalist community. I already won the status tournament I care about. I have no interest in “crabbing” for that reason. I have no interest in being a “guru” to anyone. I am not EY’s competitor, I am involved in a different game.
Whether my being free of the confounding influence of status in this context makes me a more reliable narrator, I will let you decide.
What I am very interested in is decomposing cult behavior into constituent pieces to try to understand why it happens. This is what makes LW/rationalists so fascinating to me—not quite a cult in the standard Scientology sense, but there is definitely something there.
Mid thirties in 2015 means about twenty in 2001 (the date of most of the linked archives), right? That’s halfway to baby from where I am now. Some of my cringeworthy diaries were written in my mid twenties.
This is what makes LW/rationalists so fascinating to me

Welcome to the zoo! Please do not poke the animals with sticks or throw things at them to attract their attention. Do not push fingers or other objects through the fences. We would also ask you not to feed the animals, as it might lead to digestive problems.
It’s an interesting zoo, where all the exhibits think they’re the ones visiting and observing...
The true observers we’ll never know, because by definition they are not commenting here.
Of course :-)
Downvote explanation:
Using a claim of immunity to status and authority games as evidence to assert a claim.
Which is to say, you are using a claim of immunity to status and authority games to assert status and authority.
Yes, that’s right out of my own playbook, too. I welcome anybody who catches me at it to downvote me, and please let me know I’ve done it, as it is an insidious logical mistake that I find impossible to catch myself making.
I don’t understand your objection.
Asserting a claim is not the same thing as asserting status and authority.
I’m not sure what you want from Ilya here. He seems to be describing his motivations in good faith. Do you think he’s lying to gain status? Do you think he’s telling the truth, but gaining status as a side effect, and he shouldn’t do that?
Quick edit: Oh, I should probably have read the rest of the thread. I think I understand your objection now, but I disagree with it.
I am not claiming status and authority (I don’t want it); I am saying EY has a big ego. I don’t think I need status and authority for that, right?
Say I did gain status and authority on LW. What would I do with it? I don’t go to meetups, I hardly interact with the rationalist community in real life. What is this supposed status going to buy me, in practice? I am not trying to get laid. I am not looking to lead anybody, or live in a ‘rationalist house,’ or write long posts read by the community. Forget status, I don’t even claim to be a community member, really.
I care about status in the context relevant to me (my academic community, for example, or my workplace).
Or, to put it simply, you guys are not my tribe. I just don’t care enough about status here.
You’re claiming to have status and authority to make a particular claim about reality—“Outsider” status, a status which gains you, with respect to adjudication of insider status and authority games… status and authority.
Now, your argument could stand or fall on its own merits, but you’ve chosen not to permit this, and instead have argued that you should be taken seriously on the merits of your personal relationship to the group (read: taken to have status and authority relative to the group, at least with respect to this claim).
[edit: I did not downvote anyone in this thread.]
You’re claiming to have status and authority to make a particular claim about reality

I am? Is that how we are evaluating claims now?
Here is how this conversation played out (roughly paraphrased):
me: EY has a big ego.
Viliam: I wish you would stop digging up people’s youthful indiscretions like that. Why not go do impressive things instead, why be a hater?
me: EY wasn’t young in the time period involved. Also, I have my own stuff going on, thanks! Also, I think this EY dynamic isn’t healthy.
you: Argument from status!
me: Don’t really want status here, have my own already.
you: You are claiming status by signaling you don’t want/need status here! And then using that to make claims!
(At this point if I claim status I lose, and if I don’t claim status I also lose.)
Well, look. The grandiose dimensions of EY’s ego are not a secret to anyone who actually knows him, I don’t think. I think Slate Star Codex even wrote something about that.
If you don’t think I am being straight with you, and I am playing some unstated game, that’s ok. If you have time and inclination, you can dig around my post history and try to figure that out if you care. I would be curious what you find.
I think it is fair to call myself an outsider. I don’t self-identify as rationalist, and I don’t get any sort of emotional reaction when people attack rationalists (which is how you know what your tribe is). I don’t think rationalists are evil mutants, but I think unhealthy things are going on in this community. You can listen to people like me, or not. I think you should, but ultimately your beliefs are your own business. I am not going to spend a ton of energy convincing you.
If you don’t think I am being straight with you, and I am playing some unstated game, that’s ok.

I think you’re being as completely straight and honest as you are humanly capable of being. I think you also overestimate the degree to which you’re capable of being straight and honest. What’s your straightest and most honest answer to the question of what probability you assign to the possibility that your actions can be influenced by subconscious status concerns?
Which is to say: Status games are a bias. You’re claiming to be above bias. I believe you believe that, but I don’t believe that.
Please elaborate.
As I said, I don’t think rationalists are actually a cult in the way that Scientology is a cult. But I think there are some cult-like characteristics to the rationalist movement (and a big part of this is EY’s position in the movement).
And I think it would be a good idea for the movement to become more like colleagues, and less like what they are now. What I find somewhat disappointing is that both EY and a fair part of the rank and file like things as they are.
I don’t know if this matters. I don’t particularly care for the Sequences, but that hasn’t caused me any problems at all. LessWrong has been an easy site to get into and to learn from, and would be even if I never read anything by EY. (This seems to be true for most aspects of the site; LessWrong is useful even if you don’t care about AIs, transhumanism, cybernetics, effective altruism… there’s enough here that you can find plenty to learn.)
You may be seeing the problem as bigger than it is because of the lens that you are looking through, although I agree that charisma is an interesting thing to study, and was central to the development of the site.
It’s not just LW, it’s the invisible social organization around it.
“Culty” dynamics matter. It’s dangerous stuff to be playing with.
Bask in the glory? :-)
You might be an exception, but empirically speaking people tend to value their status in online communities, including communities whose members they will never meet in meatspace and which have no effect on their work/personal/etc. life.
Biologically hardwired instincts are hard to transcend :-/
I think one difference is, I am a bit older than a typical LW member, and have someplace to “hang my hat” already. As one gets older and more successful, one gets less status-anxious.
Which is why you’re spending time assuring us that you’re high-status?
Ilya’s comments about status could indeed be explained by the hypothesis that he’s attempting some kind of sneaky second-order status manoeuvre. They could also be explained by his meaning what he says and genuinely not caring much (consciously or otherwise) about status here on LW. To me, the second looks at least as plausible as the first.
More precisely: I doubt anyone is ever completely 100% unaffected by status considerations; the question is how much; Ilya’s claim is that in this context the answer is “negligibly”; and I suggest that that could well be correct.
You may be correct to say it isn’t. But if so, it isn’t enough just to observe that someone motivated by status might say the things Ilya has, because so might someone who in this context is only negligibly motivated by status. You need either to show us something Ilya’s doing that’s substantially better explained in status-seeking terms, or else give a reason why we should think him much more likely to be substantially status-seeking than not a priori.
[EDITED to add: I have no very strong opinion on whether and to what degree Ilya’s comments here are status manoeuvres.]
He literally wrote plans for what he would do with the billions of dollars the Singularity Institute would be bringing in by 2005, using the words ‘silicon crusade’ to describe its actions to bring about the singularity and an interstellar supercivilization by 2010, so as to avoid the apocalyptic nanotech war that would have started by then without their guidance. He also went on and on and on about his SAT scores in middle school (which are lower than those of one of my friends, taken via the same program at the same age) and how they proved he is a mutant supergenius who is the only possible person who can save the world.
I don’t think EY’s ego got any smaller with time. I am distinctly unimpressed.