Calling Eliezer Yudkowsky one of the world’s foremost intellects is the kind of cult-like behaviour that gives LW a bad reputation in some rationalist circles. He is one of the foremost Harry Potter fanfiction authors and a prolific blogger, and he has also authored a few minor papers. He’s a smart guy, but there are a lot of smart guys in the world.
He articulates very important ideas, but so do very many teachers of economics, ethics, philosophy and so on. That does not make them very important people (although the halo effect makes some students think so).
(Edited to spell Eliezer’s name correctly, with thanks for the correction).
Consider a hypothetical world in which Eliezer Yudkowsky actually is, by objective standards, one of the world’s foremost intellects. In such a hypothetical world, would it be “cult-like” behavior to make this claim? And again, in this hypothetical world, do you care about having a bad reputation in alleged “rationalist circles” that do not believe in the objective truth?
The argument seems to be that some “rationalist circles” are so deeply affected by the non-central fallacy (excessive attention to one individual → cult; cult → Kool-Aid) that, in order to avoid alienating them, we should refrain from saying certain things out loud.
I will say this for the record: Eliezer Yudkowsky is sometimes wrong. I often disagree with him. But his basic worldview is fundamentally correct in important ways where the mainstream of intellectuals is wrong. Eliezer has started a discussion that is at the cutting edge of current intellectual discourse. That makes him one of the world’s foremost intellectuals.
In a world where Eliezer is by objective standards X, then in that world it is correct to say he is X, for any X. That X could be “one of the world’s foremost intellectuals” or “a moose” and the argument still stands.
Establishing whether it is objectively true that “his basic world view is fundamentally correct in important ways where the mainstream of intellectuals are wrong” is, I think, beyond the scope of this thread, but the mainstream has good grounds to question both of those sub-claims. Worrying about steep-curve AI development might well be fundamentally incorrect rather than fundamentally correct, for example, and if so then Eliezer is fundamentally wrong. You might also be wrong about what mainstream intellectuals think. For example, the bitter struggle between frequentism and Bayesianism is almost totally imaginary, so endorsing Bayesianism is not going against the mainstream.
Perhaps more fundamentally, literally anything published in the applied analytic philosophy literature is just as much at the cutting edge of current intellectual discourse as Yudkowsky’s work. So your proposed definition fails to pick him out as being special, unless every published applied analytic philosopher is also one of the world’s foremost intellectuals.
My point is that the statement “Eliezer is one of the world’s foremost intellectuals” is a proposition with a truth value. We should argue about the truth value of that proposition, not about how our beliefs might affect our status in the eyes of another rationalist group, particularly if that “rationalist” group assigns status based on obvious fallacies.
I assign a high prior probability to the statement. If I didn’t, I wouldn’t waste my time on Less Wrong. I believe this is also true for many of the other participants, who just don’t want to say it out loud. You can argue that we should hide our true beliefs in order to avoid signaling low status, but given how seriously we take this website, it would be very difficult to send a credible signal. To most intelligent observers, it would be obvious that we are sending a false signal for status reasons, which is inconsistent with our own basic standards for discussion.
It’s a proposition with a truth value in a sense, but if we are disagreeing about the topic then it seems most likely that the term “one of the world’s foremost intellectuals” is ambiguous enough that elucidating what we mean by the term is necessary before we can worry about the truth value.
Obviously I think that the truth value is false, and obviously enough that it needs little further argument to establish the implied claims: that it is rational to regard calling Eliezer “one of the world’s foremost intellectuals” as cult-like, and that it is rational to place a low value on a rationalist forum if it is cult-like.
So the question is how you define “one of the world’s foremost intellectuals”. I take it to pick out a very small group of very elite thinkers, typically people in their fifties or later with outstanding careers who have made major contributions to human knowledge or ethics.
I agree with what you said, but I think you should do him the courtesy of spelling his name correctly. (Yudkowsky.)