Thank you for your thoughtful reply, although, as will be evident, I’m not sure I actually got my point across.
(But why wouldn’t they as well, if they’re “smart”?)
It’s not clear that willingness to listen to strange-sounding claims exhibits correlation with instrumental rationality,
I didn’t realize at all that by “smart” you meant “instrumentally rational”; I was thinking rather more literally in terms of IQ. And I would indeed expect IQ to correlate positively with what you might call openness. More precisely, although I would expect openness to be only weak evidence of high IQ, I would expect high IQ to be more significant evidence of openness.
People who are willing to listen to strange-sounding claims statistically end up hanging out with UFO conspiracy theorists, New Age people, etc...
Why can’t they just read the darn sequences and pick up on the fact that these people are worth listening to?
See my remarks above.
The point of my comment was that reading his writings reveals a huge difference between Eliezer and UFO conspiracy theorists, a difference that should be more than noticeable to anyone with an IQ high enough to be in graduate school in mathematics. Yes, of course, if all you know about a person is that they make strange claims, then you should by default assume they’re a UFO/New Age type. But I submit that the fact that Eliezer has written things like these decisively entitles him to a pass on that particular inference, and anyone who doesn’t grant it to him just isn’t very discriminating.
And I would indeed expect IQ to correlate positively with what you might call openness.
My own experience is that the correlation is not very high. Most of the people I’ve met who are as smart as me (e.g. in the sense of having high IQ) are not nearly as open as I am.
I didn’t realize at all that by “smart” you meant “instrumentally rational”;
I did not intend to equate intelligence with instrumental rationality. The reason why I mentioned instrumental rationality is that ultimately what matters is to get people with high instrumental rationality (whether they’re open minded or not) interested in existential risk.
My point is that people who are closed-minded should not be barred from consideration as potentially useful existential risk researchers: although people are being irrational to dismiss Eliezer as quickly as they do, that doesn’t mean they’re holistically irrational. My own experience has been that my openness has both benefits and drawbacks.
The point of my comment was that reading his writings reveals a huge difference between Eliezer and UFO conspiracy theorists, a difference that should be more than noticeable to anyone with an IQ high enough to be in graduate school in mathematics.
Math grad students can see a huge difference between Eliezer and UFO conspiracy theorists—they recognize that Eliezer’s intellectually sophisticated. They’re still biased to dismiss him out of hand. See bentram’s comment.
Edit: You might wonder where the bias to dismiss Eliezer comes from. I think it comes mostly from conformity, which is, sadly, very high even among very smart people.
My point is that people who are closed minded should not be barred from consideration as potentially useful existential risk researchers
You may be right about this; perhaps Eliezer should in fact work on his PR skills. At the same time, we shouldn’t underestimate the difficulty of “recruiting” folks who are inclined to be conformists; unless there’s a major change in the general sanity level of the population, x-risk talk is inevitably going to sound “weird”.
Math grad students can see a huge difference between Eliezer and UFO conspiracy theorists—they recognize that Eliezer’s intellectually sophisticated. They’re still biased to dismiss him out of hand.
At the same time we shouldn’t underestimate the difficulty of “recruiting” folks who are inclined to be conformists; unless there’s a major change in the general sanity level of the population, x-risk talk is inevitably going to sound “weird”.
I agree with this. It’s all a matter of degree. Maybe at present one has to be in the top 1% of the population in nonconformity to be interested in existential risk, and with better PR one could reduce the level of nonconformity required to the top 5%.
(I don’t know whether these numbers are right, but this is the sort of thing that I have in mind—I find it very likely that there are people who are nonconformist enough to potentially be interested in existential risk but too conformist to take it seriously unless the people who are involved seem highly credible.)
Edit: You might wonder where the bias to dismiss Eliezer comes from. I think it comes mostly from conformity, which is, sadly, very high even among very smart people.
I would perhaps expand ‘conformity’ to include neighbouring social factors—in-group/out-group dynamics, personal affiliations/alliances, territorialism, etc.
One more point—though I could immediately recognize that there’s something important to some of what Eliezer says, the fact that he makes outlandish claims did make me take longer to get around to thinking seriously about existential risk. This is because of a factor that I mention in my post; I quote the relevant passage below.
There is also a social effect which compounds the issue just mentioned: even people who are not directly influenced by it become less likely to think seriously about existential risk, on account of their desire to avoid being perceived as associated with claims that people find uncredible.
I’m not proud that I’m so influenced, but I’m only human. I find it very plausible that there are others like me.
You may be right about this; perhaps Eliezer should in fact work on his PR skills. At the same time, we shouldn’t underestimate the difficulty of “recruiting” folks who are inclined to be conformists; unless there’s a major change in the general sanity level of the population, x-risk talk is inevitably going to sound “weird”.
This is a problem; no question about it.