I’ll state my own experience and perception, since it seems to be different from that of others, as evidenced in both the post and the comments. Take it for what it’s worth; maybe it’s rare enough to be disregarded.
The first time I heard about SIAI—which was possibly the first time I had heard the word “singularity” in the technological sense—was when I first looked at the “About” page on Overcoming Bias, sometime in late 2006 or early 2007, where it was listed as Eliezer Yudkowsky’s employer. To make this story short, the whole reason I became interested in this topic in the first place was because I was impressed by EY—specifically his writings on rationality on OB (now known as the Sequences here on LW). Now of course most of those ideas were hardly original with him (indeed many times I had the feeling he was stating the obvious, albeit in a refreshing, enjoyable way) but the fact that he was able to write them down in such a clear, systematic, and readable fashion showed that he understood them thoroughly. This was clearly somebody who knew how to think.
Now, when someone has made that kind of demonstration of rationality, I just don’t have much problem listening to whatever they have to say, regardless of how “outlandish” it may seem in the context of most human discourse. Maybe I’m exceptional in this respect, but I’ve never been under the impression that only “normal-sounding” things can be true or important. At any rate, I’ve certainly never been under that impression to such an extent that I would be willing to dismiss claims made by the author of The Simple Truth and A Technical Explanation of a Technical Explanation, someone who understands things like the gene-centered view of evolution and why MWI exemplifies rather than violates Occam’s Razor, in the context of his own professional vocation!
I really don’t understand what the difference is between me and the “smart people” that you (and XiXiDu) know. In fact maybe they should be more inclined to listen to EY and SIAI; after all, they probably grew up reading science fiction, in households where mild existential risks like global warming were taken seriously. Are they just not as smart as me? Am I unusually susceptible to following leaders and joining cults? (Don’t think so.) Do I simply have an unusual personality that makes me willing to listen to strange-sounding claims? (But why wouldn’t they as well, if they’re “smart”?)
Why can’t they just read the darn sequences and pick up on the fact that these people are worth listening to?
I STRONGLY suspect that there is an enormous gulf between finding out things on your own and being directed to them by a peer.
When you find something on your own (existential risk, cryonics, whatever), you get to bask in your own fortuitousness, and congratulate yourself on being smart enough to understand its value. You get a boost in (perceived) status, because not only do you know more than you did before, you know things other people don’t know.
But when someone else has to direct you to it, it’s much less positive. When you tell someone about existential risk or cryonics or whatever, the subtext is “look, you weren’t able to figure this out by yourself, let me help you”. No matter how nicely you phrase it, there’s going to be resistance because it comes with a drop in status—which they can avoid by not accepting whatever you’re selling. It actually might be WORSE with smart people who believe that they have most things “figured out”.
Thanks for your thoughtful comment.

To make this story short, the whole reason I became interested in this topic in the first place was because I was impressed by EY—specifically his writings on rationality on OB (now known as the Sequences here on LW). Now of course most of those ideas were hardly original with him (indeed many times I had the feeling he was stating the obvious, albeit in a refreshing, enjoyable way) but the fact that he was able to write them down in such a clear, systematic, and readable fashion showed that he understood them thoroughly. This was clearly somebody who knew how to think.
I know some people who have had this sort of experience. My claim is not that Eliezer has uniformly repelled people from thinking about existential risk. My claim is that on average Eliezer’s outlandish claims repel people from thinking about existential risk.
Do I simply have an unusual personality that makes me willing to listen to strange-sounding claims?
My guess would be that this is it. I’m the same way.
(But why wouldn’t they as well, if they’re “smart”?)
It’s not clear that willingness to listen to strange-sounding claims exhibits correlation with instrumental rationality, or what the sign of that correlation is. People who are willing to listen to strange-sounding claims statistically end up hanging out with UFO conspiracy theorists, New Age people, etc. more often than usual. Statistically, people who make strange-sounding claims are not worth listening to. Too much willingness to listen to strange-sounding claims can easily result in one wasting large portions of one’s life.
Why can’t they just read the darn sequences and pick up on the fact that these people are worth listening to?
See my remarks above.

For my part, I keep wondering how long it’s going to be before someone throws his “If you don’t sign up your kids for cryonics then you are a lousy parent” remark at me, to which I will only be able to say that even he says stupid things sometimes.

(Yes, I’d encourage anyone to sign their kids up for cryonics; but not doing so is an extremely poor predictor of whether or not you treat your kids well in other ways, which is what the term should mean by any reasonable standard).
Given Eliezer’s belief about the probability of cryonics working, and his belief that others should understand that cryonics has a high probability of working, his statement that “If you don’t sign up your kids for cryonics then you are a lousy parent” is not just correct but trivial.
One of the reasons I so enjoy reading Less Wrong is Eliezer’s willingness to accept and announce the logical consequences of his beliefs.
There is a huge gap between “you are doing your kids a great disservice” and “you are a lousy parent”: “X is an act of a lousy parent” to me implies that it is a good predictor of other lousy parent acts.
EDIT: BTW I should make clear that I plan to try to persuade some of my friends to sign up themselves and both their kids for cryonics, so I do have skin in the game...
I’m not completely sure I disagree with that, but do you have the same attitude towards parents who try to heal treatable cancer with prayer and nothing else, but are otherwise great parents?
I think that would be a more effective predictor of other forms of lousiness: it means you’re happy to ignore the advice of scientific authority in favour of what your preacher or your own mad beliefs tell you, which can get you into trouble in lots of other ways.
That said, this is a good counter, and it does make me wonder if I’m drawing the right line. For one thing, what do you count as a single act? If you don’t get cryonics for your first child, it’s a good predictor that you won’t for your second either, so does that count? So I think another aspect of it is that to count, something has to be unusually bad. If you don’t get your kids vaccinated in the UK in 2010, that’s lousy parenting, but if absolutely everyone you ever meet thinks that vaccines are the work of the devil, then “lousy” seems too strong a term for going along with it.
If you don’t get your kids vaccinated in the UK in 2010, that’s lousy parenting, but if absolutely everyone you ever meet thinks that vaccines are the work of the devil, then “lousy” seems too strong a term for going along with it.
True. However, if absolutely everyone you ever meet thinks vaccines are evil except for one doctor and that doctor has science on his side, and you choose not to get your kids vaccinated because of “going along with” social pressures, then “lousy parent” is exactly the right strength of term. And that’s really the case here. Not absolutely everyone thinks cryonics is wrong or misguided. And if you can’t sort the bullshit and wishful thinking from the science, then you’re doing your child a disservice.
If “you” refers to a typical parent in the US, then it’s sensible (but hardly trivial). But it could easily be interpreted as referring to parents who are poor enough that they should give higher priority to buying a safer car, moving to a neighborhood with a lower crime rate, etc.
Eliezer’s writings about cryonics may help him attract more highly rational people to work with him, but will probably reduce his effectiveness at warning people working on other AGI projects of the risks. I think he has more potential to reduce existential risk via the latter approach.
Yes, this is the sort of thing that I had in mind in making my cryonics post—as I said in the revised version of my post, I have a sense that a portion of the Less Wrong community has the attitude that cryonics is “moral” in some sort of comprehensive sense.
If you believe that thousands of people die unnecessarily every single day then of course you think cryonics is a moral issue.
If people in the future come to believe that we should have known that cryonics would probably work, then they might well conclude that our failure to at least offer cryonics to terminally ill children was (and yes I know what I’m about to write sounds extreme and will be off-putting to many) Nazi-level evil.
I’ve thought carefully about this matter and believe that there’s good reason to doubt your prediction. I will detail my thoughts on this matter in a later top level post.

I would like the opportunity to make timely comments on such a post, but I will be traveling until Aug 27th and so request you don’t post before then.

Sure, sounds good.
Also, keep in mind that reading the sequences requires nontrivial effort—effort which even moderately skeptical people might be unwilling to expend. Hopefully Eliezer’s upcoming rationality book will solve some of that problem, though. After all, even if it contains largely the same content, people are generally much more willing to read one book rather than hundreds of articles.
Thank you for your thoughtful reply, although, as will be evident, I’m not quite sure I actually got the point across.
(But why wouldn’t they as well, if they’re “smart”?)
It’s not clear that willingness to listen to strange-sounding claims exhibits correlation with instrumental rationality,
I didn’t realize at all that by “smart” you meant “instrumentally rational”; I was thinking rather more literally in terms of IQ. And I would indeed expect IQ to correlate positively with what you might call openness. More precisely, although I would expect openness to be only weak evidence of high IQ, I would expect high IQ to be more significant evidence of openness.
People who are willing to listen to strange-sounding claims statistically end up hanging out with UFO conspiracy theorists, New Age people, etc...
Why can’t they just read the darn sequences and pick up on the fact that these people are worth listening to?
See my remarks above.
The point of my comment was that reading his writings reveals a huge difference between Eliezer and UFO conspiracy theorists, a difference that should be more than noticeable to anyone with an IQ high enough to be in graduate school in mathematics. Yes, of course, if all you know about a person is that they make strange claims, then you should by default assume they’re a UFO/New Age type. But I submit that the fact that Eliezer has written things like these decisively entitles him to a pass on that particular inference, and anyone who doesn’t grant it to him just isn’t very discriminating.
And I would indeed expect IQ to correlate positively with what you might call openness.
My own experience is that the correlation is not very high. Most of the people who I’ve met who are as smart as me (e.g. in the sense of having high IQ) are not nearly as open as I am.
I didn’t realize at all that by “smart” you meant “instrumentally rational”;
I did not intend to equate intelligence with instrumental rationality. The reason why I mentioned instrumental rationality is that ultimately what matters is to get people with high instrumental rationality (whether they’re open minded or not) interested in existential risk.
My point is that people who are closed minded should not be barred from consideration as potentially useful existential risk researchers, and that although people are being irrational to dismiss Eliezer as fast as they do, that doesn’t mean that they’re holistically irrational. My own experience has been that my openness has both benefits and drawbacks.
The point of my comment was that reading his writings reveals a huge difference between Eliezer and UFO conspiracy theorists, a difference that should be more than noticeable to anyone with an IQ high enough to be in graduate school in mathematics.
Math grad students can see a huge difference between Eliezer and UFO conspiracy theorists—they recognize that Eliezer’s intellectually sophisticated. They’re still biased to dismiss him out of hand. See bentram’s comment.
Edit: You might wonder where the bias to dismiss Eliezer comes from. I think it comes mostly from conformity, which is, sadly, very high even among very smart people.
My point is that people who are closed minded should not be barred from consideration as potentially useful existential risk researchers
You may be right about this; perhaps Eliezer should in fact work on his PR skills. At the same time, we shouldn’t underestimate the difficulty of “recruiting” folks who are inclined to be conformists; unless there’s a major change in the general sanity level of the population, x-risk talk is inevitably going to sound “weird”.
Math grad students can see a huge difference between Eliezer and UFO conspiracy theorists—they recognize that Eliezer’s intellectually sophisticated. They’re still biased to dismiss him out of hand

This is a problem; no question about it.
At the same time we shouldn’t underestimate the difficulty of “recruiting” folks who are inclined to be conformists; unless there’s a major change in the general sanity level of the population, x-risk talk is inevitably going to sound “weird”.
I agree with this. It’s all a matter of degree. Maybe at present one has to be in the top 1% of the population in nonconformity to be interested in existential risk, and with better PR one could reduce the level of nonconformity required to the top 5% level.
(I don’t know whether these numbers are right, but this is the sort of thing that I have in mind—I find it very likely that there are people who are nonconformist enough to potentially be interested in existential risk but too conformist to take it seriously unless the people who are involved seem highly credible.)
Edit: You might wonder where the bias to dismiss Eliezer comes from. I think it comes mostly from conformity, which is, sadly, very high even among very smart people.
I would perhaps expand ‘conformity’ to include neighbouring social factors—in-group/outgroup, personal affiliation/alliances, territorialism, etc.
One more point—though I could immediately recognize that there’s something important to some of what Eliezer says, the fact that he makes outlandish claims did make me take longer to get around to thinking seriously about existential risk. This is because of a factor that I mention in my post which I quote below.
There is also a social effect which compounds the issue which I just mentioned. The issue which I just mentioned makes people who are not directly influenced by the issue that I just mentioned less likely to think seriously about existential risk on account of their desire to avoid being perceived as associated with claims that people find uncredible.
I’m not proud that I’m so influenced, but I’m only human. I find it very plausible that there are others like me.