I first came across Overcoming Bias in 2008. Eliezer was recommended to me by a friend whom I respect a great deal. My reaction to the first postings of his that I read was strong discomfort with his apparent grandiosity and self-absorption. This discomfort was sufficiently strong for me to lose interest despite my friend’s endorsement.
I’d be really interested to know which posts these were, because it would help me to distinguish between the following interpretations:
(1) First impressions really do matter: even though you and I are probably very similar in many respects, we have different opinions of Eliezer simply because in the first posts of his I read, he sounded more like a yoga instructor than a cult leader; whereas perhaps the first thing you read was some post where his high estimation of his abilities relative to the rest of humanity was made explicit, and you didn’t have the experience of his other writings to allow you to “forgive” him for this social transgression.
(2) We have different personalities, which cause us to interpret people’s words differently: you and I read more or less the same kind of material first, but you just interpreted it as “grandiose” whereas I didn’t.
What’s interesting in any case is that I’m not sure that I actually disagree with you all that much about Eliezer having a small chance of success (though I think you quantify it incorrectly with numbers like 10^(-9) or 10^(-6) -- these are way too small). Where we differ seems to be in the implications we draw from this. You appear to believe that Eliezer and SIAI are doing something importantly wrong, that could be fixed by means of a simple change of mindset, and that they shouldn’t be supported until they make this change. By contrast, my interpretation is that this is an extremely difficult problem, that SIAI is basically the first organization that has begun to make a serious attempt to address it, and that they are therefore worthy of being supported so that they can increase their efforts in the directions they are currently pursuing and potentially have a larger impact than they otherwise would.
I’ve been meaning to ask you: given your interest in reducing existential risk, and your concerns about SIAI’s transparency and their general strategy, have you considered applying to the Visiting Fellows program? That would be an excellent way not only to see what it is they do up close, but also to discuss these very issues in person at length with the people involved in SIAI strategy—which, in my experience, they are very interested in doing, even with short-term visitors.
I’d be really interested to know which posts these were, because it would help me to distinguish between the following interpretations:
Right, so the first posts that I came across were Eliezer’s Coming of Age posts, which I think are unrepresentatively self-absorbed. So I think that the right interpretation is the first one that you suggest.
What’s interesting in any case is that I’m not sure that I actually disagree with you all that much about Eliezer having a small chance of success (though I think you quantify it incorrectly with numbers like 10^(-9) or 10^(-6) -- these are way too small). Where we differ seems to be in the implications we draw from this. You appear to believe that Eliezer and SIAI are doing something importantly wrong, that could be fixed by means of a simple change of mindset, and that they shouldn’t be supported until they make this change. By contrast, my interpretation is that this is an extremely difficult problem, that SIAI is basically the first organization that has begun to make a serious attempt to address it, and that they are therefore worthy of being supported so that they can increase their efforts in the directions they are currently pursuing and potentially have a larger impact than they otherwise would.
Since I made my top level posts, I’ve been corresponding with Carl Shulman who informed me of some good things that SIAI has been doing that have altered my perception of the institution. I think that SIAI may be worthy of funding.
Regardless of the merits of SIAI’s research and activities, I think that in general it’s valuable to promote norms of transparency and accountability. I would certainly be willing to fund SIAI if it were strongly recommended by a highly credible external charity evaluator like GiveWell. Note also a comment which I wrote in response to Jasen.
I would like to talk more about these things—would you like to share email addresses? PM me if so.
I’ve been meaning to ask you: given your interest in reducing existential risk, and your concerns about SIAI’s transparency and their general strategy, have you considered applying to the Visiting Fellows program? That would be an excellent way not only to see what it is they do up close, but also to discuss these very issues in person at length with the people involved in SIAI strategy—which, in my experience, they are very interested in doing, even with short-term visitors.
At this point I worry that I’ve alienated the SIAI people to such an extent that they might not be happy to have me. But I’d certainly be willing if they’re favorably disposed toward me.
I’ll remark that back in December, after reading Anna Salamon’s posting on the SIAI Visiting Fellows program, I sent Anna Salamon a long email expressing some degree of interest and describing some of my concerns, without receiving a response. I now find it most plausible that she simply forgot about it and that I should have tried again, but maybe you can understand from this how I got the impression that becoming an SIAI Visiting Fellow was not a strong option for me.
I would like to talk more about these things—would you like to share email addresses? PM me if so.
Done.
I’ll remark that back in December, after reading Anna Salamon’s posting on the SIAI Visiting Fellows program, I sent Anna Salamon a long email expressing some degree of interest and describing some of my concerns, without receiving a response. I now find it most plausible that she simply forgot about it and that I should have tried again, but maybe you can understand from this how I got the impression that becoming an SIAI Visiting Fellow was not a strong option for me.
As it happens, the same thing happened to me; it turned out that my initial message had been caught in a spam filter. I eventually ended up visiting for two weeks, and highly recommend the experience.