If Eliezer showed strong ability to attract and work well with collaborators (including elite academics who are working on artificial intelligence research) then I would find it several orders of magnitude more likely that he would make a crucial contribution to an FAI research project. For concreteness I’ll throw out the number 10^(-6).
This, along with your other estimate of 10^(-9), implies that your probability for Eliezer being able to eventually attract and work well with collaborators is currently 1/1000. Does that really seem reasonable to you (would you be willing to bet at those odds?), given other evidence besides your private exchange with Eliezer? Such as:
Eliezer already had a close collaborator, namely Marcello
SIAI has successfully attracted many visiting fellows
SIAI has successfully attracted top academics to speak at their Singularity Summit
Eliezer is currently writing a book on rationality, so presumably he isn’t actively trying to recruit collaborators at the moment
Other people’s reports of not finding Eliezer particularly difficult to work with
It seems to me that rationally updating on Eliezer’s private comments couldn’t have resulted in such a low probability. So I think a more likely explanation is that you were offended by the implications of Eliezer’s dismissive attitude towards your comments.
(Although, given Eliezer’s situation, it would probably be a good idea for him to make a greater effort to avoid offending potential supporters, even if he doesn’t consider them to be viable future collaborators.)
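(Spelling out the arithmetic behind the 1/1000 figure, on the assumption that 10^(-9) is meant as the unconditional probability of a crucial contribution and 10^(-6) as the probability conditional on attracting collaborators, with the probability absent collaborators treated as negligible:

P(contribution) ≈ P(contribution | collaborators) × P(collaborators)
10^(-9) ≈ 10^(-6) × P(collaborators)
P(collaborators) ≈ 10^(-9) / 10^(-6) = 10^(-3) = 1/1000

Strictly, the law of total probability makes 1/1000 an upper bound; it is an equality only insofar as the probability of a crucial contribution without collaborators is negligible.)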
My subjective perception is that I started out thinking quite carefully and became less rational as I read and responded to hostile commentators.
Your responses to me seem pretty level-headed and sober. I hope that means you don't find my comments too hostile.
This, along with your other estimate of 10^(-9), implies that your probability for Eliezer being able to eventually attract and work well with collaborators is currently 1/1000. Does that really seem reasonable to you (would you be willing to bet at those odds?)
Thinking it over, my estimate of 10^(-6) was way too high. This isn't because of a lack of faith in Eliezer's abilities in particular. I would return to my remark above that I think everybody has a very small probability of succeeding in efforts to eliminate existential risk. We're part of a complicated chaotic dynamical system, and to a large degree our cumulative impact on the world is unintelligible and unexpected (because of a complicated network of unintended consequences, side effects, side effects of the side effects, etc.).
Your responses to me seem pretty level-headed and sober. I hope that means you don't find my comments too hostile.
Glad to hear it :-)