If AI in general or uFAI in particular becomes a politicized issue (not quite identical to “controversial”) to the extent that AGW now is, I suspect it’ll be grandfathered in here by the same mechanism that religion now is; it’s too near and dear a topic to too many critical community members for it to ever be entirely dismissed. However, its relative prominence might go down a notch or two; moves to promote this may already be happening, given the Center for Modern Rationality’s upcoming differentiation from SIAI.
As to applied rationality and AGW: I view agreement with the mainstream climatology position as weak but positive evidence of sanity. However, I don’t view it as particularly significant to the LW mission in a direct sense, and I think taking a vocal position on the subject would likely lower the sanity waterline by way of scaring off ideologically biased folks who might be convinced to become less ideologically biased by a consciously nonpartisan approach. There’s a lot more to lose here, rationality-wise, than there is to gain.
Well, that’s too bad then. I came to post there after reading the Eliezer posts on the many worlds interpretation, where he tried to debunk the SI (now that really polarizes people, even though it’s not linked to politics: trying to debunk a well-established method that works). He is somewhat sloppy at quantum mechanics, and makes some technical errors, but it is very good content nonetheless that I really enjoyed. I don’t enjoy the meta-meta so much.