Academia in general is certainly not an adequate community from an epistemic standards point of view, and while small pockets are relatively healthy, none are great. And yes, the various threads of epistemic rationality certainly predated LessWrong, and there were people and even small groups that noted the theoretical importance of pursuing it, but I don’t think there were places that actively advocated that members follow those epistemic standards.
To get back to the main point, while I don’t think that it is necessary for the community to “fulfill each other’s needs for companionship, friendship, etc,” I don’t think that there is a good way to reinforce norms without something at least as strongly affiliated as a club. There is a fine line between club and community, and I understand why people feel there are dangers of going too far, but before LW, few groups seem to have gone nearly far enough in building even a project group with those norms.
By whose epistemic standards? And what’s the evidence for the claim?
Mine, and my experience working in academia. But (with the very unusual exceptions of FHI, GMU’s economics department, and possibly the new center at Georgetown) I don’t think you’d find much disagreement among LWers who interact with academics that academia sometimes fails to do even the obvious, level-one intelligent character things to enable them to achieve their goals.
I think your comment is unnecessarily hedged—do you think that you’d find much disagreement among LWers who interact with FHI/GMU-Econ over whether people there sometimes (vs never) fail to do level-one things?
I think I understand the connotation of your statement, but it’d be easier to understand if you strengthened “sometimes” to a stronger statement about academia’s inadequacy. Certainly the rationality community also sometimes fails to do even the obvious, level-one intelligent character things to enable them to achieve their goals—what is the actual claim that distinguishes the communities?
That’s a very good point, I was definitely unclear.
I think that the critical difference is that in epistemically healthy communities, when such a failure is pointed out, some effort is spent in identifying and fixing the problem, instead of pointedly ignoring it despite efforts to solve the problem, or spending time actively defending the inadequate status quo from even Pareto-improving changes.
Oh, I see, so your complaint is about instrumental rationality. Well, naturally they’re bad at that. Most people are. You don’t get good at doing things by studying rationality in the abstract. EY couldn’t succeed in spending $300k of free money on producing software to his exact specifications.
I was thinking more of epistemic rationality, having given up on instrumental rationality.
I don’t think they get epistemic rationality anywhere near correct either. As a clear and simple example, there are academics currently vigorously defending their right not to pre-register empirical studies.
And even among those who do pre-register, nobody puts down their credence for the likelihood that there’s an effect.