Epistemic status: Do not defer to me. I’m here to provide interesting arguments and patterns that may help enlighten our understanding of things. I’m not optimising my conclusions for being safe to defer to. (It’s the difference between minimising false-positives vs minimising false-negatives.)
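The false-positive/false-negative distinction here can be made concrete with a toy decision-threshold model. This is purely illustrative (the confidence scores and distributions are invented): asserting only claims above some confidence threshold trades wrongly-asserted claims (false positives) against suppressed true claims (false negatives), and "being safe to defer to" corresponds to a high threshold.

```python
import random

random.seed(0)

# Hypothetical claims: (confidence, actually_true). True claims tend to
# come with higher confidence, but the distributions overlap.
claims = [(random.gauss(0.6, 0.2), True) for _ in range(500)] + \
         [(random.gauss(0.4, 0.2), False) for _ in range(500)]

def error_rates(threshold):
    """Assert every claim with confidence >= threshold; return
    (false-positive rate, false-negative rate)."""
    fp = sum(1 for conf, true in claims if conf >= threshold and not true)
    fn = sum(1 for conf, true in claims if conf < threshold and true)
    return fp / 500, fn / 500

for t in (0.3, 0.5, 0.7):
    fp, fn = error_rates(t)
    print(f"threshold={t}: FP rate {fp:.2f}, FN rate {fn:.2f}")
```

Raising the threshold can only lower the false-positive rate and raise the false-negative rate; the "don't defer to me" stance is choosing a lower threshold on purpose.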
A high bar for adopting new jargon does not conflict with a low bar for suggesting jargon. In fact, if more jargon is suggested, I expect a lower proportion of suggestions to be adopted, and the average quality of the winning jargon to go up.
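The claim above is a selection effect, and a minimal simulation shows the mechanism. Everything here is an invented assumption for illustration: suppose each suggested term has a quality drawn uniformly at random, and the community reliably adopts only the best few per period. Then more suggestions mean a lower adoption proportion but higher average quality among the winners.

```python
import random

random.seed(1)

def winners_quality(n_suggestions, k_adopted=5, trials=2000):
    """Average quality of the k adopted terms when n are suggested,
    assuming (hypothetically) quality ~ Uniform(0, 1) and that the
    community adopts exactly the top k candidates."""
    total = 0.0
    for _ in range(trials):
        qualities = sorted(random.random() for _ in range(n_suggestions))
        total += sum(qualities[-k_adopted:]) / k_adopted
    return total / trials

few, many = winners_quality(10), winners_quality(50)
print(f"avg winner quality: 10 suggested -> {few:.2f}, 50 suggested -> {many:.2f}")
print(f"adoption proportion: {5 / 10:.0%} vs {5 / 50:.0%}")
```

The model's strong assumption is that adoption reliably selects for quality; to the extent it selects for something else (e.g. catchiness), the argument weakens accordingly.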
There’s a speed/scope tradeoff here similar to the Zollman effect. If what you care about is that your subculture advances in idea space asap, then adopting words faster could be good. If instead it matters that the culture be accessible to a greater number of people, then a strong prior against jargon seems better. I care more about the progress of the subculture than I do about its breadth, at least on the current margin as I see it.
Point 2 above assumes, for the sake of argument, that you are correct that accessibility drops the more jargon there is. But I don’t think the case for that is very strong. I think well-placed jargon makes ideas and whole paradigms a lot easier to learn, and therefore more accessible. Furthermore, I think it’s worth distinguishing between community-accessibility and idea-accessibility. A lot of jargon makes it harder to participate in the community without a deeper understanding of the ideas, but it also makes those ideas easier to understand in the first place. The net effect is probably a clearer separation between people inside and outside the community, with fewer gradients in between.
Re your concern about epistemic closure (thanks for the jargon btw!)[1]: I think if the community had more widespread, healthier epistemic norms, we would be more effective at adopting new ideas and reasoning rationally about outside perspectives, and more willing to change our ways en masse if an outside perspective is actually better. The rationality community is qualitatively different than other communities because prestige is to a large extent determined by one’s ability to avoid epistemic failure modes, such that the usual “cult warning signs” apply to a lesser degree.
Saying heretical stuff here, I know, but I did disclaim deferral status in the first sentence, so should be safe : )
Final point: bureaucratic acronyms are just way terribler than Internet-slang acronyms, tho! :p
This is not meant to be a critique, I just found the irony a little funny. I appreciate your comment, and learning about epistemic closure from the link.
If what you care about is that your subculture advances in idea space asap, then adopting words faster could be good. If instead it matters that the culture be accessible to a greater number of people, then a strong prior against jargon seems better. I care more about the progress of the subculture than I do about its breadth, at least on the current margin as I see it.
If there were any evidence of that, I would be far more willing to tolerate the increased coinage of acronyms and jargon. But, in my experience, the increased coinage of acronyms and jargon is a sign of the opposite occurring. It’s a sign of ossification and an inward turn as practitioners of the field become more concerned with talking with each other and forming status hierarchies than with actually advancing the field.
I think well-placed jargon makes ideas and whole paradigms a lot easier to learn, and therefore more accessible.
The phrase “well-placed” is doing a lot of work here. The natural incentives around jargon creation do not favor making jargon that simplifies concepts and makes them easier for outsiders to learn. The natural incentives around jargon creation favor making concepts more opaque, such that demonstrating mastery of the jargon serves as a form of status marker, distinguishing insiders from pretenders.
The rationality community is qualitatively different than other communities
It isn’t. The rationality community is a community of smart, literate, eloquent people who are working to understand and improve both individual and collective decision-making, especially as it relates to artificial intelligence and related technologies. Qualitatively, we’re not different from other groups of smart, literate, eloquent people who are grappling with other difficult ideas. I don’t see much of a difference, for example, between the rationality community and the foreign policy community.
prestige is to a large extent determined by one’s ability to avoid epistemic failure modes, such that the usual “cult warning signs” apply to a lesser degree.
Absolutely false. The usual cult warning signs apply (and should apply) with exactly the same severity to the rationality community as they do to any other. And when they don’t, we end up with things like Leverage Research, which are, at best, ineffectual and, at worst, harmful to participants.
Thank you for this discussion btw, this is helpfwl. I suspect it’s hitting diminishing returns unless we home in on practical specifics.
I think our respective levels of faith in the rationality community are a crux. Here’s what I think, though I would stress again that while I tentatively believe what I say, I am not trying to be safe to defer to. Thus, I omit disclaimers and caveats and instead try to provide perspectives for evaluation. I think this warning is especially prudent here.
We have a really strong jargon hit-rate
The “natural incentives around jargon creation” in most communities favour usefwlness much less than they do in this community. I can think of some examples of historically bad jargon:
“Politics is the mind killer” (not irredeemably bad, but net bad nonetheless imo)
“Bayesian”
Not confident here, but I think the term expanded too far from its roots, and was overemphasised. This could be prevented either by an increased willingness to coin new terms for neighbouring semantic space, or an increased unwillingness to expand the use of existing shibboleths to new things.
“NPC” (non-player character)
Not irredeemable, but questionable net value.
Probably more here but I can’t recall.
I think our hit-rate so far on jargon has been remarkably good. Even under the assumption that increased coinage reduces accuracy (which I weakly disagree with), it seems plausible that, on the margin, it will take us closer to the Pareto frontier.
I am less worried about becoming marginally more insular
Our collective project is exceedingly dangerous. We’re deactivating our memetic immune system and fumbling towards deliberate epistemic practices that we hope can make up for it. I think rationality education must consist of lowering intuitive defenses in tandem with growing epistemological awareness. And in cases where this education is out of sync, it produces victims.
But I’d be wary of updating too much on Leverage as an indictment of rationality culture in general. That kind of defensiveness is the same mechanism by which hospitals get bureaucrified—they’re minimising false-positives at the cost of everything else.
I suspect that, given our community’s cultural inclination against these failure modes, the greater risk is that our epistemic norms weaken through widespread social integration with other cultures.
I also think, more generally, that norms/advice that were necessary early on, could nowadays actively be hampering our progress. “Be less sure of yourself, seek wisdom from outside sources, etc.” is necessary advice for someone just starting out on the path, but at some point your wisdom so far exceeds outside sources that the advice hits diminishing returns—tune yourself to where you sniff out value of information, whether that be insular or not.