Thinking about it more: I can imagine a group that tries to become unusually good at learning true things in a pretty domain-general way, so they call themselves the Learnies, or the Discoverers, or the Trutheteers.
If a group like that succeeds in advancing its art, then it should end up using that art to discover at least some truths about the world at large that aren’t widely known. This should be a reliable consequence of ‘getting good at discovering truths’. And in many such worlds, it won’t be trivial for the group to then convince the entire rest of the world, overnight, to believe the things they’ve learned. So five years might pass and the Learnies are still the main group that is aware of, say, the medical usefulness of some Penicillium molds (because they and/or the world they’re in is dysfunctional in some way).
It seems natural to me that the Learnies’ accumulated learnings get mentally associated with the group, even though penicillin doesn’t have any special connection to the art of learning itself. So I don’t think there’s anything odd about rationalists2 being associated with ‘knowledge we accumulated in the process of applying rationality techniques’, or about my wanting a way to figure out to what extent someone at a party has acquired that knowledge. I think this example does illustrate a few issues, though:
First, obviously, it’s important to be able to distinguish ‘stuff the Learnies learned that’s part of the art of learning’ from ‘other stuff the Learnies learned’. Possibly it would help here to have more names for the clusters of nonstandard beliefs rationalists2 tend to have, so that I’m tempted to ask ‘is this person familiar with the techno-reductionist content on ems?’ or whatever, rather than ‘is this person familiar with the rationalist2-ish content on ems?’
Second, I’ve been describing the stuff above as “knowledge”. But what if there’s a dispute about whether the Learnies’ worldly discoveries are true? In that case, maybe it’s logically rude to call yourself ‘Learnies’, because (a) it maybe rhetorically tilts the debate in your favor, and (b) it’s just not very informative. (Example: A world where pro-penicillin people call themselves Penicillists and anti-penicillin people call themselves Anti-Penicillists is better than a world where pro-penicillin people call themselves Accuracyists and anti-penicillin people call themselves Realityists.)
Considerations like this update me toward using the term ‘aspiring rationalist’ more, because it hopefully tilts the debate less toward the presupposition that ‘whoever identifies themselves as a truth-seeker’ is correct about whatever they think their rationality techniques have helped them learn.
(Though I think ‘aspiring rationalist’ still tilts the debate some. But, thinking about it again, maybe I actually prefer the world where more people make one flavor or another of ‘truth-seeking’ part of their identity? It might be at least somewhat easier to call people out on bullshit and make them reflect and update, if it were more common for people to take pride in their truth-seeking and knowledgeableness, however unwarranted.)
Then there’s the question of whether the Learnies’ commitment to the art of learning obliges them to never Glomarize or stay-out-of-the-fray about politically controversial things they learn. I take it that something along these lines is your main criticism of rationalists2 calling themselves ‘rationalists’, but I’m not sure exactly what norm you’re advocating. Currently I don’t find this compelling—like, I do think it’s epistemically risky for groups to selectively avoid talking about important politicized topics, and I think it’s important to try to find ways to counter the resulting epistemic distortions in those cases (and to seriously consider whether staying-out-of-the-fray is just a dumb strategy). But I guess I just disagree with some of the examples you’ve provided, and agree with others but don’t think they’re as serious or as deeply epistemically corrupting as you do? Mostly, I just don’t think I’ve read enough of the stuff you’ve written to understand your full argument; but I can at least give my current epistemic state.