The very concept of a “rationalist” is an egregious one! What is a rationalist, really? The motte: “one who studies the methods of rationality, systematic methods of thought that result in true beliefs and goal achievement”. The bailey: “a member of the social ingroup of Eliezer Yudkowsky and Scott Alexander fans, and their friends.”
Yeah, this already bothered me some, but your way of putting it here makes it bother me more.
I think the motte/bailey often runs in the other direction, though, for modesty-ish reasons: there’s a temptation to redefine ‘rationalist’ as a social concept, because it looks more humble to say ‘I’m in social circle X’ than to say ‘I’m part of important project X’ or ‘I’m a specialist in X’, when you aren’t doing X-stuff as part of a mainstream authority like academia.
I think there are two concepts I tend to want labels for, which I sometimes use ‘rationalist’ to refer to (though I hope I’m not switching between these in a deceptive/manipulative way!):
‘One who successfully develops and/or applies the methods of getting systematically better at mapping reality (and, optionally, steering reality) from inside a human brain.’
‘One who is highly acquainted with the kinds of ideas in the Sequences, and with related ideas that have been a major topic on LW (e.g., the availability heuristic and reductionism and conservation of expected evidence, but also ems, tractability/importance/neglectedness, updateless decision theory, ugh fields, ideas from Alicorn’s Twilight fanfic...).’
I think the latter concept is pretty useful and natural, though I could maybe be convinced that ‘rationalist’ is a bad name for it.
I think it’s mainly a memetic or cultural concept in my mind, like ‘postmodernist’ or ‘Chicago economist’ or ‘the kind of person who’s been to a coding boot camp’: shibboleths are a big deal for who my brain tags this way, but mere social adjacency isn’t. It’s closer to ‘how much background knowledge can I assume if we start talking about simulation shutdown at a party?’, rather than ‘how much does e.g. Scott Alexander like this person, spend time with them, etc.?’.
Thinking about it more: I can imagine a group that tries to become unusually good at learning true things in a pretty domain-general way, so they call themselves the Learnies, or the Discoverers, or the Trutheteers.
If a group like that succeeds in advancing its art, then it should end up using that art to discover at least some truths about the world at large that aren’t widely known. This should be a reliable consequence of ‘getting good at discovering truths’. And in many such worlds, it won’t be trivial for the group to then convince the entire rest of the world to believe the things they learned overnight. So five years might pass and the Learnies are still the main group that is aware of, say, the medical usefulness of some Penicillium molds (because they and/or the world they’re in is dysfunctional in some way).
It seems natural to me that the Learnies’ accumulated learnings get mentally associated with the group, even though penicillin doesn’t have any special connection to the art of learning itself. So I don’t think there’s anything odd about rationalists₂ being associated with ‘knowledge we accumulated in the process of applying rationality techniques’, or about my wanting a way to figure out to what extent someone at a party has acquired that knowledge. I think this example does illustrate a few issues, though:
First, obviously, it’s important to be able to distinguish ‘stuff the Learnies learned that’s part of the art of learning’ from ‘other stuff the Learnies learned’. Possibly it would help here to have more names for the clusters of nonstandard beliefs rationalists₂ tend to have, so I’m somewhat less tempted to think ‘is this person familiar with the rationalist₂-ish content on ems?’ vs. ‘is this person familiar with the techno-reductionist content on ems?’ or whatever.
Second, I’ve been describing the stuff above as “knowledge”. But what if there’s a dispute about whether the Learnies’ worldly discoveries are true? In that case, maybe it’s logically rude to call yourself ‘Learnies’, because (a) it maybe rhetorically tilts the debate in your favor, and (b) it’s just not very informative. (Example: A world where pro-penicillin people call themselves Penicillists and anti-penicillin people call themselves Anti-Penicillists is better than a world where pro-penicillin people call themselves Accuracyists and anti-penicillin people call themselves Realityists.)
Considerations like this update me toward using the term ‘aspiring rationalist’ more, because it hopefully tilts the debate less toward the presupposition that ‘whoever identifies themselves as a truth-seeker’ is correct about whatever thing they think their rationality techniques have helped them learn.
(Though I think ‘aspiring rationalist’ still tilts the debate some. But, thinking about it again, maybe I actually prefer the world where more people make one flavor or another of ‘truth-seeking’ part of their identity? It might be at least somewhat easier to call people out on bullshit and make them reflect and update, if it were more common for people to take pride in their truth-seeking and knowledgeableness, however unwarranted.)
Then there’s the question of whether the Learnies’ commitment to the art of learning obliges them to never Glomarize or stay-out-of-the-fray about politically controversial things they learn. I take it that something along these lines is your main criticism of rationalists₂ calling themselves ‘rationalists’, but I’m not sure exactly what norm you’re advocating. Currently I don’t find this compelling—like, I do think it’s epistemically risky for groups to selectively avoid talking about important politicized topics, and I think it’s important to try to find ways to counter the resulting epistemic distortions in those cases (and to seriously consider whether staying-out-of-the-fray is just a dumb strategy). But I guess I just disagree with some of the examples you’ve provided, and agree with others but don’t think they’re as serious or as deeply epistemically corrupting as you do? Mostly, I just don’t think I’ve read enough of the stuff you’ve written to understand your full argument; but I can at least give my current epistemic state.
It doesn’t help when Yudkowsky actively encourages this confusion! As he Tweeted today: “Anyways, Scott, this is just the usual division of labor in our caliphate: we’re both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.”
Just—the absolute gall of that motherfucker! I still need to finish my memoir about why I don’t trust him the way I used to, but it’s just so emotionally hard—like a lifelong devout Catholic denouncing the Pope. But what can you do when the Pope is actually wrong? My loyalty is to the truth, not to him.
This doesn’t seem to be about the term ‘rationalist’ at all. It seems to be about which rhetorical style different people prefer: Eliezer makes his points in a much more confident and more polarizing way than Scott does.
In my experience, Scott has an epistemic style where he expects contrary information to exist and seeks it out, whereas Eliezer does not... he’s more prone to early cognitive closure. It’s not just tone, it’s method.
No, not really? I generally ignore anything Scott writes which could be described as ‘agreeing with Yud’—it’s his other work I find valuable, work I wouldn’t expect Yud to write in any style.
I made a similar, but slightly different, argument in “Pseudo-Rationality”:
“Pseudo-rationality is the social performance of rationality, as opposed to actual rationality.”