I agree that LessWrong is not an exclusive source of most/all ideas found here.
I think this means less than (I think) you are trying to suggest. For example, right now I am reading a textbook on set theory, and although I am pretty sure that 99-100% of the information there could also be found in other sources, that is not a sufficient reason to throw the textbook away. There are other possible advantages, such as being more accessible, putting all the information in the same place, and showing the connections between ideas.
Another important thing is what is not included. Like, if you show me a set of people who read “Thinking Fast and Slow” and “Predictably Irrational”, I would expect that many of them have also enjoyed reading Malcolm Gladwell and Nassim Taleb, etc. You know, these things are a genre, and yes, if you read a lot of this genre, you will be familiar with the good ideas in the LW Sequences. But the genre also comes with a lot of strong beliefs that do not replicate. (Talking for 10 minutes with someone who reads Taleb’s tweets regularly makes me want to scream.)
Then, there is the community. Reading the books is nice, but then I typically want to discuss them with someone. In the extreme case, to discuss how the things we learned could be applied to improve our everyday lives. (And again, what is excluded is just as important as what is included.)
But the genre also comes with a lot of strong beliefs that do not replicate. (Talking for 10 minutes with someone who reads Taleb’s tweets regularly makes me want to scream.)
By this criterion, absolutely no one should be using LessWrong as a vehicle for learning. The Malcolm Gladwell reader you proposed might have been a comparable misinformation vehicle in, say, 2011, but as of 2022 LessWrong is worse about this by a chasmic margin. It’s debatable whether the average LessWrong user even reads what they’re talking about anymore.
I can name a real-life example: in a local discord of about 100 people, Aella argued that the MBTI is better understood holistically under the framework of Jungian psychology, and that looking at the validity of each subtest (e.g. “E/I”, “N/S”, “T/F”, “J/P”) is wrongly reductive. This is not just incorrect, it is the opposite of true; it fundamentally misunderstands what psychometric validity even is. I wrote a fairly long correction of this, but I am not sure anyone bothered to read it — most people will take what community leaders say at face value, because the mission statement of the ingroup of LessWrong is “people who are rational” and the thinking goes that someone who is rational, surely, would have taken care of this. (This was not at all the case.)
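For concreteness on what “looking at the validity of each subtest” actually means in psychometrics: scales are evaluated one at a time, e.g. by checking the internal consistency of the items feeding into a single dimension. A minimal sketch of one such statistic, Cronbach’s alpha, with entirely made-up respondent data (the items and scores are hypothetical, not real MBTI data):

```python
from statistics import pvariance


def cronbach_alpha(responses):
    """Internal consistency of a single scale.

    responses: one list per respondent, each holding that respondent's
    scores on the k items of one scale (e.g. the items behind "E/I").
    """
    k = len(responses[0])
    # Variance of each item's scores across respondents.
    item_vars = [pvariance([r[i] for r in responses]) for i in range(k)]
    # Variance of each respondent's total score on the scale.
    total_var = pvariance([sum(r) for r in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)


# Hypothetical scores of 4 respondents on a 3-item scale.
consistent = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [1, 2, 1]]
print(round(cronbach_alpha(consistent), 2))  # → 0.96, items move together
```

The point of the sketch is just that this kind of check is inherently per-scale: you cannot ask whether “the MBTI as a whole” is internally consistent without asking it of each dimension separately.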
I don’t think further examples will help, but they are abundant throughout this sphere; there is a reason I spent 30 minutes of that audio debunking the pseudoscientific and even quasi-mystical beliefs common to Alexander Kruel’s sphere of influence.
Aella argued that the MBTI is better understood holistically under the framework of Jungian psychology, and that looking at the validity of each subtest (e.g. “E/I”, “N/S”, “T/F”, “J/P”) is wrongly reductive. This is not just incorrect, it is the opposite of true; it fundamentally misunderstands what psychometric validity even is.
I 100% agree.
From my perspective, the root of the problem is that we do not have clear boundaries for {rationality, Less Wrong, this kind of stuff}. Because if I tell you “hey, from my perspective Aella is not a community leader, and if she posted such claims on LW they would get downvoted”, on one hand, I sincerely mean it, and this was the reply I originally wanted to write; on the other hand, I would understand if that seemed like a motte-and-bailey definition of the rationalist community… whenever someone says something embarrassing, we quickly redefine the rationalist community to mean “not this person, or at least not this specific embarrassing statement”. Especially considering that Aella is known to many people in the rationalist community, and occasionally posts on Less Wrong (not about MBTI though).
I would more strongly object against associating LW with “quasi-mystical beliefs common to Alexander Kruel’s sphere of influence”. I mean, Alexander Kruel is like the #2 greatest hater of Less Wrong (the #1 place belongs to David Gerard), so it does not make any sense to me to blame his opinions on us.
Scott uses the term “rationalist-adjacent” for people who hang out with actual LW members and absorb some of their memes, but also disagree with them in many ways. So, from my perspective, “rationalist proper” is what is written on Less Wrong (both the articles and the comments), plus most of what Eliezer or Scott write in other places; and “rationalist adjacent” is the comment section of ACX, discord, meetups, etc., including Aella (and also—although I hate to admit it—Alexander Kruel).
I agree that the “rationalist adjacent” sphere is full of pseudoscientific bullshit. Not sure if strictly worse, but definitely not better than Malcolm Gladwell. :(
I am not sure what to do about this. I am pretty sure that many other communities have a similar problem, too. You have a few “hardcore members”, and then you have many people who enjoy hanging out with them… how should you properly explain that “these things are popular among the people who hang out with us, but they are actually not popular among the hardcore members”?
It seems to me that people who enjoy reading the { Gladwell, Taleb, Ariely, Kahneman… } genre are usually happy when they find Less Wrong. It is yet another source of insight porn they can add to their collection and expand their vocabulary with. The only problem is the intolerance of Less Wrong toward some ideas (such as religion), but this is not a problem if you simply cherry-pick the parts you like. Those people usually do not identify as rationalists.
Now we would need a nice way to signal that a certain forum is full of people familiar with the rationalist memes, but despite that, most of them are not rationalists. Such as the ACX comment section, and probably the discord you mentioned.
1. https://www.amazon.com/Cambridge-Handbook-Reasoning-Handbooks-Psychology/dp/0521531012
2. https://www.amazon.com/Rationality-What-Seems-Scarce-Matters/dp/B08X4X4SQ4
3. https://www.amazon.com/Cengage-Advantage-Books-Understanding-Introduction/dp/1285197364
4. https://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman/dp/0374533555
5. https://www.amazon.com/Predictably-Irrational-audiobook/dp/B0014EAHNQ
6. https://www.amazon.com/BIASES-HEURISTICS-Collection-Heuristics-Everything/dp/1078432317
7. https://www.amazon.com/Informal-Logical-Fallacies-Brief-Guide/dp/0761854339
There is very little, with respect to rationality, learned here that will not be learned through these texts.