This page was written in 2015 and imported from the old LessWrong wiki in 2020. If this page is your first exposure to LessWrong, we recommend starting with Welcome to LessWrong!, which serves as the up-to-date About and Welcome page.
The Purpose of the Introduction to LessWrong
We don’t want new users to be downvoted a lot, and new users don’t enjoy it either. This guide was created to help new users quickly get the gist of the LessWrong subculture and of how participation on the website works. If you came here on your own, that’s excellent: if you attempt to participate on the website without the information in this introduction, there’s a pretty good chance that you’ll be lost.
What is LessWrong?
LessWrong is a website serving a specific rationalist subculture. Its main feature is a community blog. In addition to hosting user-generated posts and articles, the blog hosts LessWrong’s central collection of writings on rationality, called “The Sequences,” written by several authors but primarily by Eliezer Yudkowsky. The name “LessWrong” is also used for the related in-person meetups (“LessWrong Meetups”).
What are The Sequences?
The main document that has influenced the LessWrong subculture is “The Sequences,” written by a variety of authors but mostly by Eliezer Yudkowsky. The main theme of The Sequences is rationality. They draw heavily on research into reasoning mistakes (such as cognitive biases) by researchers like Daniel Kahneman. The two main differences between The Sequences and Kahneman’s work are that Eliezer has a very engaging writing style while Kahneman is notoriously dry, and that Eliezer has taken care to warn readers about a variety of pitfalls involved in learning about cognitive biases. Other themes in The Sequences include artificial intelligence, software engineering, math, science, and atheism.
A Gist of the LessWrong Rationalist Subculture
The main document that has influenced the LessWrong subculture is the Sequences. Their main theme may be rationality, but the Sequences contain many other themes that have influenced the subculture as well. These other themes may be why the subculture has attracted a disproportionate number of software engineers, math- and science-oriented individuals, people with an interest in artificial intelligence, atheists, and so on. More importantly, the Sequences also contain many articles laying out Eliezer’s ideas about rationalist culture. If you have no familiarity with the cultural articles and other themes before you begin interacting, your social experiences are likely to be highly awkward. The rationalist way of thinking and subculture is extremely, extremely complex. To give you a gist of how complex it is and what kind of complexity you’ll encounter:
Imagine being transported to a different country without ever having heard of that country before. You would have little hope of success in that society without first learning about the many differences between the two cultures. That is how different LessWrong culture is from mainstream culture. It’s not just a little different, like the many subcultures where one can mingle undetected by committing to a few specific beliefs, wearing certain clothes, and using a handful of subculture-specific terms. Instead of adopting a specific set of beliefs, rationalists have gone quite a bit further and adopted a different way of choosing beliefs. Instead of focusing on dress, they have focused on learning. Instead of a few dozen subculture-specific terms, there are hundreds of vocabulary words. There are a few things you should know about what this fundamentally different approach results in, so that you will have a grasp of the scope of the difference:
1. Unlike subcultures that form around politically-oriented positions, rationalists are wary of making commitments to beliefs. If one wants to be rational, one should accept an idea only because there is good evidence that the idea is likely to be true, not because one had previously chosen to be “on” a certain “side”. Unlike people in religious groups, rationalists do not accept ideas on faith, even if they are presented by an authority figure. Instead, they learn to consider the specific supports for each idea and determine which ones are most likely to be correct. Unlike many people in the mainstream, rationalists are wary of conforming to beliefs merely because other rationalists promote the beliefs. There is no body of knowledge that rationalists cling to and defend as if “Guarding the Truth”. Instead, rationalist subculture is about discovering and making progress.
There is no holy book, authority, set of political agendas, or set of popular beliefs that we can point you to in order to tell you which beliefs rationalists hold. It would not be in a rationalist’s best interest to define themselves by a set of specific beliefs and then cling to it. Instead, we can point you to the various methods we use for choosing beliefs, such as Bayesianism. Using Bayesian probabilities is considered by many in this subculture to be one of the most fundamental and prominent parts of the reasoning toolbox.
2. The amount of difference between this culture and mainstream culture cannot be expressed well by a short list of differences. When a group’s main difference is a different way of choosing beliefs, the group ends up differing from the mainstream in a very large number of ways, not just a short list. This different way of choosing beliefs is an involved one. There are over a hundred cognitive biases that humans are affected by that rationalists aim to avoid. Imagine you added over one hundred improvements to your way of thinking. In how many future situations would you make a different choice? Imagine at least that many differences when you think about what rationalists are like.
3. It takes a very long time to become good at being rational. To be a really good reasoner, you need to patch over a hundred cognitive biases. Rationalists invest in their rationality because it’s necessary for making good decisions. Good decisions are, of course, necessary for a high degree of success in life, and learning about biases is necessary just to avoid self-destructive decisions. Becoming more rational requires an investment; there is no quick fix, and there is a lot to learn. Until you’ve invested a lot into learning, many of the people you’ll encounter in the subculture will know far more about this than you do. Interacting with this subculture isn’t like talking about a couple dozen bands and enjoying the same music. “Judgment under Uncertainty: Heuristics and Biases,” the collection Kahneman co-edited, is around 600 pages long, and the Sequences would take in the ballpark of 80 hours to read at an average reading speed. Becoming knowledgeable about rationality, and about this specific subculture, is an investment.
To resist the Dunning–Kruger effect (mistakenly believing you know more about a subject than you do, possibly because you simply weren’t clued in to how vast the subject is), and to make the depth and breadth of this subculture seem more real to you, you could begin by browsing the titles of the articles in the Sequences. That’s the closest thing there currently is to an index of the subculture. It can be found here: Sequences.
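The Bayesian updating mentioned in point 1 can be sketched numerically. This is a minimal illustration with invented numbers, not anything prescribed by the Sequences; the function name and example probabilities are ours:

```python
# Minimal illustration of a Bayesian update: revising credence in a
# hypothesis H after observing evidence E, via Bayes' rule:
#   P(H|E) = P(E|H) * P(H) / P(E)

def bayes_update(prior, likelihood, likelihood_if_false):
    """Posterior probability of H given E.

    prior:              P(H), credence before seeing the evidence
    likelihood:         P(E|H), chance of the evidence if H is true
    likelihood_if_false: P(E|~H), chance of the evidence if H is false
    """
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

# Invented example: a claim starts at 1% credence. Evidence that is
# 80% likely if the claim is true, but only 10% likely if it is false,
# raises the credence substantially -- yet nowhere near certainty.
posterior = bayes_update(prior=0.01, likelihood=0.80, likelihood_if_false=0.10)
print(round(posterior, 4))  # → 0.0748
```

The point of the sketch is that evidence shifts credence by an amount that depends on both the prior and on how much more likely the evidence is if the hypothesis is true than if it is false.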
Website Participation Intro or “Why am I being downvoted?”
The main website feature, the community blog, has two areas. One is called “posts” or “discussions”; the other is called “articles” or “main”. Don’t be fooled by the casual-sounding names “posts”/“discussions”: members do not treat that area as a casual place for chatting or as a message board. It is treated more like a community blog. The social norms are:
1. Either write something deemed useful, or go to the open thread.
Many members want to keep up with all of the posts/discussions submissions as well as all of the articles/main submissions, as if keeping up with the news. For this reason, they find it inefficient when there are submissions about minor details, off-topic submissions, submissions on topics that have already been covered, or meta threads (submissions about posting, about the website, about the subculture, etc.). If you want to converse about any of those things, find the most recent post labeled “Open Thread” and put them there. Somebody, often a person using the handle “Open Thread Guy,” makes new open threads in posts/discussions periodically.
2. Meet the quality standard norms in both posts/discussions and articles/main.
Many members expect all submissions in posts/discussions and articles/main to be well-written, and they have very high standards for this. In addition to good spelling and grammar, they like to see that you’ve done your homework: references, mathematical equations, graphs, and subculture vocabulary, along with familiarity with the subculture. They really do not like seeing authors make mistakes that pattern-match to errors such as cognitive biases or logical fallacies. The standards for articles/main are higher than those for posts/discussions, but the standards for posts/discussions are still significantly higher than what you typically see in message board posts elsewhere on the Internet.
3. Write something of quality, even when commenting.
Members have high standards for comments as well, behaving as if they want the entire page, comments included, to be full of new information that is well-presented, well-reasoned, correctly spelled, etc. The standard for posts/discussions is higher than the standard for comments, but the standard for comments is still significantly higher than the standards you typically see on the Internet for message board comments.
4. Your professional face will probably fare better than your casual face.
LessWrong members do not treat the website as a fun hangout, a joke site, or an emotional support forum. To blend in, the best thing you can do is behave more or less the way you would in a professional endeavor. Expect to do some learning before others will accept you. Brush up on cognitive biases and logical fallacies, or you will quickly be viewed as “irrational”. Read about the subculture so that you can anticipate how people will interpret your words and react to your ideas, and so that you can work with those interpretations and reactions intelligently. One exception: anonymous Internet handles are viewed as perfectly acceptable.
5. Don’t expect people to be perfect rationalists, not even yourself.
Above all, remember that nobody is a perfect rationalist. You’re going to make mistakes, and you’re going to find reasoning errors that other members have made. You may not be able to fix other people’s irrationality, but you can keep an eye out for your own mistakes. None of us were taught to think rationally in school, and we’ve all been steeped in beliefs that were grown, defended, selected, and mutated by countless irrational decision-makers. Becoming a group of perfect rationalists would take a very long time and may not be a realistic goal. Our common goal is to refine ourselves, by working together, to become less and less wrong. If you get the urge to tear someone’s reputation to bits, please remember: we’ve all inherited quite the ideological mess, and we’re all working on it together. Don’t expect others to be perfect rationalists. They can’t be perfect, but most of them genuinely want to become more rational.
6. Don’t help us be less wrong too much.
Although it can be extremely tempting, for a variety of reasons, to go around telling people that they’re wrong or starting debates, you should be aware that this behavior is likely to be interpreted as status seeking, and many members frown on social status games. Maybe you feel motivated by some form of altruism along the lines of Randall Munroe’s call to “duty” to step in because “someone is wrong on the Internet” and you want them to be right. Maybe you really do enjoy showing off while making other people feel publicly humiliated. Regardless of whether your motives are altruistic, selfish, or otherwise, please be aware that behaviors resembling these are likely to be perceived as part of a social status game, an attack, or trolling. LessWrong members are of course interested in learning from their mistakes, but they’re also human. If you say things that could insult them, many will feel and/or behave the way insulted humans do. Simply put: this is one of the fastest ways to make yourself unpopular. If you want to increase your status, consider this research instead: Political Skills which Increase Income
I think it’s good that a page like this exists; I’d want to be able to use it as a go-to link when suggesting people engage with or post on LessWrong, e.g. in my post on Notes on EA-related research, writing, testing fit, learning, and the Forum.
Unfortunately, it seems to me that this page isn’t well suited to that purpose. Here are some things that seem like key issues to me (maybe other people would disagree):
This introduction seems unnecessarily intimidating, non-welcoming, and actually (in my perception) somewhat arrogant. For example:
“If you have no familiarity with the cultural articles and other themes before you begin interacting, your social experiences are likely to be highly awkward. The rationalist way of thinking and subculture is extremely, extremely complex. To give you a gist of how complex it is and what kind of complexity you’ll encounter:”
This feels to me like saying “We’re very special and you need to do your homework to deeply understand us before interacting at all with us, or you’re just wasting our time and we’ll want you to go away.”
I do agree that the rationalist culture can take some getting used to, but I don’t think it’s far more complex or unusual than the cultures in a wide range of other subcultures, and I think it’s very often easiest to get up to speed with a culture partly just by interacting with it.
I do agree that reading parts of the Sequences is useful, and that it’s probably good to gently encourage new users to do that. But I wouldn’t want to make it sound like it’s a hard requirement or like they have to read the whole thing. And this passage will probably cause some readers to infer that, even if it doesn’t outright say it. (A lot of people lurk more than they should, have imposter syndrome, etc.)
I started interacting on LessWrong before having finished the Sequences (though I’d read some), and I think I both got and provided value from those interactions.
Part of this is just my visceral reaction to any group saying their way of thinking and subculture is “extremely, extremely complex”, rather than me having explicit reasons to think that that’s bad.
I wrote all of that before reading the next paragraphs, and the next paragraphs very much intensified my emotional feeling of “These folks seem really arrogant and obnoxious and I don’t want to ever hang out with them”
This is despite the fact that I’ve actually engaged a lot on LessWrong, really value a lot about it, rank the Sequences and HPMOR as among my favourite books, etc.
Maybe part of this is that this is describing what rationalists aim to be as if all rationalists always hit that mark.
Rationalists and the rationalist community often do suffer from the same issues other people and communities do. This was in fact one of the really valuable things Eliezer’s posts pointed out (e.g., being wary of trending towards cult-hood).
Again, these are just my perceptions. But FWIW, I do feel these things quite strongly.
Here are a couple of much less important issues:
I don’t think I’d characterise the Sequences as “mostly like Kahneman, but more engaging, and I guess with a bit of AI etc.” From memory, a quite substantial chunk of the sequences—and quite a substantial chunk of their value—had to do with things other than cognitive biases, e.g. what goals one should form, why, how to act on them, etc. Maybe this is partly a matter of instrumental rather than just epistemic rationality.
Relatedly, I think this page presents a misleading or overly narrow picture of what’s distinctive (and good!) about rationalist approaches to forming beliefs and choosing decisions when it says “There are over a hundred cognitive biases that humans are affected by that rationalists aim to avoid. Imagine you added over one hundred improvements to your way of thinking.”
“Kahneman is notoriously dry” feels like an odd thing to say. Maybe he is, but I’ve never actually heard anyone say this, and I’ve read one of his books and some of his papers and watched one of his talks, and found them all probably somewhat more engaging than similar things from the average scientist. (Though maybe this was more the ideas themselves, rather than the presentation.)
(I didn’t read “Website Participation Intro or ‘Why am I being downvoted?’”, because it was unfortunately already clear that I wouldn’t want to link to this page when aiming to introduce people to LessWrong and encourage them to read, comment, and/or post there.)
Hey, sorry that you came across this instead of the current welcome/about page. I agree with much of your feedback here, glad the Welcome/About page does meet the need.
I added a note to this page saying it was written in 2015 (by one particular user, as you’ll see in the history). So we’ve got it for historical reasons, but I also wouldn’t use it as an intro.
(Update: I just saw the post Welcome to LessWrong!, and I think that that serves my needs well.)