In general, imagine that you have a website about “X” (whether X is rationality or StarCraft; the mechanism is the same). Quite likely, the distribution of people who visit the website (let’s assume Less Wrong at the height of its glory) will look something like this:
10 people who are quite obsessed with “X” (people who dramatically changed their lives after doing some strategic thinking; or people who compete successfully in StarCraft tournaments).
100 people who are moderately interested in “X” (people who read parts of the Sequences and perhaps changed a habit or two; or people who play StarCraft with their friends once in a while).
1000 people who are interested in “X” merely as a topic of conversation (people who read Dan Ariely and Malcolm Gladwell, and mostly read Less Wrong to find cool things they could mention in a debate on similar topics; people who sometimes watch a StarCraft video on YouTube, but haven’t actually played the game in months).
Now suppose you are doing a survey about whether the readers of the website somehow differ from the general population. I would expect that those 10 obsessed ones do, but those 1000 recreational readers don’t. If you put them all in the same category, the obsessed ones make up only about 1% of it, so whatever their special traits are, they will disappear in the aggregate.
For example (with completely made-up numbers), assume an average person has a 1% probability of becoming a vegetarian, the 1000 recreational LW readers also have a 1% probability, the 100 moderate readers have a 2% probability, and the 10 hardcore ones have a 20% probability (which would be a huge difference from the general population). Add them all together and you have 1110 people, of whom 0.01 × 1000 + 0.02 × 100 + 0.2 × 10 = 14 are expected to be vegetarians; that is about 1.26% of the LW readers, almost the same as the 1% in the general population.
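A few lines of Python make the dilution explicit (a minimal sketch using the same made-up group sizes and rates as above, not real survey data):

```python
# Each tuple: (label, group size, probability of being vegetarian).
# The numbers are the illustrative assumptions from the example above.
groups = [
    ("obsessed",      10,   0.20),  # hardcore readers
    ("moderate",      100,  0.02),  # moderately interested readers
    ("recreational",  1000, 0.01),  # same rate as the general population
]

total_readers = sum(n for _, n, _ in groups)
expected_vegetarians = sum(n * p for _, n, p in groups)

print(f"Total readers: {total_readers}")                  # 1110
print(f"Expected vegetarians: {expected_vegetarians}")    # 14.0
print(f"Blended rate: {expected_vegetarians / total_readers:.2%}")  # 1.26%
```

The blended rate barely budges from the 1% baseline because the hardcore group, despite its 20× elevated rate, is outnumbered 100:1 by readers indistinguishable from the general population.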
This is further complicated by the fact that it is easy to select professional StarCraft players (e.g. by asking whether they have participated in a competition, and what their ranking is), but much harder to tell who is a “hardcore rationalist”. Just spending a lot of time debating on LW (which pretty much guarantees high karma), or having read the whole Sequences, doesn’t necessarily mean anything. But this now starts to feel like arguing about “true Scotsmen”. Also, there are various status reasons why people may or may not want to identify as “rationalists”.
Just because people know something is the right thing to do doesn’t mean they will automatically start doing it!
Really? Why not though?
That’s kinda one of the central points of this website. Humans are not automatically strategic, because evolution merely made us execute adaptations, some of which were designed to impress other people rather than to actually change things.
People are stupid, including the smartest ones. Including you and me. Research this thoroughly and cry in despair… then realize you have something to protect, stand up and become stronger. (If these links are new for you, you may want to read the LW Sequences.)
Just look at yourself: are you doing literally the best thing you could do (with the resources you have)? If not, how large is the gap between what you are actually doing and literally the best thing you could do? For myself, the answer is quite depressing. Considering that, why should I expect other people to do better?
In my bubble of local
I guess I just missed it.
Statistically, you are quite likely to be in a different part of the planet, so it’s quite easy to miss my local group. ;) Maybe finding the LW meetup nearest to you could help you find someone like that. (But even within a meetup I would expect that only a few people really try to improve their reasoning, and most are there mainly for social reasons. That’s okay, as long as you can identify the hardcore ones.)
These responses are great compared to the usual yelling match … anywhere else.
Oh, I remember this feeling when I found LW!
Thank you for such a clear response and the additional info! :) I have read most of the Sequences, but some of those links are new to me.