There is usually a distribution of a few “hardcore” members, and many lukewarm ones. In a statistic that includes all of them, the behavior of the hardcore members can easily disappear.
Could you explain this more in depth; I’m failing to grasp this completely. I apologize.
if we ignore the animal suffering
Why would we do that?
Or maybe it’s just that food doesn’t get as high priority as e.g. education, making money, or exercise, so people focus their attention on the other things.
I guess, but you can usually focus on multiple things at once, and most people have certain causes they subscribe to.
Or, most obviously—just because people know something is the right thing to do, it doesn’t mean they will automatically start doing it! Not even if they identify as “rationalists”.
Really? Why not though? All humans, excluding sociopaths, have empathy. I’ll admit I see this a bit though.
In my bubble of local hardcore aspiring rationalists, vegetarianism or veganism is almost the norm.
Oh, hmm I guess I just missed it.
Thank you for your response and your hypotheses! These responses are great compared to the usual yelling match … anywhere else.
In general, imagine that you have a website about “X” (whether X is rationality or StarCraft; the mechanism is the same). Quite likely, a distribution of people who visit the website (let’s assume the days of highest glory of Less Wrong) will be something like this:
10 people who are quite obsessed with “X” (people who dramatically changed their lives after doing some strategic thinking; or people who participate successfully in StarCraft competitions).
100 people who are moderately interested in “X” (people who read some parts of the Sequences and perhaps changed a habit or two; or people who once in a while play StarCraft with their friends).
1000 people who are merely interested in “X” as a topic of conversation (people who read Dan Ariely and Malcolm Gladwell, and mostly read Less Wrong to find cool things they could mention in a debate on similar topics; people who sometimes watch a StarCraft video on YouTube, but haven’t actually played it in months).
Now suppose you are doing a survey about whether the readers of the website somehow differ from the general population. I would expect that those 10 obsessed ones do, but those 1000 recreational readers don’t. If you put them all in the same category, the obsessed ones make up only about 1% of it, so whatever their special traits are, they will disappear in the aggregate.
For example (completely made up numbers here), let’s assume that an average person has a 1% probability of becoming a vegetarian, those 1000 recreational LW readers also have a 1% probability, the 100 moderate LW readers have probability 2%, and the hardcore ones have a probability of 20% (that would be a huge difference compared with the average population). Add them all together, you have 1110 people, of whom 0.01 × 1000 + 0.02 × 100 + 0.2 × 10 = 14 vegetarians; that means 1.26% of the LW readers—almost the same as the 1% of the general population.
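The dilution arithmetic above can be checked with a few lines of Python (the group sizes and per-group probabilities are the made-up numbers from the example, not real survey data):

```python
# Each tuple: (number of readers in the group, probability of being vegetarian).
groups = [
    (1000, 0.01),  # recreational readers (same rate as the general population)
    (100, 0.02),   # moderate readers
    (10, 0.20),    # hardcore readers (20x the baseline rate)
]

total_readers = sum(n for n, _ in groups)
expected_vegetarians = sum(n * p for n, p in groups)
overall_rate_pct = 100 * expected_vegetarians / total_readers

print(total_readers)                 # 1110
print(expected_vegetarians)          # 14.0
print(round(overall_rate_pct, 2))    # 1.26 — barely above the 1% baseline
```

The hardcore group’s 20% rate is invisible in the blended figure because the group is outnumbered 100-to-1.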
This is further complicated by the fact that you can more easily select professional StarCraft players (e.g. by asking whether they participated in some competition, and what their ranking is), but it’s more difficult to tell who is a “hardcore rationalist”. Just spending a lot of time debating on LW (which pretty much guarantees high karma), or having read the whole Sequences, doesn’t necessarily mean anything. But this now feels like talking about “true Scotsmen”. Also, there are various status reasons why people may or may not want to identify as “rationalists”.
just because people know something is the right thing to do, it doesn’t mean they will automatically start doing it!
Really? Why not though?
That’s kinda one of the central points of this website. Humans are not automatically strategic, because evolution merely made us execute adaptations, some of which were designed to impress other people rather than to actually change things.
People are stupid, including the smartest ones. Including you and me. Research this thoroughly and cry in despair… then realize you have something to protect, stand up, and become stronger. (If these links are new to you, you may want to read the LW Sequences.)
Just look at yourself—are you doing literally the best thing you could do (with the resources you have)? If not, how large is the difference between what you are actually doing and literally the best thing you could do? For myself, the answer to this is quite depressing. Considering this, why should I expect other people to do better?
In my bubble of local
I guess I just missed it.
Statistically, you are quite likely to be in a different part of the planet, so it’s quite easy to miss my local group. ;) Maybe finding the LW meetup nearest to you could help you find someone like that. (But even within a meetup I would expect that only a few people really try to improve their reasoning, and most are there mostly for social reasons. That’s okay, as long as you can identify the hardcore ones.)
These responses are great compared to the usual yelling match … anywhere else.
Oh, I remember this feeling when I found LW!
Thank you for such a clear response and the additional info! :) I have read most of the sequences but some of those links are new to me.
http://lesswrong.com/lw/2p5/humans_are_not_automatically_strategic/
Welcome!
If my worldview was, “animals are inferior and their suffering is irrelevant”.
Wouldn’t that be an irrational ‘axiom’ to start from, though? Maybe the inferior part works, but you can’t just say their suffering is irrelevant. If you go off the basis that humans matter just because, then that’s a case of special pleading: saying humans are better because they are human. Their suffering may be lesser, but it isn’t irrelevant, because they can suffer.
Why?
Do humans matter? Why do humans matter? I think you might be leaping to a conclusion or a few here.