Making Rationality General-Interest
Introduction
Less Wrong currently represents a tiny, tiny, tiny segment of the population. In its current form, it might only appeal to a tiny, tiny segment of the population. Basically, the people who have a strong need for cognition, who are INTx on the Myers-Briggs (65% of us as per 2012 survey data), etc.
Raising the sanity waterline seems like a generally good idea. Smart people who believe stupid things, and go on to invest resources in stupid ways because of it, are frustrating. Trying to learn rationality skills in my 20s, when a bunch of thought patterns are already overlearned, is even more frustrating.
I have an intuition that a better future would be one where the concept of rationality (maybe called something different, but the same idea) is normal. Where it’s as obvious as the idea that you shouldn’t spend more money than you earn, or that you should live a healthy lifestyle, etc. The point isn’t that everyone currently lives debt-free, eats decently well and exercises; that isn’t the case; but these are normal things to do if you’re a minimally proactive person who cares a bit about your future. No one has ever told me that doing taekwondo to stay fit is weird and culty, or that keeping a budget will make me unhappy because I’m overthinking things.
I think the questions of “whether we should try to do this” and “if so, how do we do it in practice?” are both valuable to discuss, and interesting.
Is making rationality general-interest a good goal?
My intuitions are far from 100% reliable. I can think of a few reasons why this might be a bad idea:
1. A little bit of rationality can be damaging; it might push people in the direction of too much contrarianism, or something else I haven’t thought of. Since introspection is imperfect, knowing a bit about cognitive biases and the mistakes that other people make might actually make people less likely to change their minds: they see other people making those well-known mistakes, but not themselves. Likewise, rationality taught only as a tool or skill, without any underlying philosophy of why you should want to believe true things, might cause problems similar to martial arts skills taught without the traditional, often non-violent philosophies: it could result in people abusing the skill to win fights/debates, making the larger community worse off overall. (Credit to Yan Zhang for the martial arts metaphor.)
2. Making the concepts general-interest, or just growing too fast, might involve watering them down or changing them in some way that loses the value of the current LW microcommunity. This could be worse for the people who currently enjoy LW even if it isn’t worse overall. I don’t know how easy this would be to avoid, or whether it would be a price worth paying.
3. It turns out that rationalists don’t actually win, and x-rationality, as Yvain terms it, just isn’t that amazing over and above already being proactive and doing stuff like keeping a budget. Yeah, you can say stuff like “the definition of rationality is that it helps you win”, but if, in real life, the people who deliberately try to increase their rationality end up worse off overall by their own standards (or do equally well, but with less time left over for other fun pursuits) than the people who aim for their life goals directly, I want to know that.
4. Making rationality general-interest is a good idea, but not the best thing to be spending time and energy on right now because of Mysterious Reasons X, Y, Z. Maybe I only think it is because of my personal bias towards liking community stuff (and wishing all of my friends were also friends with each other and liked the same activities, which would simplify my social life, but probably shouldn’t happen for good reasons).
Obviously, if any of these are the case, I want to know about it. I also want to know about it if there are other reasons, off my radar, why this is a terrible idea.
What has to change for this to happen?
I don’t really know, or I would be doing those things already (maybe, akrasia allowing). I have some ideas, though.
1. The jargon thing. I’m currently trying to compile a list of LW/CFAR jargon as a project for CFAR, and there are lots of terms I don’t know; there are also terms I’ve realized in retrospect that I was using incorrectly all along. This presents a large initial hurdle for someone interested in learning about rationality via the LW route, and it might also contribute to the looking-like-a-cult thing.
2. The gender ratio thing. This has been discussed before, it’s a controversial thing to discuss, and I don’t know whether arguing about it in the comments will produce any solutions. It seems pretty clear that if you want to appeal to the whole population, and a group that represents 50% of the general population makes up only 10% of your participants (also per the 2012 survey data), there’s going to be a problem somewhere down the road.
My data point: as a female on LW, I haven’t experienced any discrimination, and I’m a bit baffled as to why the gender ratio is so skewed in the first place. Then again, I’ve already been through the filter of not caring if I’m the only girl at a meetup group. And I do hang out in female-dominated groups (e.g. the entire field of nursing) and fit in okay, but I’m probably not a typical example to generalize from.
3. LW currently appeals to intelligent people, or at least people who self-identify as intelligent; according to the 2012 survey data, the self-reported IQ median is 138. This isn’t surprising, and it isn’t a problem until you want to appeal to more than 1% of the population (see the quick sanity check below this list). But intelligence and rationality are, in theory, orthogonal, or at least not the same thing. If I suffered a brain injury that reduced my IQ significantly but didn’t otherwise affect my likes and dislikes, I expect I would still be interested in improving my rationality and think it was important, perhaps even more so, but I also think I would find it frustrating. And I might feel horribly out of place.
4. Rationality in general has a bad rap; specifically, the Spock thing. And this isn’t just affecting whether or not people think Less Wrong the site is weird; it’s affecting whether they want to think about their own decision-making.
This is only what I can think of in 5 minutes...
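As a rough sanity check on the “more than 1% of the population” figure above: if we assume IQ follows the standard model (normally distributed, mean 100, standard deviation 15) and take the self-reported median of 138 at face value, the typical survey respondent sits in roughly the top half-percent of the distribution. A minimal sketch of that calculation:

```python
# Rough sanity check: under the usual assumption that IQ is normally
# distributed with mean 100 and standard deviation 15, what fraction of
# the population scores 138 or higher?
from scipy.stats import norm

fraction_at_or_above = norm.sf(138, loc=100, scale=15)  # survival function, P(IQ >= 138)
print(f"{fraction_at_or_above:.2%}")  # roughly 0.6% of the population
```

Self-reported IQs are probably inflated, so the real number is fuzzier, but the point stands: the current audience is drawn from a very thin slice of the distribution.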
What’s already happening?
Meetup groups are happening. CFAR is happening. And there are groups out there practicing skills similar or related to rationality, whether or not they call it the same thing.
Conclusion
Rationality, Less Wrong and CFAR have, gradually over the last 2-3 years, become a big part of my life. It’s been fun, and I think it’s made me stronger, and I would prefer a world where as many other people as possible have that. I’d like to know if people think that’s a) a good idea, b) feasible, and c) how to do it practically.