Well, there aren’t many things that I don’t doubt a little bit. I don’t think this is a bad thing. However, in order to get anything done in life, instead of sitting in my room thinking about how much I don’t know, I have to act on a lot of things that I’m a bit doubtful about.
I thought you doubted it more than a little bit, because you linked to Yvain’s post that says there’s not much evidence. If “a little bit” means, say, 10%, then can you describe the arguments that made you 90% confident?
Yvain said that clarity of mind was one benefit he’d had. I think clarity of mind is awesome and rare, and makes it less likely that people will do stupid things for bad reasons.
I’ve met Yvain and I think he’s fairly awesome. Likewise, the other LW people I’ve met in real life seem disproportionately awesome–they have clarity of mind, yes, but it seems to lead into other things, like doing things differently because you’re actually able to recognize why you were doing non-useful things in the first place. Correlation not causation, of course, and I didn’t know these people 5 years ago, and even if I had, people progress in awesomeness anyway. But still. Correlation = data in a direction, and it’s not the direction of rationality being useless.
In his post Yvain distinguishes between regular rationality, which he thinks a lot of people have, and “x-rationality” that you get from long study of the Sequences’ concepts. I think a lot fewer people have even regular rationality, that it’s a continuum not a divide, and that strategically placed and worded LW concepts could push almost anyone further towards the ‘rationality’ side.
I’ve changed a lot because of my exposure to the rationality community, and in ways that I don’t think I could have attained otherwise. A lot of this is due to clarity of mind–in particular my allowing myself to think thoughts that are embarrassing or otherwise painful. Some of it’s due to specific ideas, like “notice that you’re confused” or “taboo word X”. Some of it’s due to just hanging out with a social group who think differently than my parents. See this post from a year and a half ago, and this more recent post.
If such evidence is enough, then rationality would probably recommend that you spread religion instead of rationality :-) Religious people also often talk about how religion gave them wonderful feelings and improved their lives, and there are actual studies showing religious people are happier and healthier.
I feel that you haven’t mentioned an important factor, which is that LW-rationality sounds very attractive in some sense. If that’s correct, then you’re not alone in this; it took me years to learn to honestly subtract that factor.
I feel that you haven’t mentioned an important factor, which is that LW-rationality sounds very attractive in some sense.
Noted. However, before I subtract that factor, I would like to learn whether LW-style rationality seems so attractive for a good reason: long-term, averaged over many people, does it make a difference? It has for a few people. I don’t think you can conclusively say, yet, whether it’s worthwhile teaching to everyone. In 5-10 years, when CFAR’s longer-term data starts coming in, then I’ll know. In the meantime, trying to spread it to other people provides data, too.
If it turns out it doesn’t help most people, I won’t keep trying to show it to other people, although I’ll probably try to stick with the community. I would still want to keep looking for something else to try to teach the other people who keep doing stupid things. Call me an idealist...
Religion, AFAICT, does not teach clarity of mind. In many cases it teaches people to follow their intuitions and gut feelings because “God is looking out for them.” This sometimes turns out well for the individual, and sometimes badly (which you’d expect; intuitions are valuable data but can be wrong if the heuristics are applied out of context). Overall I think it’s bad for society, and better if people notice that their hunches are in fact hunches and try to fact-check them. (This isn’t always possible; sometimes you have no outside-view data and you have to go with your gut feeling. But rationality would teach that, too.)
And yeah, I’m picking rationality as a thing to try to spread without having looked at all the possible alternatives. I think that’s okay. There are other people in society who are trying to spread other things for similar reasons. If there are 10 people like me, all with different agendas but for the same reasons, and we’re all paying attention to the data of the next 5 years, and it turns out that one of our methods is actually effective, I would consider that a success. I just don’t know which one of the 10 people I am yet. (If I meet one of the others, and they convince me their agenda has a higher chance of success, I would think about it and then probably agree to help them.)
Is it just me, or does your comment sound like a retreat from “we need to spread rationality because it’s a good idea” to “we need to spread rationality to figure out if it’s a good idea”?
If yes, then note that LW has existed for years and has thousands of users. Yvain was among the first contributors to LW and his early posts were already excellent. Many other good contributors, like Wei Dai (invented UDT, independently invented cryptocurrency) or Paul Christiano (IMO participant), were also good before they joined… As Yvain’s post said, it seems hard to find people who benefited a lot from LW-rationality.
I’m not sure we need more information about the usefulness of LW-rationality before we can make a conclusion. We already have a lot of evidence pointing one way: look at all the LWers who didn’t benefit. Besides, what makes you think that a study with more participants and longer duration would give different results? If anything, it’s probably going to be closer to the mean, because LW folks are self-selected, not randomly selected from the population.
As Yvain’s post said, it seems hard to find people who benefited a lot from LW-rationality.
I think LW has at least made me better at handling disagreements with others. For example I’m rather embarrassed when I look back on my early discussions with Nick Szabo on cryptocurrency and other topics, and I think a disagreement I had a few years ago with my business partner was also helped greatly by both of us having followed LW (or maybe it was still OB at that point).
I would say that rationality is worth trying to spread because it may be a good idea, and because it’s something I know about and can think and plan about. Do you know of another community that has a similar level of development to LW (i.e. fairly cohesive but still quite obscure) that I should also investigate? (AFAIK, CFAR is looking for such organizations for new ideas anyway.)
Also, I’m going to update from your comment in the direction of rationality outreach not being the best use of my time.
For a while I satisfied my idea-spreading urges by teaching math to talented kids on a volunteer basis. If you’re very good at something (e.g. swimming), you could try teaching that, it’s a lot of fun.
Or you could spend some effort on figuring out how to measure rationality and check if someone is making progress. That’s much harder though, once you get past the obvious wrong answers like “give them a multiple choice test about rationality”. Eliezer and Anna have written a lot about this problem.
I do teach swimming; I did it for many years as a job, and now I do it for fun (and for free) for the kids of my friends (and several of the CFAR staff when I was in San Francisco). It’s something I’m very good at (I may be better at teaching swimming to others than at swimming myself), and it fulfills an urge, but not the idea-spreading one.
If CFAR is looking for help trying to make a rationality test, I would be happy to help, too...
Well, if the criterion for success is inventing cryptocurrency, I don’t predict that teaching rationality will have that effect on people. It’s more a matter of small usefulness that compounds over time. So understanding Bayes makes it easier to assemble what you know coherently (a toy example is sketched below), learning to install habits helps you remember to use the skill when you’re most likely to need it… etc. That habit of reasoning might save you money, or social capital, or time. And, over the course of your life, it gives you more time and scope to act.
That’s pretty much what it does for me, so far, and it’s been a worthwhile level up. It did make a difference for me to learn and practice in a community (built-in spaced repetition, yay!) rather than just reading. The reading helped, but once I have a tool, it takes practice to remember to use it instead of my old default.
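To illustrate the “understanding Bayes” point above, here’s a minimal worked example; the scenario and numbers are invented purely for illustration. Suppose you think a new habit works for about 30% of the people who try it, and that a week of trying it leaves a positive impression 80% of the time when it works and 20% of the time when it doesn’t. One encouraging week then updates you to

\[
P(\text{works} \mid \text{positive})
= \frac{P(\text{positive} \mid \text{works})\,P(\text{works})}{P(\text{positive} \mid \text{works})\,P(\text{works}) + P(\text{positive} \mid \neg\text{works})\,P(\neg\text{works})}
= \frac{0.8 \times 0.3}{0.8 \times 0.3 + 0.2 \times 0.7} \approx 0.63
\]

So one good week moves you from 30% to roughly 63%–meaningful but not conclusive, which is the kind of small, compounding usefulness described above.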