I think the efficient markets argument doesn’t just suggest we can’t do much better at starting companies. It also means we can’t do much better at providing self-help, which is a service that can make people lots of money and status if they do it well.
I’m not sure if you’re using index fund investing as an example of rationalist self-help, or just as a metaphor for it. If you’re using it as an example, I worry that your standards are so low that almost any good advice could count as rationalist self-help. I think if you’re from a community where you didn’t get a lot of good advice, being part of the rationalist community can be really helpful in exposing you to it (sort of like the theory where college makes you successful because it inducts you into the upper-middle class). I think I got most of my “invest in index funds” level good advice before entering the rationalist community, so I didn’t count that.
Being part of the rationalist community has definitely improved my life, partly through giving me better friends and partly through giving me access to good ideas of the “invest in index funds” level. I hadn’t counted that as part of our discussion, but if I do, then I agree it is great. My archetypal idea of “rationalist self-help” is sitting around at a CFAR workshop trying very hard to examine your mental blocks. I’m not sure if we agree on that or if I’m caricaturing your position.
I’m not up for any gigantic time commitment, but if you want to propose some kind of rationalist self-help exercise that I should try (on the order of 10 minutes/day for a few weeks) to see if I change my mind about it, I’m up for that, though I would also believe you if you said such a halfhearted commitment wouldn’t be a good test.
I have several friends in New York who are a match to my Rationalist friends in age, class, intelligence etc. and who:
Pick S&P 500 stocks based on CNBC and blogs because their intuition tells them they’ve beat the market (but they don’t check or track it, just remember the winners).
Stay in jobs they hate because they don’t have a robust decision process for making such a switch (I used goal factoring, Yoda timer job research, and decision matrices to decide where to work).
Go so ass-backwards about dating that it hurts to watch (because they can’t think about it systematically).
Retweet Trump with comment.
Throw the most boring parties.
Spend thousands of dollars on therapists but would never do a half-hour debugging session with a friend because “that would be weird”.
In general, live mostly within “social reality” where the only question is “is this weird/acceptable” and never “is this true/false”.
Now perhaps Rationalist self-improvement can’t help them, but if you’re reading LessWrong you may be someone who can snap out of social reality long enough for Rationality to change your life significantly.
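Since “decision matrices” came up above as one of the tools for deciding where to work, here is a minimal sketch of what that looks like in code. The criteria, weights, and job options are made-up illustrations, not anything from the actual comment:

```python
def decision_matrix(options, weights):
    """Rank options by weighted score (higher is better).

    options: {name: {criterion: score}}, weights: {criterion: weight}.
    """
    def total(scores):
        return sum(weights[c] * s for c, s in scores.items())

    return sorted(
        ((name, total(scores)) for name, scores in options.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical example: two job offers scored 1-10 on each criterion.
weights = {"salary": 0.3, "growth": 0.4, "commute": 0.1, "culture": 0.2}
options = {
    "BigCo":   {"salary": 9, "growth": 4, "commute": 3, "culture": 5},
    "Startup": {"salary": 6, "growth": 9, "commute": 7, "culture": 8},
}

for name, score in decision_matrix(options, weights):
    print(f"{name}: {score:.1f}")  # Startup: 7.7, BigCo: 5.6
```

The point of the exercise is less the arithmetic than being forced to write down your criteria and weights explicitly, where you can argue with them.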
> if you want to propose some kind of rationalist self-help exercise that I should try
Different strokes for different folks. You can go through alkjash’s Hammertime Sequence and pick one, although even there the one that he rates lowest (goal factoring) is the one that was the most influential in my own life. You must be friends with CFAR instructors/mentors who know your personality and pressing issues better than I do and can recommend and teach a useful exercise.
Agreed, I see a major problem with an argument that seems to imply that since advice exists elsewhere/wasn’t invented by rationality techniques, a meta-heuristic for aggregating trustworthy sources isn’t hugely valuable.
In general, live mostly within “social reality” where the only question is “is this weird/acceptable” and never “is this true/false”.
It seems to me like people who primarily think in terms of weird/acceptable never join the rationalist community in the first place. Or do you believe that our community has taught people who used to think in those terms to think otherwise?
As I said, someone who is 100% in thrall to social reality will probably not be reading this. But once you peek outside the bubble there is still a long way to enlightenment: first learning how signaling, social roles, tribal impulses etc. shape your behavior so you can avoid their worst effects, then learning to shape the rules of social reality to suit your own goals. Our community is very helpful for getting the first part right, it certainly has been for me. And hopefully we can continue fruitfully exploring the second part too.
It also means we can’t do much better at providing self-help, which is a service that can make people lots of money and status if they do it well.
Maybe the incentives are all wrong here, and the most profitable form of “self-help” is one that doesn’t provide long term improvement, so that customers return for more and more books and seminars.
In that case, we can easily do better—better for the “customers”, but less profitable for the “gurus”.
through giving me access to good ideas of the “invest in index funds” level
For me the point is about getting *consistently* good ideas: reliable ideas in domains where applying the scientific method is too hard. It is less about self-improvement than about community improvement in the face of an increasingly connected (and thus weird) world. Rationality is epistemology for the internet era.
Thanks, all good points.
What is the error that you’re implying here?
Could be a don’t feed the troll error.