Thank you for the detailed reply. I’m not going to reply point by point because you made a lot of points, but also because I don’t disagree with a lot of it. I do want to offer a couple of intuitions that run counter to your pessimism.
While you’re right that we shouldn’t expect Rationalists to be 10x better at starting companies because of efficient markets, the same is not true of things that contribute to personal happiness. For example: how many people have a strong incentive to help you build fulfilling romantic relationships? Not the government, not capitalism, not most of your family or friends, often not even your potential partners. Even dating apps make money when you *don’t* successfully seduce your soulmate. But Rationality can be a huge help: learning that your emotions are information, learning about biases and intuitions, learning about communication styles, learning to set 5-minute timers to make plans — all of those can 10x your romantic life.
Going back to efficient markets, I get the sense that a lot of things out there are designed by the 1% most intelligent and ruthless people to take advantage of the 95% and their psychological biases. Outrage media, predatory finance, conspicuous brand consumption and other expensive status ladders, etc. Rationality doesn’t help me design a better YouTube algorithm or finance scam, but at least it allows me to escape the 95% and keeps me away from outrage and in index funds.
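To make the index-fund half of that concrete, here is a toy calculation (every number is a hypothetical assumption, not data): even an actively managed fund that merely matches the market before fees falls far behind a cheap index fund over a few decades, purely from fee drag.

```python
# Illustrative arithmetic only: the long-run cost of the traps an
# index-fund default avoids. Assumed numbers: 7% nominal market return,
# 0.04% index expense ratio vs. a 1% active-management fee, 30 years.

def grow(principal, annual_return, annual_fee, years):
    """Compound a lump sum, deducting a percentage fee each year."""
    for _ in range(years):
        principal *= (1 + annual_return) * (1 - annual_fee)
    return principal

start = 10_000                             # initial investment, dollars
market = 0.07                              # assumed market return
index = grow(start, market, 0.0004, 30)    # low-cost index fund
active = grow(start, market, 0.01, 30)     # high-fee fund, no outperformance

print(f"index:    ${index:,.0f}")
print(f"active:   ${active:,.0f}")
print(f"fee drag: {1 - active / index:.0%}")
```

The point is not the exact figures but the shape of the trade: a one-percentage-point fee, compounded for thirty years, eats roughly a quarter of the final balance.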
Finally, I do believe that the world is getting weirder faster, and that thousands of years of human tradition are becoming obsolete at an accelerating pace. We are moving ever further from our “design specs”. In this weirding world, I already hit the jackpot with Bitcoin and polyamory, two things that couldn’t really have existed successfully 100 years ago. Rationality guided me to both. You hit the jackpot with blogging — can you imagine your great-granduncle telling you that you’d become a famous intellectual by writing about cactus people and armchair sociology for free? And we’re both still very young.
For any particular achievement, like basketball or making your first million, there are more specialized practices that will get you to your goal faster than Rationality. But for taking advantage of unknown unknowns, the only two things I know of that work are Rationality and making friends.
Thanks, all good points.
I think the efficient-market argument doesn’t just suggest we can’t do much better at starting companies. It also means we can’t do much better at providing self-help, which is a service that can make people lots of money and status if they do it well.
I’m not sure if you’re using index fund investing as an example of rationalist self-help, or just as a metaphor for it. If you’re using it as an example, I worry that your standards are so low that almost any good advice could count as rationalist self-help. I think if you’re from a community where you didn’t get a lot of good advice, being part of the rationalist community can be really helpful in exposing you to it (sort of like the theory where college makes you successful because it inducts you into the upper-middle class). I think I got most of my “invest in index funds” level good advice before entering the rationalist community, so I didn’t count that.
Being part of the rationalist community has definitely improved my life, partly through giving me better friends and partly through giving me access to good ideas of the “invest in index funds” level. I hadn’t counted that as part of our discussion, but if I do, then I agree it is great. My archetypal idea of “rationalist self-help” is sitting around at a CFAR workshop trying very hard to examine your mental blocks. I’m not sure if we agree on that or if I’m caricaturing your position.
I’m not up for any gigantic time commitment, but if you want to propose some kind of rationalist self-help exercise that I should try (on the order of 10 minutes/day for a few weeks) to see if I change my mind about it, I’m up for that, though I would also believe you if you said such a halfhearted commitment wouldn’t be a good test.
I have several friends in New York who are a match for my Rationalist friends in age, class, intelligence, etc., and who:

- Pick S&P 500 stocks based on CNBC and blogs because their intuition tells them they’ve beaten the market (but they don’t check or track it, they just remember the winners).
- Stay in jobs they hate because they don’t have a robust decision process for making such a switch (I used goal factoring, Yoda-timer job research, and decision matrices to decide where to work).
- Go so back-asswards about dating that it hurts to watch (because they can’t think about it systematically).
- Retweet Trump with comment.
- Throw the most boring parties.
- Spend thousands of dollars on therapists but would never do a half-hour debugging session with a friend because “that would be weird”.
- In general, live mostly within “social reality”, where the only question is “is this weird/acceptable” and never “is this true/false”.
Now perhaps Rationalist self-improvement can’t help them, but if you’re reading LessWrong you may be someone who can snap out of social reality long enough for Rationality to change your life significantly.
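As a concrete illustration of the kind of explicit process mentioned in the job-switch example, here is a minimal weighted decision matrix. Every criterion, weight, and score below is invented for illustration; the method itself is just a weighted sum per option.

```python
# A toy decision matrix for a job switch. Criteria are weighted by
# importance (weights sum to 1), each option is scored 0-10 on each
# criterion, and options are ranked by their weighted totals.

weights = {"salary": 0.3, "growth": 0.25, "autonomy": 0.25, "commute": 0.2}

options = {
    "stay at current job": {"salary": 6, "growth": 3, "autonomy": 4, "commute": 8},
    "startup offer":       {"salary": 5, "growth": 9, "autonomy": 8, "commute": 5},
    "big-co offer":        {"salary": 9, "growth": 5, "autonomy": 4, "commute": 4},
}

def score(scores):
    # Weighted sum of the 0-10 scores; higher is better.
    return sum(weights[c] * s for c, s in scores.items())

ranked = sorted(options, key=lambda o: score(options[o]), reverse=True)
for o in ranked:
    print(f"{o}: {score(options[o]):.2f}")
```

The numbers don’t matter; what matters is that writing the weights down drags the trade-offs out of intuition and into the open, where they can be argued with.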
> if you want to propose some kind of rationalist self-help exercise that I should try
Different strokes for different folks. You can go through alkjash’s Hammertime Sequence and pick one, although even there the one that he rates lowest (goal factoring) is the one that was the most influential in my own life. You must be friends with CFAR instructors/mentors who know your personality and pressing issues better than I do and can recommend and teach a useful exercise.
Agreed. I see a major problem with any argument implying that, because the advice exists elsewhere and wasn’t invented by rationality techniques, a meta-heuristic for aggregating trustworthy sources isn’t hugely valuable.
> In general, live mostly within “social reality” where the only question is “is this weird/acceptable” and never “is this true/false”.

It seems to me like people who primarily think in terms of weird/acceptable never join the rationalist community in the first place. Or do you believe that our community has taught people who used to think in those terms to think otherwise?
As I said, someone who is 100% in thrall to social reality will probably not be reading this. But once you peek outside the bubble there is still a long way to enlightenment: first learning how signaling, social roles, tribal impulses, etc. shape your behavior so you can avoid their worst effects, then learning to shape the rules of social reality to suit your own goals. Our community is very helpful for getting the first part right; it certainly has been for me. And hopefully we can continue fruitfully exploring the second part too.
What is the error that you’re implying here?
Could be a “don’t feed the troll” error.
> It also means we can’t do much better at providing self-help, which is a service that can make people lots of money and status if they do it well.

Maybe the incentives are all wrong here, and the most profitable form of “self-help” is one that doesn’t provide long-term improvement, so that customers return for more and more books and seminars.
In that case, we can easily do better—better for the “customers”, but less profitable for the “gurus”.
> through giving me access to good ideas of the “invest in index funds” level

For me the point is about getting *consistently* good ideas: reliable ideas in domains where applying the scientific method is too hard. It is not so much about self-improvement as about community improvement in the face of a more and more connected (and thus weirder) world. Rationality is epistemology for the internet era.