How about: “What is rationality?” and “Will rationality actually help you if you’re not trying to design an AI?”
Don’t get me wrong. I really like LessWrong. I’ve been fairly involved in the Seattle rationality community. Yet, all the same, I can’t help but think that actual rationality hasn’t really helped me all that much in my everyday life. I can point to very few things where I’ve used a Rationality Technique to make a decision, and none of those decisions were especially high-impact.
In my life, rationality has been a hobby. If I weren’t reading the Sequences, I’d be arguing about geopolitics or playing board games. So, to me, the most open question in rationality is, “Why should one bother? What special claim does rationality have over my time and attention that, say, Starcraft does not?”
I keep seeing responses like this, and I don’t understand them at all. Rationality is inseparably intertwined with everything I do. Long before I found LessWrong I strove to be rational because I’m an agent with goals and rationality is a toolset for achieving those goals. Yes, I got a lot of it wrong, but I also got a lot of it right—reading the Sequences felt less like discovering some revelatory holy book and more like refactoring my beliefs.
Afterwards, rationality took less than a year to completely change my life. I don’t study rationality because it’s a hobby; I study it because the level I was at before was insufficient and I can’t afford to make another massive mistake. Sometimes you can’t solve a problem on intuition alone, and that’s when you need the big guns—those “rationality techniques”.
And what game have those “big guns” allowed you to bag that the lesser guns of “ordinary common sense” would not have?
There are lots of people who do lots of amazing things without having once read Kahneman, without having once encountered any literature about cognitive biases. If we are proposing that rationality is some kind of special edge that will allow us to accomplish things that other people cannot accomplish, we had better come up with some examples, hadn’t we?
I don’t agree with your dichotomy between rationality techniques and common sense. Common sense is just layman-speak for System 1 (S1), and S1 can be trained to think rationally. A lot of rationality for me is ingrained into S1 and isn’t something I think about anymore. For example, S1’s response to a math problem is to invoke System 2 (S2) rather than try to solve it itself. Why? Because S1 has learned that it cannot solve math problems, even seemingly simple ones. Lightness, precision, and perfectionism are mostly S1 jobs for me as well.
And I’m also not claiming rationality is a prerequisite for victory. Rather, I see it as a power amplifier. If you don’t have any rationality whatsoever, you’re flailing around blind. With a little rationality (maybe just stuff you’ve learned by osmosis) you can chart a rough course, and if you’re already powerful enough that might be all it takes.
But those are relatively minor nitpicks. Let’s talk about how, specifically, rationality has changed my life.
The major one for me is discovering I’m trans. Rationality got me to seriously think about the problem (by telling me that emotions weren’t evil and crazy), and then told me how to decide whether I actually was trans or not (a Bayesian Fermi estimate). It takes many people months or years to figure this out, often with the help of therapists. I did it in a week, alone, and I came out of it without the doubts and worries that plague most trans people.
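(For anyone unfamiliar with the jargon: by “Bayesian Fermi estimate” I just mean combining a rough base-rate prior with order-of-magnitude likelihood ratios for each piece of evidence, in odds form. Here’s a minimal sketch of that kind of update; every number in it is a made-up placeholder for illustration, not anything I actually used.)

```python
# Minimal sketch of an odds-form Bayesian update with rough,
# order-of-magnitude numbers. All values are illustrative placeholders.

prior_probability = 0.005                # guessed base rate for the hypothesis
prior_odds = prior_probability / (1 - prior_probability)

# Rough likelihood ratios P(evidence | hypothesis) / P(evidence | not hypothesis)
# for a few roughly independent observations.
likelihood_ratios = [10, 5, 3]

posterior_odds = prior_odds
for lr in likelihood_ratios:
    posterior_odds *= lr

posterior_probability = posterior_odds / (1 + posterior_odds)
print(f"posterior probability is roughly {posterior_probability:.2f}")  # about 0.43 here
```

The point isn’t the exact numbers; it’s that even crude likelihood ratios can move you from “probably not” to “seriously worth investigating” far faster than waiting around for certainty.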
My pre-Sequences grasp of rationality was extremely limited, but it was still enough to let me self-modify out of the pit of borderline-suicidal depression. I did that alone too, without any therapists or friends (in fact, zero people even knew I was depressed). At the time I figured anyone could do it and it was just a matter of willpower… or something. I didn’t pursue the question, because back then I hadn’t heard the phrase “notice your confusion”. Later, I met someone else who was depressed. I dug a little deeper, and it turns out all the people who say you need a therapist and social support are right after all. Most people really, really struggle to escape depression. I’m not sure exactly why, but I suspect that lacking luminosity (awareness of your own mental states) would be an insurmountable barrier. It was hard enough even when I could recognize that my thoughts were corrupted and work to minimize them.
Less flashy but just as important, the overpowering desire to win at all costs has let me change from an impulsive social cripple into someone who has no trouble fitting into most social circles. Admittedly, that wasn’t something I got from rationality; I started out with the desire to win and gradually tried techniques, discarding what didn’t work and keeping what did. But that seems like a fundamentally rationalist mindset to me. Cold reading, the social web, signalling, inadequate equilibria, and Schelling points are likely to greatly enhance my skills in the future.
I don’t think the average person’s common sense is up to any of those tasks. Plenty of people don’t discover they’re trans until a decade or more later than I did, and I think it’s highly significant that I figured it out almost as soon as I started studying rationality. Most people can’t solve depression on their own. And I’ve met exactly one person who makes the leap from “some people are really charismatic” to “maybe I can learn how to be charismatic” to “I should learn how to be charismatic, because the power scale goes up to literally taking over an entire country”. Normal people simply don’t think like that.
The only way you could convince me to abandon rationality is to either give me something that would achieve my goals better or convince me that I don’t need to worry about achieving my goals. For example, if we lived in the glorious transhumanist future where FAI takes care of everything complicated, I wouldn’t feel the need to personally become stronger. But as it is, if I close my eyes and sleep, I won’t wake up until I’ve sleepwalked right off a cliff.
Furthermore, I know it’s possible to improve. Why? Because of EY. Reading his writing is like talking to a thousand-year-old vampire. He’s simply better than me at everything (minus boring nitpicky stuff like [insert obscure skill]). Child EY feels pretty similar to adult me in a lot of ways, which makes me think that what’s different between us isn’t so much raw IQ or talent as all the self-modification he has layered on top.
But I have to admit that I’m puzzled by the lack of rationalist stars. It is written that it takes a lot of rationality to get anywhere, but surely, out of the thousands(?) of us, at least a few would have mastered enough. Yes, there’s EY and Scott and… actually, that’s all I’m aware of, and I wouldn’t know of either of them if I weren’t already a rationalist. That feels like a notice-your-confusion moment, and yet I’m not sure how to reconcile that observation with the way rationality seems intertwined with my own progression. If I had only ever had an average helping of rationality, I’d be a depressed, self-hating incel. Or maybe I’d just be dead. I thought about suicide a lot, and if I hadn’t had a strong belief that I could improve, grounded in past improvements, I might have given up entirely.
I’d say I’m just an unusually talented rationalist, but my rationality-fueled common sense says that’s extremely unlikely and suspiciously egocentric. So I’ll just say that I’m not sure what to think anymore.
Rationality might not have a lot of practical value, precisely because the things that would have such value are already well-engineered by evolution and culture. But it still advances your understanding of the world and yourself a lot, and I personally count that among my few terminal values.
Incidentally, rationality might imply that Starcraft is a kind of trojan that exploits our reward circuits, and if we want to maximize our values (as opposed to our pleasure), we are well-advised to take a stance against this exploitation.
But it still advances your understanding of the world and yourself a lot
I’m not sure that it does. I certainly haven’t seen any evidence of LessWrong-style rationality being a better means of achieving understanding of the world than, say, just getting a bunch of textbooks and journal articles on whatever you’re interested in and doing some old-fashioned studying.
Incidentally, rationality might imply that Starcraft is a kind of trojan that exploits our reward circuits, and if we want to maximize our values (as opposed to our pleasure), we are well-advised to take a stance against this exploitation.
Alternatively, we might say that rationality is a toolbox that makes no judgements about what you apply those tools to. If you apply the tools of rationality to become a better Starcraft player, then good for you! You have used rationality to improve your skills and work towards your goal more efficiently. Certainly, I’ve seen much stronger standards of epistemics in the Starcraft and video game speedrunning communities than in many other places, LessWrong included.