Yes. I’ve been a semi-regular reader of OCB for about a year. I think it’s an interesting blog. But have I learned anything useful from it? Has it made any practical difference in the choices I make, either day-to-day or long-term? The answer is no. Admittedly, this may be my own fault. But I recall a post, not too long ago, soliciting people’s feedback on “the most important thing you learned from OCB in the past year,” or something of that sort. And while there were lots of people excitedly posting about how much OCB has taught them, the examples they gave were along the lines of “I learned the power of the fundamental attribution error!” or “I learned the importance of continually adjusting my priors!”, with curiously few examples of real differences OCB made in anyone’s practical choices. This raises the question: if tweaking our rationality has no appreciable effect on anything, how can we say we’re really tweaking our rationality at all? Perhaps we’re just swapping in new explanations for fundamentally irrational processes that are far too buried and obscure to be accessible to us.
That said, I think things like the recent posts on akrasia are strong moves in the right direction: intellectually interesting, but with easy-to-grasp real-world implications.
OB has changed people’s practical lives in some major ways. Not all of these are mine personally:
“I donated more money to anti-aging, risk reduction, etc.”
“I signed up for cryonics.”
“I wear a seatbelt in a taxi even when no one else does.”
“I stopped going to church but started hanging out socially with aspiring rationalists.”
“I decided rationality works and started writing down my goals and pathways to them.”
“I decided it’s important for me to think carefully about what my ultimate values are.”
Yes! Or even further, “I am now focusing my life on risk reduction and have significantly reduced akrasia in all facets of my life.”
This sounds an awful lot like one of the examples I gave above. OK, so you’re focused on “risk reduction” and “reducing akrasia.” So what does that mean? You’ve decided to buckle up, wear sunscreen, and not be so lazy? Can’t I get that from Reader’s Digest or my mom?
Telling people to buckle up is nothing special. Successfully persuading people to buckle up—helping people understand and fix the internal sources of error that stood in the way of doing so in the past—will save a life if you can do it enough.
The problem is that even though learning to identify and avoid certain biases will affect your behavior, there’s no easy way to articulate those effects. The benefit comes mainly from things not done, not from things done.
For instance, upon hearing a fallacious argument, a hearer who recognizes its fallacies will not believe it, where previously he would have. Or suppose he is thinking something through on his own: previously a bias would have caused him to think a certain thought, which would have led to a certain action. Now, having learned to identify the bias, he doesn’t even generate that thought but instead another, which leads him to take a different action. While these things certainly have an effect, they’re too subtle to identify. You’re not going to know the thoughts you avoided (even if you can try to guess), only the ones you’ve actually thought.
I feel this has largely been the case for me. My behavior has certainly been affected because I now think more clearly; of that I’m pretty certain. But can I give any concrete examples? I’m afraid not. The effect operates at too subtle a level for me to observe properly. But that doesn’t mean there aren’t any concrete examples, only that I can’t verbalize them.
This debate has already played out in attacking and defending Pragmatism.
A lot of the rubrics for judging whether rationalism wins, or whether rationalism is an end in itself, involve assigning meaning and value at a very abstract level. Eliezer’s posts outline a reductionist, materialist standpoint with some strong beliefs about following the links of causality. Rationalism follows from that, but rationalism isn’t going to prove itself true.
Deciding that rationalism is the best answer for your axiomatic belief system requires taking a metaphysical stand; I think that if you are looking for a definite metaphysical reason to practice rationalism, then you are interested in something that the practice of rationalism is not going to help much with.
“Rationalism” as compared to what? Mysticism? Random decision-making? Of course rational behavior is going to be by far the best choice for achieving one’s particular ends. I wasn’t questioning the entire concept of rationalism, which clearly has been the driving force behind human progress for all of history. I was questioning how much making “tweaks”—the kind discussed here and on OCB—can do for us. Or have done for us. Put differently, is perseverating on rationality per se worth my time? Can anyone show that paying special attention to rationality has measurable results, controlling for IQ?
Mysticism and random decision-making are both acceptable and highly successful methods of making decisions; most of human history has relied on those two… we still rely on them. If you are a consequentialist, you can ignore the process and just rate the outcome; who cares why nice hair is correlated with success? It just is! Why does democracy work?
What makes rationalism worth the time is probably your regard for the process itself or for its outcomes. If it’s the outcomes, then you might want to consider other options; following your biases and desires almost blindly works out pretty well for most people.