Basically, the problem is that K&T-style insights about cognitive biases—and, by extension, the whole OB/LW folklore that has arisen around them—are useless for pretty much any question of practical importance. This is true both with regards to personal success and accomplishment (a.k.a. “instrumental rationality”) and pure intellectual curiosity (a.k.a. “epistemic rationality”).
From the point of view of a human being, the really important questions are worlds apart from anything touched by these neat academic categorizations of biases. Whom should I trust? What rules are safe to break? What rules am I in fact expected to break? When do social institutions work as advertised, and when is there in fact conniving and off-the-record tacit understanding that I’m unaware of? What do other people really think about me? For pretty much anything that really matters, the important biases are those that you have about questions of this sort—and knowing about the artificial lab scenarios where anchoring, conjunction fallacies, etc. are observable won’t give you any advantage there.
Note that this applies to your biases about abstract intellectual topics just as much as to your practical life. Whatever you know about any such topic, you know largely ad verecundiam from the intellectual authorities you trust, so that chances are you have inherited their biases wholesale. (An exception here is material that stands purely on rigorous internal logical evidence, like mathematical proofs, but there isn’t much you can do with that beyond pure math.) And to answer the question of what biases might be distorting the output of the official intellectual authorities in the system you live under, you need to ask hard questions about human nature and behavior akin to those listed above, and accurately detect biases far more complex and difficult than anything within the reach of simplistic behavioral economics.
Of course, the problem you ultimately run into is that such analysis, if done consistently and accurately, will produce results that clash with the social norms you live under, which leads to the observation that some well-calibrated instinctive bias towards conformity is usually good for you.
I really like this post. Could you make the link go both ways?
That said, I think you are overstating your case.
Also, if you figure out what local social norms are and that the stories are BS, you can accommodate the norms and ignore the stories internally. You can also optimize separate internal stories and external ones, or alternatively, drop out of the official story entirely and just be some guy who hangs around and is fun to talk to and mysteriously seems to always have enough money for his needs (the secret being largely that one’s needs turn out to be very cheap to fulfill, even extravagantly, if optimized for directly, and money is likewise easy to get if optimized for directly). If you aren’t dependent on others, don’t compete, don’t make demands, and are helpful and pleasant, you can get away with not conforming.
Sure.
If this isn’t a joke, how does it balance VM’s overstatement?
It’s an alternative to having a well-calibrated bias towards conformity.
I agree for most topics, but there are applied cases of clear importance. Investment behavior provides particularly concrete and rich examples, which are a major focus for the K&T school and the “libertarian paternalists” inspired by them: index funds as preferable to overconfident trading by investors, setting defaults of employee investment plans to “save and invest” rather than “nothing,” and so forth. Now, you can get these insights packaged with financial advice in books and the like, and I think that tends to be more useful than a general study of biases, but the insights are nonetheless important to the tune of tens or hundreds of thousands of dollars over a lifetime.
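As a rough illustration of the scale being claimed here, below is a minimal back-of-the-envelope sketch in Python. All of the inputs (annual contribution, market return, the drag from overconfident trading, and the time horizon) are illustrative assumptions rather than figures from the comment above; the only point is that a small annual drag compounds into a large lifetime gap.

```python
# Back-of-the-envelope comparison: low-cost index investing vs. the same
# savings subject to a behavioral drag from overconfident trading.
# Every number below is an assumption chosen for illustration.

def future_value(annual_contribution: float, annual_return: float, years: int) -> float:
    """Future value of a constant annual contribution, compounded yearly."""
    balance = 0.0
    for _ in range(years):
        balance = (balance + annual_contribution) * (1 + annual_return)
    return balance

CONTRIBUTION = 5_000      # dollars saved per year (assumed)
YEARS = 40                # working lifetime (assumed)
MARKET_RETURN = 0.07      # nominal annual market return (assumed)
BEHAVIORAL_DRAG = 0.02    # annual cost of fees, churn, mistimed trades (assumed)

index_fund = future_value(CONTRIBUTION, MARKET_RETURN, YEARS)
active_trading = future_value(CONTRIBUTION, MARKET_RETURN - BEHAVIORAL_DRAG, YEARS)

print(f"Index fund:     ${index_fund:,.0f}")
print(f"Active trading: ${active_trading:,.0f}")
print(f"Lifetime gap:   ${index_fund - active_trading:,.0f}")
```

With these particular assumptions the gap comes out to roughly $400,000; more conservative inputs still land in the tens of thousands, which is consistent with the magnitude claimed above.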
Worse than useless: they give the illusion of insight.
(And I feel like many comments on this post are themselves examples of that problem: as you put it in a different context, the equivalent of magic healing crystals is being talked about in a frighteningly serious manner.)
By Django, that needed saying!