And just because we couldn’t do it perfectly doesn’t mean we’re not better than the alternatives.
I wonder how well a group whose members didn’t study how to think and instead devoted themselves to not letting emotions interfere with their decisions would do. All its work would be advances, I think—there would be no analog to the “valley of rationality” in which people lost touch with their intuitions and made poor decisions.
In fact, I would assert the exact opposite: that attempting to remove emotions from decision-making is what causes the “valley of rationality.” Furthermore, I suspect it is a necessary transitional phase, comparable in its horrific necessity to the process of re-breaking a mangled shinbone so that it can heal straight.
attempting to remove emotions from decision-making is what causes the “valley of rationality.”
I disagree with the implication of this. I think the main causes are misusing tools like Bayesian updating, and asking what an ideal rationalist would do and then trying to do that.
Insofar as poorly calibrated emotions are part of the problem, to calculate what aspiring rationalism is actually responsible for, one must subtract the problems those emotions would have caused under non-aspiring-rationalist conditions from the problems they cause under aspiring-rationalist conditions. I don’t think this usually leaves much left over, positive or negative.