How Much Thought
We have many built-in heuristics, and most of them are trouble. The absurdity heuristic makes us reject reasonable things out of hand, so we should take the time to fully understand things that seem absurd at first. Some of our beliefs are not reasoned, but inherited; we should sniff those out and discard them. We repeat cached thoughts, so we should clear and rethink them. The affect heuristic is a tricky one; to work around it, we have to take the outside view. Everything we see and do primes us, so for really important decisions, we should never leave our rooms. We fail to attribute agency to things which should have it, like opinions, so if less drastic means don’t work, we should modify English to make ourselves do so.
All of these articles bear the same message, the same message that can be easily found in the subtext of every book, treatise and example of rationality. Think more. Look for the third alternative. Challenge your deeply held beliefs. Drive through semantic stop signs. Prepare a line of retreat. If you don’t understand, you should make an extraordinary effort. When you do find cause to change your beliefs, complete a checklist, run a script and follow a ritual. Recheck your answers, because thinking helps; more thought is always better.
The problem is, there’s only a limited amount of time in each day. To spend more time thinking about something, we must spend less time on something else. The more we think about each topic, the fewer topics we have time to think about at all. Rationalism gives us a long list of extra things to think about, and angles to think about them from, without guidance on where or how much to apply them. This can make us overthink some things and disastrously underthink others. Our worst mistakes are not those where our thoughts went astray, but those we failed to think about at all. The time between when we learn rationality techniques and when we learn where to apply them is the valley.
Reason, like time and money, is a resource. There are many complex definitions of reason, but I will use a simple one: reason is time spent thinking. We mainly use our reason to make decisions and answer questions; if we do it right, the more reason we spend, the more likely our answer will be correct. We might question this analogy on the basis that we can’t directly control our thoughts, but then, many people can’t directly control their monetary spending, either; they impulse buy. In both cases, we can control our spending directly, using willpower (which is also a limited resource), or indirectly by finding ways to adjust our routine.
This model is convenient enough to be suspicious, so we should apply some sanity checks to make sure it all adds up to normality. The utility we get from thinking about a decision is the cost of deciding incorrectly, times the probability that we’ll change our mind from incorrect to correct minus the probability that we’ll change our mind from correct to incorrect. From this, we get the highly normal statements “thinking has higher expected utility when you’re likely to change your mind” and “thinking has higher expected utility when the subject is important.” With a resource model of reason, we should also expect simple representations for surpluses and shortages. A surplus of reason manifests as boredom; we are bored when we have nothing to do but think, and nothing interesting to think about. A shortage of reason manifests as stress; we’re stressed when we have too much to think about.
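To make that arithmetic concrete, here is a minimal sketch in Python. The function simply restates the formula above; the scenarios and all of the numbers are invented for illustration, since in practice these quantities can only be roughly guessed.

```python
# Expected utility of spending more thought on a decision, per the formula above:
# (cost of deciding incorrectly) * (P(flip incorrect -> correct) - P(flip correct -> incorrect))

def expected_value_of_thought(cost_of_wrong_decision, p_fix, p_break):
    """Return the expected utility of thinking further about a decision.

    cost_of_wrong_decision -- how much deciding incorrectly would cost, in whatever units you value
    p_fix   -- probability that more thought flips an incorrect answer to a correct one
    p_break -- probability that more thought flips a correct answer to an incorrect one
    """
    return cost_of_wrong_decision * (p_fix - p_break)

# A trivial decision: a wrong restaurant order costs little, and more thought rarely changes it.
print(expected_value_of_thought(cost_of_wrong_decision=5, p_fix=0.05, p_break=0.01))       # ~0.2

# An important decision: a bad job choice costs a lot, and careful thought might well change it.
print(expected_value_of_thought(cost_of_wrong_decision=10_000, p_fix=0.30, p_break=0.05))  # ~2500
```

Both of the “highly normal statements” fall straight out of this: raise the stakes or the chance of changing your mind, and the expected value of thinking rises with them.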
When we consider costs as well as benefits, it becomes possible to reason about which techniques are worthwhile. It is not enough to show that a technique will sometimes illuminate truth; to justify its cost, it must be marginally more likely to illuminate truth than the next best technique. On easy questions of little consequence, a single cached thought or a simple heuristic will suffice. On hard problems, most techniques will fail to produce any insight, so we need to try more of them.
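As a toy illustration of that comparison (the technique names, insight probabilities, and time costs below are all invented; in real life they can only be felt out, not measured), one could rank candidate techniques by expected value net of the time they consume, and apply one only when it beats spending the same time elsewhere:

```python
# Compare thinking techniques by expected value net of their time cost.
# A technique is worth applying only if its net value beats the next best
# use of the same time; all numbers here are made up for illustration.

techniques = {
    "cached thought":         {"p_insight": 0.20, "minutes": 0.1},
    "simple heuristic":       {"p_insight": 0.30, "minutes": 0.5},
    "outside view":           {"p_insight": 0.50, "minutes": 5.0},
    "full decision analysis": {"p_insight": 0.60, "minutes": 30.0},
}

def best_technique(stakes, value_per_minute):
    """Return the technique with the highest net expected value, or None if none is worth its cost."""
    best_name, best_net = None, 0.0
    for name, t in techniques.items():
        net = stakes * t["p_insight"] - value_per_minute * t["minutes"]
        if net > best_net:
            best_name, best_net = name, net
    return best_name

print(best_technique(stakes=2, value_per_minute=1.0))    # 'cached thought' -- cheap question, cheap method
print(best_technique(stakes=500, value_per_minute=1.0))  # 'full decision analysis' -- high stakes justify the cost
```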
Our mind is built on heuristics because they’re efficient. “Heuristic” is not a bad word; a heuristic is simply a way of answering a question cheaply. You shouldn’t base core beliefs or important choices on heuristics alone, but for minor decisions a few simple heuristics may be all you can afford. Core beliefs and important choices, on the other hand, spawn a tree of sub-questions, the leaves of which are answered by heuristics or cached thoughts.
The Overcoming Bias articles on heuristics treat them like villains that sabotage our thoughts. The standard way to prove that a heuristic exists is to present an example where it leads us astray. That means teaching readers, not to avoid using heuristics where they’re inappropriate, but to avoid using them entirely. Fortunately, the architecture of our minds won’t let us do that, since eliminating a heuristic entirely would make us much stupider. Instead, we should focus on learning and teaching what they feel like from the inside, with examples where they lead us astray and examples where they work properly.
In general, the expected return on investment for thinking about a topic starts high, as initial thoughts cut through confusion and affect our decision greatly, then drops as the most productive lines of reasoning are depleted. Once the expected return drops below some threshold, we should stop thinking about it.
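Here is a sketch of that stopping rule, again with invented numbers (in practice the expected returns can only be judged by feel): keep thinking only while the next increment of thought is expected to be worth more than spending it elsewhere.

```python
# Spend thought on a topic only while the expected return of the next increment
# of thinking stays above a cutoff. The returns list is illustrative: early
# thoughts cut through confusion, later ones add less and less.

def increments_worth_thinking(expected_returns, threshold):
    """Return how many increments of thought are worth spending on a topic.

    expected_returns -- estimated value of each successive chunk of thinking, typically decreasing
    threshold        -- the value of spending that chunk of thought on something else instead
    """
    spent = 0
    for value in expected_returns:
        if value < threshold:
            break  # the next thought is worth less than its opportunity cost; stop here
        spent += 1
    return spent

diminishing_returns = [10.0, 6.0, 3.0, 1.5, 0.7, 0.3, 0.1]
print(increments_worth_thinking(diminishing_returns, threshold=1.0))  # 4 -- stop once returns drop below 1.0
```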
Normally, the process for allocating reason works automatically and works well. However, sometimes it breaks. Sometimes we run into questions that we simply can’t resolve with the information we have available. If the question is important, we run through our entire repertoire of techniques before giving up, perhaps guessing, and moving on. If it’s less important, we try only the techniques that we think are likely to work before we give up. If you teach someone more techniques, then you increase the amount of time he can spend on a topic before running out of angles and being forced to move on. If those techniques fail to produce insight, then they have made him stupider; he will spend more time on questions for little benefit, and ignore more of them entirely. Some people are completely unable to budget their reason, like the man who spends ten minutes deciding what to order in a restaurant, knowing full well that he would be happier spending those ten minutes focused on conversation instead. If you teach him enough statistics, he might be foolish enough to try to calculate the probability of various dishes making him happy. He’ll fail, of course, because statistics can’t answer that question with the data he has, but he’ll waste even more time trying.
It would be nice to have more reason, but evidence points to cognitive capacity being a fixed quantity. We can, however, allocate the reason we do have more efficiently. We can set cutoff points to limit the time spent on silly things like restaurant orders. Some things are known to be wastes of reason; politics is the mind-killer because it can absorb an unlimited amount of mental energy without producing the slightest bit of utility. We can identify the thoughts that are least valuable to us by observing where our mind goes when we’re bored: mostly, we daydream and retread old lines of thought. That means that when there are worthwhile topics to think about, daydreaming and retreading should be the first things to go. This conclusion shouldn’t surprise anyone, but it’s good to have theoretical justification.
Take a moment to think about what you spend time thinking about, and where your cutoff point is. Do you keep thinking about the same topic well past the point where insights stop coming? Do you get distracted and stop too early? If you decide unconsciously, would your conscious choice be the same or different?