The Problem of Thinking Too Much [LINK]
This was linked to twice recently, once in a Rationality Quotes thread and once in the article about mindfulness meditation, and I thought it deserved its own article.
It’s a transcript of a talk by Persi Diaconis, called “The problem of thinking too much”. The general theme is more or less what you’d expect from the title: often our explicit models of things are wrong enough that trying to think them through rationally gives worse results than (e.g.) just guessing. There are some nice examples in it.
What is a good heuristic for sorting things into “not worth reasoning about” and “worth reasoning about”?
Basically, if I can handle the situation off “intuition”, then I’m fine. If intuition fails, then I need to sit down, do some reasoning, and build a better intuition.
Or, in other words, if you KNOW you have a problem, then sit down and solve the problem. AVOID going off and looking just to see if you MIGHT have a problem.
The key here is that you should never reason about the immediate challenge. Don’t ask “what should I order today?”; instead ask “how should I determine my order in the future?” (I’m very fond of “pick the first item that looks appealing, then immediately put the menu away so I don’t see any competing options”: the utility lost from a sub-par choice is far outweighed by the utility I would have lost agonizing over the decision, and there’s a fair amount of research to back this up.)
As an example: I rarely have any trouble deciding what to wear in the morning. Since I don’t have a problem, I don’t think about it at all. A few months ago, however, I decided I wanted to wear skirts more, so I reasoned out that I should (a) buy more skirts and (b) rearrange my closet so that skirts are very accessible while my slacks are off to the side.
P.S. Not sure if it’s a useful pointer, but this idea came to me after a recent re-read of “The lens that sees its flaws”—the goal is not to spend time hunting down possible flaws, it’s to fix the flaws we are already aware of. It’s definitely an area I’m still just beginning to explore, though.
Hmm. Maybe: can you think of any biases that would influence this decision? If not, go with intuition. This requires prerequisite knowledge and practice to be useful, though.
Perhaps: if the outcome of a decision is not very important, go with your gut.
One offered in the article:
Maybe: when you think your answer is good enough, go with it. (So no second-guessing yourself, or thinking that, to be a good rationalist, you should spend more time thinking things through.)
Wow, that is indeed a nice example. It also reminds me of a problem I had to solve at work. I failed to Google up an existing solution, but I figured that was just due to my ignorance of proper terms to describe it. Now I’m not so sure.
Further reading:
http://en.wikipedia.org/wiki/Bin_packing_problem
http://en.wikipedia.org/wiki/Knapsack_problem
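For readers unfamiliar with the second link, here is a minimal sketch (not from the thread) of the standard dynamic-programming solution to the 0/1 knapsack problem; the weights, values, and capacity in the example are made up purely for illustration.

```python
def knapsack(weights, values, capacity):
    """Return the best total value achievable without exceeding capacity,
    taking each item at most once (the 0/1 knapsack problem)."""
    # best[c] = best value achievable with total weight <= c
    best = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        # Iterate capacities downward so each item is counted at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Hypothetical example: weights 3, 4, 5 with values 4, 5, 6 and capacity 8.
print(knapsack([3, 4, 5], [4, 5, 6], 8))  # -> 10 (take the weight-3 and weight-5 items)
```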
I agree. Don’t complicate. Simple is more efficient.