Nice post! You didn’t explicitly ask for criticism, but I’m going to give some anyway:
I think the standard font-size on LessWrong is smaller. Most people would prefer it if you used that.
I think there’s definitely interest on LessWrong for improving intuition, but I would frame it as “Training intuition to make its judgements more rational” rather than (as your post leans towards) “Forget rationality and harness our natural biases!”. This is mostly just a terminological difference.
The System 1/System 2 distinction is really between System 1 being (fast, intuitive, subconscious) and System 2 being (slow, deliberative, conscious). Around these parts, the word “rationality” tends to be used to mean something like “succeeding by using any and all means”. Under this definition, rationality can use both System 2 and System 1 type thinking. Thus I believe your post could be improved by taking the sentences where intuition is contrasted with “rationality” and replacing the word “rationality” with something like “deliberate thought” or “System 2”.
As I say above, this is really just a terminological difference, but I think that making it will clarify some of the ideas in the post. In particular, I think that the main content of the post (the seven ways of improving our heuristic judgements) is really useful instrumental rationality, but that the introduction and conclusion hide it in poorly supported claims that reducing bias is less important than using good heuristics. I find it strange that these things are even presented in contrast; the techniques you give for improving intuition are techniques for reducing bias. The intuitive judgements become more accurate (i.e. less biased) than they were before.
Some bits of the two “Concluding thoughts” paragraphs seem especially wishy-washy. A general sentiment of “System 1 should work in harmony with System 2” sounds nice, but without any data to back it up it could just be complete bollocks. Maybe we should all be using System 1 all the time. Or maybe there are some activities where System 1 wins and some where System 2 wins. If so, which activities are which? Are firefighters actually successful decision makers?
One final thought: Do the seven methods really focus on System 1 only? Many of them seem like general-purpose techniques, and in particular I think that 4, 5, 6, and 7 are actually more System 2.
After doing a bit more reading here and thinking about your comments, I think I’ll focus on the seven methods and eliminate much of the low-quality fluff that makes up the intro/conclusion for the next version.
I think some of my confusion was due to unsubstantiated assumptions about the standard views of LessWrong. What I’ve been thinking of as bias is closer to inductive bias than to the standard definition, which refers to systematic error patterns. I then interpreted rationality as “overcoming bias”. Inductive bias can be useful, and the idea of overcoming bias of that type seemed to be taking things too far. That doesn’t seem to be what anyone here is actually advocating, though.
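To make that distinction concrete, here is a minimal sketch (the data and function names are my own, purely illustrative): an inductive bias, such as assuming the data is linear, is what lets a learner generalise beyond the examples it has seen, whereas a bias-free memoriser can say nothing about new inputs. A cognitive bias, by contrast, is a systematic error pattern, which is a different thing entirely.

```python
# Illustrative sketch: an inductive bias (here, assuming linearity) lets
# a learner generalise beyond the points it has seen, while a pure
# memoriser with no such bias cannot predict anything about new inputs.

def fit_line(points):
    """Least-squares fit of y = a*x + b -- the linear inductive bias."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

train = [(1, 2.0), (2, 4.1), (3, 5.9)]  # roughly y = 2x

memorizer = dict(train)   # no inductive bias: pure lookup of seen points
line = fit_line(train)    # linear inductive bias

print(line(10))           # useful extrapolation, roughly 20
print(memorizer.get(10))  # None: the memoriser cannot generalise
```

In this toy case the linearity assumption is an asset, not an error, which is why “overcoming” that sort of bias seemed to me like taking things too far.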
I think the standard font-size on LessWrong is smaller. Most people would prefer it if you used that.
Apologies for the font size, I was editing in Google Docs rather than the post editor...
As I say above, this is really just a terminological difference, but I think that making it will clarify some of the ideas in the post. In particular, I think that the main content of the post (the seven ways of improving our heuristic judgements), is really useful instrumental rationality, but that the introduction and conclusion hide it in poorly backed up statements about how reducing bias is less important than using good heuristics. I find it strange that these things are even presented in contrast; the techniques you give for improving intuition are techniques for reducing bias. The intuitive judgements become more accurate (i.e. less biased) than they were before.
I admit, terminology is an issue. I perhaps bit off more than I could chew for a first post. I’ll try to fix that.
One final thought: Do the seven methods really focus on System 1 only? Many of them seem like general-purpose techniques, and in particular I think that 4, 5, 6, and 7 are actually more System 2.
From the way Klein describes them, they are meant to accelerate expertise. If my interpretation is correct, they use System 2 to develop System 1 for the next scenario. I think part of the problem with how I’m describing this is that experience, which is instrumental in developing expertise, develops intuition. Intuition can either help or hurt. Sometimes we won’t know which until after a decision has been made; other times we might be able to prevent mistakes by running through a checklist of cognitive biases. In the former case, the methods should help next time. In the latter case, you need something (from System 1, for example) to run through the checklist. The checklist on its own isn’t very useful.
Again, thanks.
Good point.
Up. Small steps in goal achievement are good if you already have a minimal list to fulfil. Think about biases when cookies are nearby.