Perhaps, but it would surprise me if you don’t have hundreds of common sudoku patterns in your memory. Not entire puzzles, but heuristics for solving limited parts of the puzzle. That’s how humans learn. We do pattern recognition whenever possible and fall back on reason when we’re stumped. “Learning” substantially consists of developing the heuristics that allow you to perform without reason (which is slow and error-prone).
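To make the idea of "heuristics for solving limited parts of the puzzle" concrete, here is a minimal sketch of one of the simplest sudoku heuristics, the "naked single": if an empty cell has exactly one legal candidate, fill it in, no search or backtracking required. The function names and grid representation are illustrative, not from any particular sudoku library.

```python
# Grid is a 9x9 list of lists; 0 marks an empty cell.

def candidates(grid, r, c):
    """Digits that could legally go in the empty cell (r, c)."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)  # top-left corner of the 3x3 box
    used |= {grid[br + i][bc + j] for i in range(3) for j in range(3)}
    return {d for d in range(1, 10) if d not in used}

def fill_naked_singles(grid):
    """Repeatedly fill cells with exactly one candidate; return count filled."""
    filled = 0
    progress = True
    while progress:
        progress = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    cand = candidates(grid, r, c)
                    if len(cand) == 1:
                        grid[r][c] = cand.pop()
                        filled += 1
                        progress = True
    return filled
```

A pattern like this never solves a whole puzzle by itself; like the heuristics described above, it cheaply handles a limited part of the problem so that slower, more general reasoning is only needed for what remains.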
Right, and if doing computer-generated sudokus is a kata for developing the heuristics for doing sudokus, then perhaps solving computer-generated logic problems could be a kata for developing the heuristics for rationality.
The problem is that to be good at rationality, you need to be good at interacting with the real world, with all its uncertainty.
I think we need to distinguish between some related things here:
Rote learning is the stuff of katas, multiplication tables, etc. It’s not rationality in itself, but reason works best if you have a lot of reliable premises.
Developing heuristics is the stuff of everyday education. Most people get years of this stuff, and it’s what makes most people as rational as they are.
Crystallized intelligence is the ability to reason by applying heuristics. Most people aren’t very good at it, which is the main limitation on education. AFAIK, we don’t know how to give people more of it.
Fluid intelligence is the ability to reason creatively without heuristics. It’s the closest to what I mean by “rationality”, but also the hardest to train.
Executive function includes some basic cognitive processes that govern people’s behavior. Unfortunately, it is almost entirely heritable (estimates around 86–92%).
Interesting. I think of heuristics as being almost the same as cognitive biases. If it helps System 1, it’s a heuristic. If it gets in the way of System 2, it’s a cognitive bias.
Not a disagreement, just an observation that we are using language differently.
I basically agree. A heuristic lets System 1 function without invoking (the much slower) System 2. We need heuristics to get through the day; we couldn’t function if we had to reason out every single behavior. A heuristic becomes a bias when it’s dysfunctional, producing a poorly chosen System 1 behavior where System 2 could give a significantly better outcome.
One barrier to rationality is that updating one’s heuristics is effortful and often kind of annoying, so we always have some outdated heuristics. The quicker things change, the worse it gets. Too much trust in one’s heuristics risks biased behavior; too little yields indecisiveness.