An interesting post. You started with the assumption that formal reasoning is the right way to go and found out that it's not necessarily so. Let me start from the opposite end, with the observation that the great majority of people reason all the time by pattern-matching: it is the normal, default, bog-standard way of figuring things out.
You do not need to “retrain” people to think in patterns—they do so naturally.
Looking at myself, I certainly do think in terms of patterns—internal maps and structures. Typically I carry a more-or-less coherent map of the subject in my head (with certain areas being fuzzy or incomplete, and that's fine), and the map is kinda-spatial. When a new piece of data comes in, I try to fit it into the existing (in my head) structure and see if it's a good fit. If it's not a good fit, it's like a pebble in a shoe—an irritant and an obvious problem. The problem is fixed either by reinterpreting the data and its implications, or by bending and adjusting the structure so there is a proper place for the new data nugget. Sometimes both happen.
Formal reasoning is atypical for me, which is why I'm not that good at math. I find situations where you have enough hard data to formally reason about to be rare (that would probably be different if I were an engineer or an accountant :-D). Most often you have stochastic reasoning with probability distributions and conditional outcomes, and that is amenable to analysis only at low levels. At high enough levels you're basically back to pattern recognition, ideally with some support from formal reasoning.
In any case, I'm not sure why you think that teaching people to think in patterns will be hard or will lead to major jumps in productivity. People already do this, all the time. Essentially you are talking about unlearning the reliance on formalism, which applies to very few people.
The reference class that I implicitly had in mind in writing my post is mathematicians / LWers / EAs, who do seem to think in the way that I had been thinking. See my post Many weak arguments and the typical mind.
People outside of this reference class generally use implicit statistical models that are not so great. For such people, the potential gains come from learning how to build much better implicit statistical models (as I did as a result of my exposure to data science). I don't know whether learning more advanced statistics would work for you personally—but for me, it was what I needed. Historically, most people who have very good implicit statistical models seem to have learned by observing others who do. But it can be hard to get access to such people (e.g. I would not have been able to connect with Greg Jensen, Holden's former boss, during my early 20s, as Holden did).
Mathematicians, yes, but that’s kinda natural because people become good mathematicians precisely by the virtue of being very good at formal reasoning. But I don’t know about LW/EA in general—I doubt most of them have “mathematical minds”.
People outside of this reference generally use implicit statistical models that are not so great.
Really? Math geeks/LW/EA are the creme de la creme, the ultimate intellectual elite? I haven’t noticed. “Normal” people certainly don’t have great thinking skills. But there is a very large number of smart and highly successful people who are outside of your reference class. They greatly outnumber the math/LW/EA crowd.
But I don’t know about LW/EA in general—I doubt most of them have “mathematical minds”.
Within the LW cluster I've seen a lot of focus on precision. It's not uncommon for people in the community to miss the main points that I'm trying to make in favor of focusing on a single sentence that I wrote that seems wrong. I have seldom had this experience in conversation with people outside of the LW cluster: my conversation partners there generally hold my view—that one will inevitably say some things that are wrong, and that it's best to focus on the main points that someone is trying to make.
Really? Math geeks/LW/EA are the creme de la creme, the ultimate intellectual elite? I haven’t noticed. “Normal” people certainly don’t have great thinking skills. But there is a very large number of smart and highly successful people who are outside of your reference class. They greatly outnumber the math/LW/EA crowd.
By “generally” I meant “most people,” not “for a fixed person” – i.e. I don’t necessarily disagree with you.
Separately, I believe that a large fraction of transferable human capital is in fact in elite math and physics, but that’s a long conversation. My impression is that good physicists do use the style of thinking that I just learned. In the case of elite mathematicians, I think it would take like 5 years of getting up to speed with real world stuff before their strength as thinkers started to come out vividly.
Of people worldwide, or of people reading this post? Considering the former leads to this failure mode.
Both.
Mathematicians are weird people, they think differently :-) I don’t think most of LW is mathematicians.
As a mathematician I can testify that even most mathematicians think in maps.