“… than average” is (almost) meaningless

Recently, I was talking to a friend who hadn’t seen me in a while. They mentioned that my hair had grown noticeably, and then asked whether my hair grew fast or slow. I said that my hair growth was probably around average, but upon consideration, I realized that statement was so divorced from reality it would be a disservice to the notion of truth to even call it “false”. Not only do I not know how I compare to the average, I have no idea what the average even is.
It seems that people (myself included) make this kind of statement a lot, even when it’s obviously not backed by concrete evidence. So what’s going on here?
I: Analysis
When people (myself included) say that “I’m more/less [X] than average”, where [X] is anything that isn’t trivially measurable, they’re not using ‘average’ to mean any statistical notion of an average. Rather, they’re referring to their own mental generic-template-model of somebody who is “average at [X]”.
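As an aside, even the statistical senses of ‘average’ don’t agree with each other: mean, median, and mode can all give different answers on the same data, especially when the distribution is skewed. A toy calculation (the numbers here are invented purely for illustration):

```python
# Mean, median, and mode can all differ on the same data.
# The hours below are invented for illustration: a skewed
# distribution with one prolific outlier.
from statistics import mean, median, mode

weekly_hours = [0, 0, 0, 1, 1, 2, 10]

print(mean(weekly_hours))    # 2.0  (pulled up by the outlier)
print(median(weekly_hours))  # 1    (the middle person)
print(mode(weekly_hours))    # 0    (the most common case)
```

So "above average" can be true under one statistic and false under another, even before the mental-model problem enters the picture.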
The first problem with this is that this template person isn’t necessarily representative of any empirical measure of the average. It’s a Frankenstein’s monster cobbled together out of personal experience, half-remembered data, anecdotes from the internet, fictional ‘evidence’, and who knows what else. It doesn’t have zero correlation with reality, but it’s nowhere near 100%.
The second problem is that your mental model is not my mental model. Your idea of average is not the same as my idea of it. Once again, it’s not going to be maximally dissimilar, but I’d expect that there would be significant variance between people if you asked them to describe what their mental model of “average at [X]” actually entails.
So when somebody says, for example, they’re an above-average driver, you can’t conclude anything about their driving skills until you also know what they think an average driver is. Yet when you hear them say that, you automatically match those words to your idea of an average driver, and develop a preconception of their driving skills that is only tangentially rooted in evidence.
II: Which average?
Having explained this, what would it take for a statement like this to actually have a truth value? The most important difference between an evidence-based average and the mental-model ‘average’ is the existence of a data set to calculate the average from. In other words, what group are you comparing yourself against?
The global average, your national average, your city average, your social group average, etc., could all be wildly different. This is especially important if you’re comparing yourself to ‘the average’ as part of a judgement, because if you think you’re comparing yourself to one average but your mental model is using another, you could be judging in the wrong direction entirely.
A while ago, my friends and I were talking about our hobbies, and I mentioned I felt kind of bad for being below-average in my writing output. After some discussion, it was brought up that this friend group seems to have abnormally high creative output compared to other people in our socioeconomic demographic—it contains a semi-professional artist of moderate internet fame, the entirety of a cover band that performs at local events, and more.
I had formed my mental model of “average creative output” primarily based on this friend group, but I was using that model as if it were representative of the average hobbyist. After disentangling this typical-friend-group error, I concluded that although it didn’t feel worth my effort to find out the real average, I was quite sure that it was lower than I previously thought, so I should feel less bad about my (lack of) writing output.
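The reference-group mix-up above can be made concrete with a toy calculation. All of the numbers here are invented for illustration; the point is only that the same output can sit below one group’s average and above another’s:

```python
# Toy illustration of the typical-friend-group error: the same
# writing output flips from "below average" to "above average"
# depending on which reference group supplies the average.
# All numbers are invented for illustration.
my_weekly_writing_hours = 3.0

reference_groups = {
    "my friend group": [6, 8, 10, 12, 5],      # unusually prolific friends
    "hobbyists at large": [0, 1, 2, 0, 4, 1],  # a broader, less-filtered sample
}

for name, hours in reference_groups.items():
    avg = sum(hours) / len(hours)
    relation = "above" if my_weekly_writing_hours > avg else "below"
    print(f"{name}: average {avg:.1f} h/week -> I'm {relation} average")
```

With these made-up numbers, the friend group averages 8.2 hours per week (I’m below it) while the broader sample averages about 1.3 (I’m above it) — the comparison has no meaning until the group is pinned down.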
Maybe you feel bad about not being as smart as your friends, who are ‘obviously’ normal people (but due to some weird filter bubble are actually all geniuses). Or maybe you think you’re great at cooking, but your ‘average’ is your parents, whose idea of fine dining is putting an egg in their instant noodles. Think about which average you profess to be comparing against, and check if that’s the same average you’re in fact basing your mental model on. And if you can, go out and look up what the real average for your target data set is.
III: Conclusion
Is this real? (AKA am I ironically typical-minding after writing an entire post railing against a specific case of typical-minding?) I’m pretty sure it is, unless everybody except me goes through the mental effort of grounding their words in evidence when they say “I’m a terrible cook, way below average”.
Is this new? The more I write about this, the more it sounds like a case study of the typical-mind fallacy. But I think the most important part of knowing about cognitive biases is being able to notice and correct them in real life, and this was a successful exercise in that.
Is this useful? It helped me realize I have massive knowledge gaps about what being average at [X] actually means for many values of [X], and to think about which average I mean when I make comparisons.