I apologize for jumping to conclusions. This is sort of why I think getting into specifics is important. If you just make a vague hand-wavey ‘this might not be true’ dismissal of a claim you leave your interlocutor with little choice but to try and guess what your true objection is and so read too much into your comment.
If you just make a vague hand-wavey ‘this might not be true’ dismissal of a claim
This isn’t what I did. My criticism was fairly focused, with a fairly specific solution:
Whenever intelligence is mentioned in a comparative or quantitative way, care should be taken to indicate exactly which dimension of intelligence is being measured. [...] it would be sufficient to indicate the particular test that was being used.
The part that had you thinking I was dismissing the claim was probably this:
When it comes to experimental studies in social science and psychology, I always weight their result low compared to my own observations of a lifetime, because if I’ve observed anything, I’ve observed that things are complex, and I know we haven’t developed tools to handle this complexity.
It probably would have been wise to omit this sentence, since it led to so much misreading of my intentions. My point is that researchers do try to tackle complex subjects, like intelligence: they measure something or run some experiment and report the results, but the interpretation or relevance of the study is either ‘spin’ in the abstract or heavily dependent on the reader’s lifetime experience to assess.
For example, what is “intelligence”? This is something a group of researchers has to define, and measure in some way, in order to do their study and get it published. Consider the Deary study. They measured something and called it general intelligence. That part is the spin. However, when you look at how they defined “general intelligence”—this is a scientific paper; they do tell us, and they’re specific—it is patent that they didn’t include social intelligence, emotional intelligence, or “street smarts” in this conception of intelligence. Requiring this clarification isn’t dismissing the study results; it’s just emphasizing that the context and the specifics are important.