It’s not an agenda in the sense of a political agenda (though it does have some connections to political ideas), nor a conspiracy, nor a consciously intended and promoted agenda.
But they have a bunch of unconscious ideas—a particular worldview—which informs how they approach their research, and because they do not use the rigor of science that prevents such things, their worldview/agenda biases all their results.
The proper rigor of science includes things like describing the experimental procedure in your paper so mistakes can be criticized and it can be repeated without introducing unintended changes, and having a “sources of error” section where you discuss all the ways your research might be wrong. When you leave out standard parts of science like those, and other more subtle ones, you get unscientific results. The scientific method, as Feynman explained, is our knowledge about how not to fool ourselves (i.e. it prevents our conclusions from being based on our biases). When you don’t use it, you get wrong, useless and biased results by default.
One of the ways this paper goes wrong is that it doesn’t pay enough attention to the correct interpretation of the data. Even if the data were not itself biased—which they openly admit it is—their interpretation would be A) problematic and B) not argued for by the data itself. (Interpretations of data are never argued for by the data itself, but must be considered as a separate, philosophical issue!)
If you try hard enough, you can get people to make mistakes. I agree with that much. But what mistake are the people making? That’s not obvious, and the authors don’t seriously discuss the matter. For example, how much of the mistake is due to miscommunication—people reading the question they are asked as having a meaning a bit different from the literal meaning the researchers consider the one true meaning? The possibility that the phenomenon they were observing, in whole or in part, is an aspect of communication rather than of biases about probability is simply not addressed. Many other issues of interpretation of the results aren’t addressed either.
They simply interpret the experimental data in a way in line with their biases and unconscious agendas, and then claim that empirical science has supported their conclusions.
It’s not an agenda in the sense of a political agenda (though it does have some connections to political ideas), nor a conspiracy, nor a consciously intended and promoted agenda.
But they have a bunch of unconscious ideas—a particular worldview—which informs how they approach their research
Yes, I agree, and the ideas are not all unconscious either. What do you think the worldview is? I’m guessing it includes ideas like: animals create knowledge, but not as much as people; and nature (genes) influences human thought, leading to biases that are difficult to overcome and to special learning periods in childhood. It’s a worldview that denies people their autonomy, isn’t it? I’d guess most researchers looking at this stuff are politically left, unaware of good philosophy, and have never paid close attention to issues like coercion.
I think they would sympathize with Haldane’s “queerer than we can suppose” line (quoted in BoI) and the principle of mediocrity (in BoI).
There’s something subtle but very wrong with their worldview that has to do with the difference between problem finding and problem solving. These people are not bubbling with solutions.
A lot of what they are doing is excusing faults. Explaining faults without blaming human choices. Taking away our responsibility and our ability to be responsible. They like to talk about humans being influenced—powerless and controlled—by small and subtle things. This connects with the dominant opinion on Less Wrong that morality does not exist.
They have low standards. They know their “science” is biased, but it’s good enough for them anyway. They don’t expect, and strive for, better. They think people are inherently parochial—including themselves, who they consider only a little less so—and they don’t mind.
Morality can’t exist without explanations, btw, and higher-level concepts. Strong empiricism and instrumentalism—which dominate Less Wrong—destroy it pretty directly.
They would not like Ayn Rand. And they would not like Deutsch.
Together, we explored the psychology of intuitive beliefs and choices and examined their bounded rationality.
They take for granted that rationality is bounded and then seek out ways to show it, e.g. by asking people to use their intuition and then comparing that intuition against math—a dirty trick, with a result easily predictable in advance, in line with the conclusion they assumed in advance. Rationality is bounded—they’ve known that since college merely by examining their own failings—and they’re just researching where the bounds are.
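For concreteness about what “comparing intuition against math” means here: the canonical case is Kahneman and Tversky’s conjunction-fallacy (“Linda”) experiments, where people’s intuitive rankings are scored against the conjunction rule of probability. A minimal sketch with made-up numbers (my own illustration, not anything from their papers):

```python
# Hypothetical numbers, not data from any study: a minimal check of the
# conjunction rule, P(A and B) <= P(A), which "Linda problem" style
# experiments score people's intuitive rankings against.

p_teller = 0.05                 # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.4   # assumed P(feminist | bank teller)

# Product rule: P(teller and feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

# The conjunction can never be more probable than either conjunct,
# whatever numbers you pick, so any intuitive ranking that puts the
# conjunction first gets scored as a "fallacy".
assert p_both <= p_teller
print(p_both)
```

Whatever values you plug in, the math guarantees the intuitive answer can be marked wrong, which is why the outcome of such experiments is predictable in advance.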
Why did the jump to universality occur in our Western society and not elsewhere? Deutsch rejects the explanations of Karl Marx, Friedrich Engels and Jared Diamond that the dominance of the West is a consequence of geography and climate
That’s another aspect of it. It’s the same kind of thing. If you establish how biased we are, then our success or failure depends not on us—human ideas and human choices—but on parochial details like our environment and whether it happens to be one our biases will thrive in or not.
EDIT:

http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf

http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=415636