I think a person’s politics is a good indicator of how rational they are. Current research bears me out that authoritarians are more susceptible to motivated reasoning (the current term of art for confirmation bias). Chris Mooney makes an excellent case that epistemic closure is more prominent among conservatives than it is among liberals. Climate change denial, free market fundamentalism, and a broad assortment of conspiracy theories and paranoid delusions are rampant on the far right today. The left is relatively free of such hysteria.
So, I agree with this at a very weak level. The question is how good an indicator is this? For example, I know a very successful mathematician who has extreme right-wing politics, and another who has extreme left-wing politics. I know a linguist who is a monarchist. The fact is that humans can be highly rational in one area while extremely irrational in another. Look for example at how much of the left holds extreme anti-nuclear-power, anti-GMO, and pro-alt-med views that have little connection to evidence. In the claim that the left is “relatively free” of such hysteria, the word “relatively” is doing a lot of work. Moreover, Moldbug’s views don’t fit into a standard notion of far-right.
Another issue to point out is that the studies which show a difference between left-wing and right-wing cognition are limited in important ways: the differences between the populations are quite small. Moreover, by other metrics, conservatives have more science knowledge than liberals on average. In fact, the GSS data strongly suggests that in general the most stupid, ignorant people are actually the political moderates. They have lower average vocabulary scores, and on average perform more poorly at answering basic science questions.
I don’t think a royalist follower of von Mises has anything interesting to say. Those who would admire such even less so.
So I’m deeply confused by this statement. You seem to be asserting that “Person X who says A will be extremely unlikely to have anything useful to say.” And asserting that “If Person Y thinks that Person X has interesting things to say about B despite X’s declaration of A, that makes the person Y even less likely to have useful things to say?” I’m curious: if we had a Person Z who pointed out that Y had interesting things to say about issue C, would Z become even less useful to listen to?
“The fact is that humans can be highly rational in one area while extremely irrational in another.”
Really? How do you know that? Why shouldn’t it be true that someone who is deeply wrong about one thing would also be wrong about another? Your counterargument is a common fallacy. I am referring to studies in which a population is tested for whatever it is the study is looking for. You, like so many others these days, counter by saying: “I knew this one guy, he wasn’t like that so your study must be wrong.” You are correct that global warming is true regardless of the politics of the person. However, the reverse is not true. The politics one has are strong indicators of how likely it is one holds beliefs that are not true.
There is in fact what is called the “smart idiot” effect. Conservatives who are better educated tend to be MORE wrong than their less educated base because they have more resources to bring to bear in rationalizing their fears. This is all about fear, you know. Certain people react very fearfully to change, like changing ideas about marriage, for example. They then marshal their intellectual abilities to defend their emotional priors. The fact that they can do so eloquently changes nothing.
--
“Moreover, by other metrics, conservatives have more science knowledge than liberals on average.”
So in responding to scientific studies that show differences between how authoritarians and liberals process data you cite… what? A blog? I am guessing that the blog you consider most relevant is that of Razib Khan.
Razib poses the question “are conservatives more scientifically literate than liberals?” Well that is a different question isn’t it? Furthermore the questions in his database search do not test for scientific literacy. They test for conformity. Which I am more than willing to admit conservatives would perform better at. If I repeat the social norm that astrology is unscientific do I have “more science knowledge” than someone who does not? Or am I simply aping the values of my tribe and signaling I am a beta male in good standing?
Liberals would predictably adopt scientific ideas outside the norm because they are interested in them and it is exciting to explore the new or odd for its own rewards, just as for a conservative it is comforting to reaffirm consensus beliefs. Both personalities are rewarded for their behavior: one for seeking out the new, the other for conformity to authority. Both are necessary for any healthy society. However, conservative personalities have a greater need for epistemic closure and are therefore more susceptible to a self-validating reality bubble.
Which is what we see today on the right in the US.
--
“In fact, the GSS data strongly suggests that in general the most stupid, ignorant people are actually the political moderates.”
As Razib himself says “The Audacious Epigone did not control for background variables.”
--
“You seem to be asserting that ‘Person X who says A will be extremely unlikely to have anything useful to say.’ And asserting that ‘If Person Y thinks that Person X has interesting things to say about B despite X’s declaration of A, that makes the person Y even less likely to have useful things to say?’”
Because the acolyte is always less than the master.
I prefer to cut Gordian knots rather than spend my days trying to untie them. So if it is true that Moldbug is a royalist and admires the fascist dictator Generalissimo Franco (who is still dead), then he is low on my stack of “people I should give a shit about”. Any followers even less so, because they can’t even be original about whose boots they should lick.
Ezra Pound was a great poet and likewise a fascist and admirer of Spain’s Franco. But poetry is art, and while I might be able to set aside my political opinions to make room for Pound, I would not consider anything he said outside of that to be of great value. There have been many artists who held political views I find repugnant, and there have been many of history’s monsters who created artifacts of great beauty. The samurai lords of feudal Japan created works of great beauty by night and literally hacked their peasants into bits by day. But art is one thing about which it is impossible to have “wrong” opinions.
I have to have a filter. If I do not have one I will spend all my time pursuing false trails and diving into rabbit holes that go nowhere. So… in my first reply in this thread I clicked on the first link to Moldbug’s pretentious twaddle on how he was going to teach people “true” economic theory. It was very kind of him, in my view, to make it clear from the beginning that he had no interest at all in economics as a science. So… someone who makes a thought error that bad, who thinks you can dictate what is true about economics: how likely is it that such a person would make the same thinking error in other disciplines? I think the odds are quite good. I did read a bit more before I closed the tab, and he does seem to have a way with words. So… there’s that, I guess.
If one wishes to understand a topic, my advice is to go to any university bookstore, get an undergraduate textbook, and read it. The odds are it is likely to be… wait for it… less wrong than some crank on the internet who thinks the academic world is conspiring against him. PLOP! Into the dustbin of history they go.
In economics that book will be Principles of Economics by N. Gregory Mankiw. It WON’T be some crackpot libertarian theory or the latest dribblings from the Austrian school. Why? Because utopian systems are not about describing what is (and therefore they cannot be about what could be); they are about creating a bubble to insulate oneself from the big bad world. Yes, yes, it is harsh; reality is truly frightening. It may well be that we have set into motion events that will lead to our extinction. When I was young it was the threat of nuclear war. Today it is the possibility of a global extinction event due to climate change. Perhaps tomorrow it will be a killer asteroid. But denial and retreat are not solutions.
Really? How do you know that? Why shouldn’t it be true that someone who is deeply wrong about one thing would also be wrong about another? Your counterargument is a common fallacy.
You should read the material linked from this LW wiki article on Compartmentalization.
My other reply got very long and this matter was essentially tangential so I’ve broken this off into a separate comment.
Furthermore the questions in his database search do not test for scientific literacy. They test for conformity. Which I am more than willing to admit conservatives would perform better at. If I repeat the social norm that astrology is unscientific do I have “more science knowledge” than someone who does not?
This seems to be more about word games than anything else. If someone believes that the Earth is round but doesn’t know why that’s commonly accepted, they still have a fact about the universe, and one that, if they think hard enough about it, probably pays rent. That they got to that result by “conformity” is both not obviously testable and not relevant in this context. Understanding that astrology doesn’t work is a perfect example of scientific knowledge. Moreover, I’m not completely sure what you mean by conformity. For example, I’ve never personally tested whether astrology works or not. Is it conformity to accept the broad set of scientific papers showing that it doesn’t work?
By the way, you can quote on less wrong by putting a “>” at the beginning of a paragraph. So if I write “> this” I get:
this
Moving on:
“The fact is that humans can be highly rational in one area while extremely irrational in another.”
Really? How do you know that? Why shouldn’t it be true that someone who is deeply wrong about one thing would also be wrong about another? Your counterargument is a common fallacy. I am referring to studies in which a population is tested for whatever it is the study is looking for. You, like so many others these days, counter by saying: “I knew this one guy, he wasn’t like that so your study must be wrong.”
No. That’s not the argument being made here. The argument being made is twofold: 1) exceptions exist (which doesn’t contradict the statistical claim), and 2) the statistics are actually weak effects. But if you prefer, consider the following situation: many parliamentary systems have a wide variety of political parties. Israel, for example, has 14 parties with representation in the Knesset. Almost any two parties agree on at least one issue and disagree on a variety of issues. That means that even if one party were correct about all issues, members of every other party would agree with it on at least one issue while disagreeing on many others; so there have to be a large number of people (possibly even a majority) who are correct about at least one issue but wrong on many other issues. Even in a system like the US, people have a variety of different views and don’t fall into two strict camps in many ways (here again the GSS data is worth looking at), so the claim that people are across-the-board rational or irrational just doesn’t make sense.
There is in fact what is called the “smart idiot” effect. Conservatives who are better educated tend to be MORE wrong than their less educated base because they have more resources to bring to bear in rationalizing their fears.
Sure, this is likely the cause of some of what is going on here, especially in regards to global warming. Moreover, more educated people are more likely to know what their own tribe is generally expected to believe and adjust their views accordingly.
“Moreover, by other metrics, conservatives have more science knowledge than liberals on average.”
So in responding to scientific studies that show differences between how authoritarians and liberals process data you cite… what? A blog? I am guessing that the blog you consider most relevant is that of Razib Khan.
I’m citing GSS data which happens to be discussed in more detail at a certain set of blogs. Note that the GSS data is freely available, so you can easily verify the claims yourself. Note also that phrasing this question as “authoritarian” v. “liberal” is even more misleading than your earlier statement about authoritarianism. The data in question is explicitly about self-identification as liberal or conservative, not about any metric of authoritarianism. Indeed, many viewpoints that are classically seen as “conservative” or “right-wing” are anti-authoritarian. For example, free market economics is a right-wing viewpoint.
“In fact, the GSS data strongly suggests that in general the most stupid, ignorant people are actually the political moderates.”
As Razib himself says “The Audacious Epigone did not control for background variables.”
Yes, and there are actually fascinating things that occur when you try to. If you control for income and education in the GSS, for example, then self-identified liberals outperform self-identified conservatives. But that’s not terribly relevant: the question here is, given someone’s political orientation, what should you expect about their knowledge level and the accuracy of their worldview across politics and other issues? The underlying causal issues are an interesting side issue but don’t touch on the basic question.
“You seem to be asserting that ‘Person X who says A will be extremely unlikely to have anything useful to say.’ And asserting that ‘If Person Y thinks that Person X has interesting things to say about B despite X’s declaration of A, that makes the person Y even less likely to have useful things to say?’”
Because the acolyte is always less than the master.
That doesn’t make any sense. You are essentially claiming that someone who says “That guy over there may be wrong about a lot of things, but he may have a handful of valid points” is more wrong than the person who believes all the wrong points. Essentially, this claim amounts to saying that being open to the possibility of a diamond in the coal is more irrational than thinking the coals are all diamonds. Do you see the problem?
I have to have a filter.
Sure, I think reading Moldbug is generally a waste of time and wouldn’t recommend that people read him. But that’s not the issue we’re discussing. Scroll back up a bit. The issue that started this subthread was the claim that some people on LW thinking Moldbug might be worth paying attention to meant that there was something deeply wrong with Less Wrong as a whole. That’s the context that’s relevant here (and in that context most of the rest of your comment isn’t germane).