Are search engines perpetuating our biases?
One of the great things about the internet is that there is a social group for almost every interest. Pick an unusual hobby or ideology, and there is probably an online community centered around it. This is especially wonderful for those of us who never quite fit into mainstream society.
But there’s also a downside to this aspect of the internet: the more we immerse ourselves in these small online communities, the less exposure we get to the rest of the world. And the less exposure we get to the rest of the world, the easier it is to hold onto false beliefs that the rest of the world rejects. (Of course, it’s also easier to hold onto true beliefs that the rest of the world rejects.)
For instance, suppose you believe that pasteurizing milk makes it less healthy, and we should all drink our milk raw. (I picked this example because it’s something decidedly non-mainstream that I believe with high probability.) I’m fairly susceptible to social pressures, so at least for me, my belief in this proposition goes up when I’m hanging out with intelligent people who agree with me, and it goes down when I’m hanging out with intelligent people who look at me like I’m insane when I claim such a thing. They don’t need to state evidence in either direction to influence my belief-probability, though that certainly helps. The important thing is that I think they’re smart and therefore I trust their opinions.
Unsurprisingly, if I spend most of my time hanging out with normal, intelligent, scientifically-minded Americans, I start to question my beliefs regarding raw milk, but if I spend all my time on raw-milk-promoting websites, then my belief that raw milk is good for us is reaffirmed.
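To make the "belief-probability" framing concrete, here is a minimal sketch of what that social updating might look like if each intelligent peer's reaction were treated as a weak, independent piece of evidence. The prior and the likelihood ratios below are numbers I made up purely for illustration; nothing in the post pins them down.

```python
# Toy Bayesian sketch: treat each intelligent peer's opinion as weak evidence.
# The prior (0.7) and the likelihood ratios are invented for illustration only.

def update_on_peers(prior_prob, n_agree, n_disagree,
                    lr_agree=2.0, lr_disagree=0.5):
    """Posterior probability after hearing peers agree or disagree.

    Each agreement multiplies the odds by lr_agree; each disagreement
    multiplies them by lr_disagree, i.e. peers are modelled as weak,
    independent evidence about the claim itself.
    """
    odds = prior_prob / (1 - prior_prob)
    odds *= lr_agree ** n_agree * lr_disagree ** n_disagree
    return odds / (1 + odds)

# Start fairly confident that raw milk is healthier (prior 0.7).
print(update_on_peers(0.7, n_agree=3, n_disagree=0))  # ~0.95 after three agreeing friends
print(update_on_peers(0.7, n_agree=0, n_disagree=3))  # ~0.23 after three "you're insane" looks
```

The exact numbers don't matter; the point is that a handful of smart people agreeing or wincing is enough to move the probability noticeably in either direction.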
We like having our beliefs affirmed; it makes us happy when other people think we are right about things. We’d rather seek out people who agree with us and can relate to our mindsets than seek out groups where everyone disagrees with us strongly. This is normal and reasonable, and it’s why all of us rationalists are hanging out here on LessWrong instead of lurking in creationist forums. However, it does put us at risk of creating feedback loops: people we respect propose unusual ideas, we affirm those ideas, others repeat them, and their growing prevalence makes them seem more credible, which invites still more repetition. Many of those who disagree hesitate to voice their disagreement for fear of rejection. As a result, LessWrong perpetuates many ideas that the rest of the world considers somewhat odd. Also, the rest of the world perpetuates many ideas that we at LessWrong consider extremely odd.
I’m not saying anything new here, I know. Everything I’ve written so far has been discussed to death on LessWrong, and if I were less lazy this article would be full of links to the Sequences. If I recall correctly, the Sequences recommend countering this problem by recognizing that we have these biases and consciously trying to correct for them.
I try to do this, but I also tend to employ an additional solution to this problem. Because I recognize that I’m easily influenced by others’ beliefs, I make sure to expose myself to a myriad of different belief systems. For instance, in politics, I read blogs by liberal feminist scientists as well as conservative anti-feminist traditionalists. Since I respect the authors of all the blogs I read, and recognize that they are intelligent people who have thought deeply about their perspectives, I can’t easily dismiss either perspective outright as lies spoken by a moron. Since their beliefs differ so radically, I also can’t just fall into the trap of believing everything I read. So I’m forced to really think about the ideas, and question why their proponents believe them, and consolidate them (and other thoughts I might have) into my own coherent worldview.
Thus, I consider it important to be exposed to the ideas of people I disagree with. Meeting intelligent people who think differently than I do keeps my mind open, reminds me that there are things about the world I don’t yet know, and keeps me from overestimating the probability that my beliefs are true.
Unfortunately, search engines like Google are making it more difficult for me to do so. About a week ago, I attended a lecture on information retrieval, and I was shocked to find out exactly how much our Google searches are customized to our own preferences.
Suppose John and Mary both Google something like “creationism”. Now suppose that John is an atheist who reads a lot of atheist forums, and Mary is a fundamentalist Christian who spends most of her time on Christian forums. John’s Google results might contain a lot of links to people on his favorite atheist website talking about how much creationism sucks, and Mary’s Google results might contain a lot of links to her friends’ blogs talking about how God created the earth.
In this example, John and Mary are both having their beliefs reaffirmed, because Google is presenting them with things they want to hear. They will not be exposed to opposing viewpoints, and will be much less likely to change their minds about important issues. In fact, their beliefs in their own viewpoints will probably grow stronger and stronger each time Google gives them back these results, and they will become less and less aware that another viewpoint exists.
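For concreteness, here is a toy sketch of how this kind of personalization could work. This is not Google’s actual ranking algorithm; the domain names, click counts, and scoring rule are all invented, and a real system uses far richer signals. The point is only that the same query can come back in a different order for different users.

```python
# Toy personalized re-ranker (NOT Google's actual algorithm).
# Assumption: results from domains the user has clicked before get boosted.

from urllib.parse import urlparse

def personalized_rank(results, click_history, boost=2.0):
    """Re-rank (url, base_relevance) pairs, favouring familiar domains.

    click_history maps domain -> number of past clicks by this user.
    """
    def score(item):
        url, relevance = item
        domain = urlparse(url).netloc
        # Scale base relevance by how often this user visits the domain.
        return relevance * (1 + boost * click_history.get(domain, 0))

    return sorted(results, key=score, reverse=True)

# The same query returns the same candidate pages for everyone...
results = [("https://atheistforum.example/creationism-debunked", 0.8),
           ("https://christianblogs.example/how-god-created-the-earth", 0.8)]

# ...but each user's (hypothetical) click history reorders them.
john_history = {"atheistforum.example": 5}
mary_history = {"christianblogs.example": 7}

print(personalized_rank(results, john_history)[0][0])  # John's top hit: the atheist forum
print(personalized_rank(results, mary_history)[0][0])  # Mary's top hit: the Christian blog
```

Under this made-up scoring rule, John and Mary each see their own community's page first, which is exactly the reaffirmation loop described above.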
Of course, this might happen even without Google filtering its search results. John might deliberately avoid reading the views of creationists, or dismiss them outright as moronic, or never Google anything that might lead him to their webpages, because he is convinced of his beliefs and would rather have them affirmed than contradicted. Since he would skip past the fundamentalist Christian blog results anyway, Google is doing him a service by ranking the stuff he cares about higher.
But at least for me, this Google filtering is a bad thing. I want to see webpages that present other viewpoints, instead of being led back to the same places over and over again. And if Google doesn’t show them to me when I search for them, and I don’t realize that my search results are being customized, I might never realize there’s something I’m missing, or go looking for it.
I’m probably making this sound more dire than it actually is. Obviously, I can try other search terms, or just ignore websites I’ve already been to. Or I can follow links on other websites and wander off into regions of the internet without the help of Google. But I still have a visceral reaction against search engines customizing their results to fit my individual ideological preferences, because they are perpetuating my biases without giving me any direct control over which pieces of information I receive.
What do you guys think?