The Temptation to Bubble
“Never discuss religion or politics.”
I was raised in a large family of fundamentalist Christians. Growing up in my house, where discussing politics and religion was the main course of life, the above proverb was said often—as an expression of regret, shock, or self-flagellation. Later, the experience impressed on me a deep lesson about a trap of bubbling up that even intelligent and rational people fall into. And I … I am often tempted, so tempted, to give in.
Religion and political identity were the languages of love in my house. Affirming the finer points of a friend’s identical values was a natural ritual, like sharing coffee or a meal together, and so soothing we attributed the afterglow to God himself. We can use some religious nonsense to illustrate, but please keep in mind, there’s a much more interesting point here than “certain religious views are wrong”.
A point of controversy was an especially excellent topic of mutual comfort. How could anyone else be *so* stupid as to believe we came from monkeys, and monkeys came from *nothing* that exploded a gazillion years ago, especially given all the young-earth creation evidence they stubbornly ignored! They obviously just wanted to sin and needed an excuse. Agreeing about something like this, you both felt smarter than the hostile world, and you had someone to help defend you against that hostility. We invented byzantine scaffolding for our shared delusions to keep the conversation interesting and agree with each other in ever more creative ways. We esteemed each other, and ourselves, much more.
This safety bubble from the real world allowed denial of anything too painful. Losing a loved one to cancer? God will heal them. God mysteriously decided not to this time? They’re in Heaven. Did your incredible stupidity lose you your job, your wife, your reputation? God would forgive you and rescue you from the consequences. You could probably find a Bible verse to justify anything you’re doing. Ironically, this artificial shell of safety, which kept us from ever facing the pain and finality that reality often brings, made us all the more fragile inside. The bubble became necessary for psychological survival.
In this flow of happy mirror-neuron dances, a minor disagreement felt like a slap in the face, and the shock of it burned long after the sting.
Twenty-five years, and what seems like 86 billion light-years, of questioning, testing, and learning from that world-view later, I can see that even beyond religion, people fall into bubbles easily. The political conservatives only post articles from conservative blogs. The liberals post from liberal news sources. Neither has ever gone hunting on the opposing side for ways to test their own beliefs, not even once. Ever debate someone over a bill that they haven’t even read? All their info comes from the pravda wing of their preferred political party / street gang; none of it is first-hand knowledge. They’re in a bubble.
Three of the most popular religions that worship the same God will each tell you the others are counterfeits, despite the shared moral codes, values, rituals, and traditions. Apple fanboys wholesale swallowed the lies about their OS and machines being immune to viruses, without ever having read a single article on an IT-security blog. It’s not just confirmation bias at work; people live in an artificial bubble of information sources that affirm their identity, soothe their egos, and never test any idea they have. Scientific controversies create bubbles no less. But it doesn’t even take a controversy, just a preferred source of information—news, blogs, books, authors. Even when such sources attempt to present an idea or argument from those who disagree, they do not present it with sufficient force.
Even Google will gladly do this for you, customizing your search results by location, demographics, past searches, etc., to filter out things you may not want to see, providing a convenient, invisible bubble even if you don’t want it!
If you’re rational, there’s daily work to break the bubbles by actually looking for ways to test the beliefs you care about. The more you care about them, the more they should be tested.
Problem is, the bigger our information-sharing capabilities get, the harder it is to find quality information. Facebook propaganda posts get repeated over and over. Re-tweets. Blog reposts. Academic “science” papers that have never been replicated, but are in the news headlines everywhere. The more you actually dig into the agitprop looking for a few gems, or at least sources of interesting information, the more you realize even the questions have been framed wrongly, especially around controversial things. Without searching for high-quality evidence about a thing, I resign myself to “no opinion” until I care enough to do the work.
And now you don’t fit in anyone’s bubble. Not in politics, not in religion, not even in technical arenas where people bubble up also. Take politics … it’s not that I’m a liberal who misses the company of conservative friends, or the other way around. Like the “underground man”, I feel I actually understand the values and arguments of both sides, which leaves me wanting to tear the whole system apart and invent new ways or angles of addressing the problems.
But try to have a conversation, for example, about the trade-offs of the huge military superiority the US has created: costs and murder versus eventually conceding dominance to who knows whom, as they say—you either wear the merciless boot or live with it on your neck. Approach the topic this way, and you may be seen as a weak peacenik who dishonors our hero troops, or as a monster who gladly trades blood for oil; you’re not even understood as having no firm conclusion.
Okay, so don’t throw your pearls before swine, you say. But you know, you’re going to have to do it quite a few times just to find out where the pig-pen ends and information close to the raw sources and unbiased data begins. If you want to hear interesting new ideas from other minds, you’re going to have to accept that they are biased and often come from inside their bubble. If you want to test your own beliefs, to actively seek to disprove what you think, you will have to wade through oceans of bullshit and agitprop to find the one pearl that shifts your awareness. There is no getting around the work.
Then there are these kinds of situations: my father has also left the fundamentalist fold, but he has gone deeply into New Age mysticism instead of the more skeptical route I’ve taken. I really want to preserve our closeness and friendship. I know I can’t change his mind, but he really likes to talk about this stuff, so to stay close I should try hard to understand his perspective and ideas. But ask him to define terms like “higher consciousness”, explain experiences of “higher awareness”, or unpack his predictions about coming human “evolutionary” steps … and he falls back to “it can’t be described”, or “it’s beyond our present intelligence to grasp”, or even “beyond rational thought”. So I can artificially nod along, not understanding a damn word of it, or I can try to get some kind of hook into his ideas and totally burst his bubble without even trying. Bursting someone’s bubble is not cool. If you burst their bubble, they will cry, if only inwardly. Burst their bubble, and they will try to burst yours, not to help you but from pain.
Problem is, in trying to burst your own bubble, you end up breaking everyone else’s bubbles left and right.
There is the temptation to seek out your own bubble just for temporary comfort … just how many skeptical videos about SpiritScience or creationism or religion am I going to watch? The scale of evidence is already tipped so far that investing more time to learn details nudging it 0.0001% closer to 100% isn’t about anything other than emotional soothing. And emotional soothing is dangerous; it reinforces the bubbles I will then have to work all the harder to burst, to test, and to train myself to hold no emotional investment in any provisional belief.
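The diminishing returns described here can be made concrete with a small Bayesian sketch. The numbers and function below are my own illustration, not the author’s: each confirming observation multiplies the odds by a fixed likelihood ratio, and once the scale is tipped, further confirmations barely move the probability.

```python
# Illustrative sketch: why piling evidence onto an already-lopsided scale
# buys almost nothing. Updating is done in odds form, where each observation
# multiplies the odds by its likelihood ratio.

def update(prior, likelihood_ratio):
    """Bayes by odds: posterior odds = prior odds * likelihood ratio."""
    odds = prior / (1.0 - prior)
    odds *= likelihood_ratio
    return odds / (1.0 + odds)

p = 0.5
for n in range(1, 6):
    # Assume each observation is 10x likelier if the belief is true.
    p = update(p, 10.0)
    print(n, round(p, 6))
```

The first update moves belief from 0.5 to about 0.91, which is worth real work; the fifth moves it from 0.9999 to 0.99999, which is just the emotional soothing the paragraph above describes.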
But it is so, so tempting—when you see yet another propaganda post for the republicrips or bloodocrat gang, or vast scientific-conspiracy posts, or watch your friends and family shut down mid-conversation—to go read another Sagan book that teaches me nothing new but makes me feel good about my current provisional beliefs. It’s tempting to think about blocking friends who run a pravda outlet on Facebook, or even to shut down your Facebook account. It’s tempting to give up on family in their own bubble and artificially nod along to concepts that have no meaning.
To some extent, I am even giving in by writing this … I would like many other rationalists to feel the same way, to affirm my perspective and struggle, and that reinforces my bubble, doesn’t it? There are probably psychological limits and needs that make some minimal degree of this unavoidable. We’re compelled to eat, but if we give ourselves over to that instinct without regard or care, it will eventually kill us.
Don’t bubble, don’t give in to the temptation; keep working to burst the bubbles that accrete around you. It’s exhausting, it’s painful, and it’s the only thing keeping your eyes open to reality.
And friend, as you need it here and there, come here and I’ll agree with you about something we both already have mountains of evidence for and almost none against. ;)
A good post. By the way, a common expression for what you call bubble is “echo chamber” and yes, one of the drawbacks of the internet is that it makes constructing them oh so very easy.
Thanks for pointing that out. I’ve heard that term before, but in the course of this stream-of-consciousness rant it just didn’t show up for the party.
There is a saying:
So the only surprise about the end of the post is that you were self-aware ;)
It seems to me that cognition is labor, and as basic economics suggests, labor can be specialized and there are gains from trade to be made. We should expect there to be organizations that tell people what to think, and so the question is what standards to hold those organizations to.
My first question would actually be what are the incentives for those organizations and what is in their interest for you to think.
You mentioned gains from trade—when you receive ideas as the fruit of the specialised labour, what do you pay in exchange?
Holding an entity to standards is, of course, an example of incentives faced by that entity. I prefer the additional specificity because it points out which set of incentives can be easily modified.
In general, some combination of social, political, and economic power—it’s a trade like any other, but weighted more heavily towards the social and political side of things due to the nature of ideas relative to other goods and services.
In the context I would call that a non-central example.
By whom? Not very likely by people targeted.
No, I don’t think so. In particular, one side tends to believe it’s getting things “for free” and not view it as a trade at all. I would suggest that the framework of influence/manipulation/memetics/control is going to be more useful here than the framework of free trade.
Suppose my Facebook friends share a terrible article (the one that comes to mind is one that came out a few years ago about male and female ELO distributions in chess). I can criticize my friends for sharing the article; I can criticize the authors for writing the article; I can criticize the editors for accepting the article. This shifts the social landscape slightly.
Focusing instead on ‘publish or perish’ seems like mistaking the proper scope of my actions, as I am not the Science Czar nor am I likely to become one.
Influence is a more specialized description of these sorts of information markets, yes. But the point of that paragraph is that it is not just empirically the case that people will listen to each other and adopt opinions with minimal original seeing, but that it is obviously rational for them to do so in almost all contexts. The language and models of specialization and trade make that more clear.
Assuming you notice the article is terrible.
Humans in general are pretty good at declaring things terrible, but I agree with the point that it’s nontrivial to claim that one’s judgment of terrible corresponds to useful principles.
I’m not sure I understand your claim. It is obviously rational in almost all contexts for people to adopt other people’s opinions?
Yes, for the exact same reason as why it is obviously rational for people to trade for almost all products that they use, rather than produce them themselves. (The “that they use” qualifier is material; I don’t mean that you should buy everything on the market, but that what you consume should predominantly come from the market.)
The user’s effort is primarily in distinguishing between competitors created by specialist suppliers, not creating things themselves, and when they do create things they rarely go too many steps up the production chain from consumption. One might bake bread from flour, for example, but they likely do not mill their own flour, or grow their own wheat, or domesticate their own wheat, or invent eating, and so on.
Are you assuming starting from zero, from a tabula rasa? That’s not really the usual case. And if not, people basically have priors, which are often strong priors. It does (and it should) take a considerable amount of evidence to overcome them.
If you mean that when interested in a particular question one should google it up instead of trying to derive the whole thing from scratch, then sure. But an example I had in my head looks different:
You acquired a bit of money and decided to invest it. So, let’s ask the experts. You look around and there is a Schwab/Fidelity/blah brokerage. You go in and talk to the guy in there who is an expert and clearly knows more than you do. He recommends putting your money into their proprietary Schwab/Fidelity/blah fund. You do the “obviously rational” thing, give him the money, and leave. Is there a problem?
This is basically what I mean. I’m trying to differentiate between what I’ll call the normal range of behavior (i.e. does this person seem like a do-it-yourselfer or not relative to other people I know) and the actual range of behavior (what fraction of the productive work done on things they consume does this person do themselves). If you focus just on the normal range, it’s easy to argue that one should do-it-themselves as much as possible, without realizing that the absolute level for “as much as possible” is closer to 2% than to 100%.
And if you start looking at things with the consumer mindset, then you start thinking about how to be a savvy consumer instead of how to be a producer, and I think that’s a very useful skillset to develop.
Yes—and it looks like the same problem as buying the first car you see for sale, and paying the asking price. (I’ll note that, as much of an index fund partisan as I am, I didn’t invent those arguments—I read them, they made sense to me, and I started doing it and repeating them).
It’s very good of you to say the writing is good—glad you enjoyed it, and yes, I will write more here.
Completely agree with you that liberal vs. conservative is an overly dualistic and simplistic way to carve up political positions, but for brevity’s sake, and to keep on point, I described it that way.
Assuming everyone on this forum values the idea of testing their knowledge, not to prove or even disprove their ideas but to update probabilities: why isn’t this method, even a dumbed-down version of it, held in higher regard for progress than debate? Debate is virtually useless to the general public. We already teach the scientific method, but only as applied to the school science fair, instead of as a general method for getting to a clearer view of things.
You’re of course completely on the nose about people not having the time and energy to do the actual work on all the issues. So my advice: don’t be a moron. Say you have no opinion. Didn’t read the holy book (either your own or the enemy’s religion)? No opinion. Didn’t read the bill? No opinion. Read no articles from climate science journals? No opinion. Etc.
Because it’s not a good method for getting a clear view of things.
Except then you’re at the mercy of, at best, the people who ignore this advice, or at worst, the people who intentionally made things overly complicated in order to screw you.
For example, why can’t most people read the bill? Because the bill is extremely unnecessarily long. Why is the bill extremely unnecessarily long? The better for the lobbyists to hide all the ways they’re screwing you on behalf of their clients.
It looks like society needs some specialists who make a living interpreting these things; political journalists, maybe.
Yes, and in particular, I can form my opinion based on what others write about it rather than having to say “no opinion” if I haven’t read the bill myself.
There is nothing in the post you linked to that supports your statement that the scientific method is “not a good method for getting a clear view of things”.
(What there is: Eliezer argues for calling things “scientific beliefs” only when they are generalizations endorsed by scientific study, rather than particular statements that follow from those generalizations; and for calling things “science” only when they are publicly known. None of that has any bearing on how well, or how widely, the scientific method is effective in distinguishing truth from error.)
I don’t think Eliezer meant to say that the scientific method isn’t awesome for optimizing a truthful view of reality. If he did say that, he’s wrong. Is there a specific case you could make for why it’s not? I didn’t get that from the article you referred to.
Don’t understand your comment about having no opinion when you have no data. I’m reading it as 1) many people won’t dig for data and have strong opinions anyway and 2) obscurity can be used as a weapon to prevent you from forming an informed opinion. Does that describe your comment accurately?
For 1, I’m not sure what disadvantage you see here … okay, ignorant opinions are bountiful. So we should join the club or they’ll … what? For 2, if the alternative is to form a strong opinion without data because someone made it too much work for you to care that much, then they’ve manipulated you more than if you hold no opinion at all … what am I missing?
The scientific method has its uses, just as the court system has its uses. They both, however, rely on throwing out certain kinds of evidence. And one can’t always afford to ignore said evidence in practice.
Vote for policy on the basis of their wrong ideas.
I didn’t say one shouldn’t use any data. Simply that one doesn’t have to read the bill to form an opinion about it.
Here are some hints:
Didn’t read the holy book (either your own or the enemy’s religion). =/= having no data about it
Didn’t read the bill. =/= having no data about it
Read no articles from climate science journals. =/= having no data about it
No, but what data I do have about it is likely to be filtered.
So? The point of the article is not that one should ignore filtered evidence, but that one should adjust for the filter.
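A toy sketch of what “adjusting for the filter” can look like in Bayesian terms (the function and numbers below are my own illustration, not from the thread): a source that prints confirmations almost regardless of the truth carries a likelihood ratio near 1, so a rational reader discounts its reports heavily rather than ignoring them entirely.

```python
# Illustration: the same confirming report shifts belief a lot from an
# unfiltered source and only slightly from a heavily filtered one,
# because the filter shrinks the likelihood ratio toward 1.

def posterior(prior, p_report_if_true, p_report_if_false):
    """Posterior probability after one confirming report, given how
    likely the source is to print it under each hypothesis."""
    odds = (prior / (1.0 - prior)) * (p_report_if_true / p_report_if_false)
    return odds / (1.0 + odds)

# Unfiltered source: confirmations are 4x likelier when the claim is true.
unfiltered = posterior(0.5, 0.8, 0.2)
# Heavily filtered source: it confirms the party line almost no matter what.
filtered = posterior(0.5, 0.9, 0.8)
```

Starting from 0.5, the unfiltered report moves belief to 0.8, while the filtered one moves it only to about 0.53: filtered evidence still counts, just much less.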
I appreciate this comment for many reasons, but mostly because it throws into prominence the role of different values underlying comparisons like the top post’s.
I wish I had the kind of serene acceptance of other people that you seem to have, but I do not. I am inclined to blame people for not making time to research economic, social, and political policy options, since these things are so important. You’re right that it takes time to learn details about which policies are good and which are not, but there are many other factors besides knowledge that are relevant to sustained disagreement. For example, it’s not a matter of time investment for someone to admit it when they realize they are wrong, it’s essentially just a matter of integrity. Most people lack the humility to do this, however. This is repulsive to me, a mindset that values pretending to be right over actually figuring out how to help others. But this mindset is one I feel that most people possess. I strongly wish I believed otherwise, it’s very unpleasant for me to half-despise so many people, but it’s what my view of the facts suggests.
How many of the behaviors that repulse you (lack of humility, intellectual laziness, etc.) are driven by evolutionary adaptations to living in a social group and maintaining status and reputation in your tribe? Agreeing with the popular view in your tribe, and agreeing with tribal leaders to display loyalty, probably has some fitness advantages. I have no empirical data for that, but it’s worth considering as an alternate view, especially if you “strongly wish you believed otherwise” … humility and integrity may not get a chance to step up if higher-priority instincts kick in to produce these effects.
I find LW to be a much healthier medium for ideology than sites like Reddit and 4chan. It’s all right not to constantly burst people’s bubbles, or your own, especially if you are a regular on LW. People are not the ideology they spew, so it is important to de-emphasize non-constructive ideological conversations with people you want to, or must, interact with frequently. I will, however, echo your struggle not to “bubble”, reaffirming your already-held beliefs.
You should read The Big Sort by Bill Bishop; he talks about how in America we are literally, physically moving to areas that favor our political and social ideas. This makes local control easy and national control impossible.
Uhh, why not just accept that you aren’t and can never be perfectly rational, and use those facts in positive ways?
Bubbles are psychologically comforting and help generate communities. Rationalist bubbling (which ironically includes the idea that they don’t bubble) probably does more to build the community and correct other wrong beliefs than almost anything else.
Until and unless rationalists take over society, the best strategy is probably just to push for a bubble that actively encourages breaking other (non-rationalist) bubbles.
“Religion and political identity were the languages of love in my house.”
I think this is an important thing to remember. People stay in their bubbles because bubbles are nice, and pleasant. Challenging cherished beliefs? That hurts. I think this is a well-enough documented concept on this site that I don’t need to link to sources.
I definitely spend time in my own echo chambers, but I’ve left several bubbles behind. Now that I’m on the outside looking in, it’s easy to get frustrated with people who cling to comfort. It’s easy to forget how good it feels to feel right. We get fuzzy feelings and real social benefits from being around people who agree with us. Frustration with them can lead to treating them unkindly, or to being abrasive when we present our own reasoning. It’s also well documented that when confronted head-on with a hostile viewpoint, people double down on their original position even harder, evidence be damned.
I can’t for the life of me find it, but there was an interesting article (pamphlet?) that talked about changing minds, using climate change as an example. People responded best (if I recall correctly) to articles that didn’t immediately present a list of for/against arguments. They simply laid out the facts in a cheerful, non-aggressive tone, with references to “common misconceptions” sprinkled throughout. I wish I could find the article.
I like that you don’t ignore the social benefit of these bubbles, especially in your last line. Fuzzy feelings are important for animals like us, and all the better if you can get that affirmation from trusted sources. At the very least, you can seek out people who have done the best they can to be unbiased. People who want to believe in truth, and then, because it feels so good, agree with each other about it. ;)