Fundamental Attribution Error and The Other
I.
The fundamental attribution error is a well-known psychological phenomenon wherein people tend to ascribe external circumstances to their own failures and internal motivations to the failures of others. It has a variety of contributing factors and is a well-studied concept; however, I intend to focus on one aspect of this common problem. One explanation for the FAE is that it results from the simple truth that we know our own situations better than those of others, and we fail to take into account the uncertainty in predicting the internal state of another person. This attempt to distill certainty out of uncertain information is a major flaw in many kinds of thinking, and it has predictable consequences.
When we try to understand the internal motivations of other people and weigh them heavily in comparison to circumstances, we often fit people into our narratives about the world in general. This is similar to ideas presented by Tetlock: those who attribute evidence to sweeping, large-scale theories also tend to be more certain than they ought to be given the evidence available. Whether individuals are trying to understand each other or trying to predict the world, both can fall prey to the tendency to underestimate uncertainty and fill the gap between evidence and confidence with broad theories. In reality, the things we are trying to understand are often complex and strongly driven by factors outside of our model. This isn't to say that having a model is bad, but that for a model to be good it must include an acceptance and awareness of the inherent uncertainty in the problem domain.
II.
However, there is another instance not yet mentioned that follows a similar pattern. If we are to say that interpreting the motivations of an individual is a difficult and uncertain endeavor, mustn't we also say the same of groups of people, at least to a certain extent? Despite this, there is a strong tendency to explain the behavior of groups with great confidence, using highly simplistic models of their actions and motivations.
From a Cracked article about the Russian version of Who Wants to Be a Millionaire:
In America, the audience viewed the million-dollar prize as a reason to congratulate a fellow human on accomplishing something great. In Russia, the audience viewed the million-ruble prize as a reason to mourn the fact that there's one less person to share the rampant poverty and despair with.
That’s harsh, Russia.
This article takes the behavior of the audience during a game show and states that Russian audiences are less enthusiastic because of deep-seated jealousy stemming from collectivism and a communist heritage. While I won't say this is impossible, it at least seems far from the simplest explanation, and it relies more on preconceptions about Russian culture than on any thorough analysis of motivations.
Even if such an analysis were carried out, to what extent would it be predictive, given the subtleties and complexities involved in group behavior? If I were asked to predict the behavior of a Russian Who Wants to Be a Millionaire audience, I might posit this as one outcome, but my confidence would only be in the high 50s at most. That is to say, this outcome is not one that obviously and necessarily follows from some deep-seated Russian cultural values. Russia's history could just as easily be used to predict that audiences will be more enthusiastic, because collectivist societies have a greater understanding of the role luck plays in success, or because their historically lower socioeconomic status will make them value the rewards more. Maybe Russian audiences are just less publicly emotional, or are less used to the role they are expected to play as a game show audience. Do most game show viewers at home stand up and cheer at the end? The actions of the American audience might owe as much to the traditions of game show behavior as to an expression of individualist values. At its heart, the article is trying to use simple post hoc reasoning to confidently explain a complex real-world scenario, resulting in a harshly critical judgment about the values of the Russian populace.
This is not just one instance. I have heard similar arguments claiming that businesses in China are more cutthroat because of their Communist history and the lack of a Judeo-Christian moral tradition. Similarly, I have seen attempts to explain the behaviors of Americans as entirely the result of capitalism, and the broader tendency of philosophies like Marxism to explain nearly all human action as a result of capitalism. The core problem with all of these attempts is not necessarily that their premises are flawed, but the high level of certainty with which they give their explanations. It's fine to say that China is collectivist and therefore likely to have a business environment less friendly to entrepreneurs, but in doing so one must also consider the thousands of other factors at play in determining the exact business climate of China. After that consideration, maybe you still come out right, but you should definitely come out less confident. Overconfidence is the bane of accuracy, and complexity should make us less confident, not more so.
III.
This tendency to be overconfident in our understanding of the other, be it other people on the street or other nations across the world, has real consequences for how we deal with the world outside ourselves. If the expressions of the audience on a Russian game show are enough for you to condemn a whole people, you will find it easy to discount the ethics of any group you happen to dislike. When we ascribe these characteristics to other people, we grow the gap between them and us ever larger. Liberals support gun control because they all want to control the undesirables. Conservatives support genetic research because they are racist xenophobes. These traits we proudly attribute to other groups in their entirety do nothing but misrepresent the truth and destroy our ability to understand each other rather than destroy each other. People are complicated, be they singular or plural, and to forget that is to see a world in which there can be no coordination, and where Moloch reigns supreme.
I’m still primed by reading The Elephant in the Brain, so I want to put forward the straightforward Hansonian hypothesis that the fundamental attribution error is mostly a social strategy as opposed to mostly a cognitive error: if your goal is to make yourself look good and others look bad, of course you want to explain your successes and their failures as the result of intrinsic facts, which maximally increases your ally value and maximally decreases theirs. And you want to explain your failures and their successes as the result of extrinsic factors, which minimally decreases your ally value and minimally increases theirs.
(Keeping in mind that social strategies are mostly unconscious, etc.; I'm not trying to say that people are explicitly thinking like this.)
I think I mostly agree, except to say that I think something can be both a social strategy and a cognitive error. The question is whether that social strategy actually leads to the outcome we want; if not, then I think it's fair to call it a cognitive error. Arguably, all cognitive errors are social (or just general) strategies under certain conditions.
Well, the question of what “we want” means is also quite tricky, though.
I quite enjoyed this! I also fixed some formatting issues for you (your paragraphs were separated only by single-line breaks, as opposed to paragraph breaks).
Thank you! This is my first real piece of for-fun analytical writing. I pulled this from my Medium, so it looks like I need to check for errors more carefully next time.