[SEQ RERUN] A Failed Just-So Story
Today’s post, A Failed Just-So Story, was originally published on 05 January 2008. A summary (taken from the LW wiki):
Part of the reason professional evolutionary biologists dislike just-so stories is that many of them are simply wrong.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we’ll be going through Eliezer Yudkowsky’s old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Rational vs Scientific Ev-Psych, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day’s sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
This selection pressure only exists if religion is already universal in the society. If Ugg, from three huts over, says that everyone in the tribe must believe in his imaginary friend or else he will kill them, the selection that actually happens will work against Ugg. If religion arose essentially as an accident (say, as a result of anthropomorphizing nature), then EY’s proposed selection mechanism could cement it. But if the first impetus towards religion was an evolutionary adaptation, then we need some other rationale to explain it.
Does anyone know of any alternative hypotheses for the rise of religion?
I’m not sure if this is actually true. The idea of religion is appealing enough that Ugg may be able to sell it to them. After all, Christian missionaries often succeed in displacing an old religion.
Besides, is the naïve argument really incorrect? The reason group selection fails is that individual selection is stronger. But if, by chance, religion grows to dominance in a tribe, then becoming irreligious is no longer an individual fitness gain. The two selection pressures point in the same direction.
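To make that frequency-dependence concrete, here is a minimal replicator-dynamics sketch (a toy model of my own, not anything from the post, with entirely arbitrary parameter values): believers pay a fixed cost of belief, while non-believers are punished in proportion to the fraction of believers around them.

```python
# Toy replicator-dynamics sketch (illustration only; cost and punishment
# values are arbitrary assumptions, not figures from the post).
# Believers pay a fixed cost; non-believers are punished by believing
# neighbours, so their penalty grows with the believing fraction p.

def step(p, cost=0.05, punishment=0.2):
    """One generation of replicator dynamics for the believing fraction p."""
    w_religious = 1.0 - cost              # fixed cost of belief
    w_irreligious = 1.0 - punishment * p  # punished in proportion to believers
    mean_w = p * w_religious + (1 - p) * w_irreligious
    return p * w_religious / mean_w       # share of next generation that believes

for start in (0.1, 0.5, 0.9):
    p = start
    for _ in range(200):
        p = step(p)
    print(f"start={start:.1f} -> believing fraction after 200 generations = {p:.3f}")
```

With these made-up numbers the unstable threshold sits at cost/punishment = 0.25: a belief that starts rare dies out, while one that is already common goes to fixation. That is the point of the comment above: once religion dominates a tribe, individual selection points in the same direction as the group-level story.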
The question is why human brains are wired such that this is the case.
The just-so story I tell (while stating up front that it is a just-so story) is this:
Assuming this is true (I don’t know how to test for it), then once the basic ideas are in place, I think they’d be subject to memetic selection pressures that include “the person telling the stories gains status for being a good storyteller”, which means people trust them more, among other related effects.
The Machiavellian intelligence hypothesis suggests that this is entirely the reason for our species’ level of intelligence. Hence chemistry professors use expressions like “electrons want to get to a lower energy state”, as if electrons had desires of their own. That sentence makes a lot more intuitive sense to a human brain than “electrons just sort of always do this. Because of math.”
Most people (especially in very simple social environments, which is where this whole thing started) hold opinions for approximately good reasons. This is less true for very complex or infrequently occurring issues.
Nevertheless, individual opinions do constitute a certain degree of evidence. It strikes me as very likely that a mechanism for accepting second-hand, unverifiable information would provide a fairly substantial evolutionary benefit to a hunter-gatherer. For that matter, it provides some benefit to us today.
Religion is also a good source of individual motivation, if you don’t happen to have an elaborated system of reasoning and ethics and metaethics handy. People tell stories about events (not all of them religious in nature) to contextualize and predict them, though that doesn’t in and of itself explain their tendency to spread.
As MinibearRex points out, Eliezer’s “explanation” of how religion formed fails to actually explain anything. Heck, the “failed” just-so story is a better explanation for how religion formed than the one Eliezer proposes in its place. I think this is a case of Eliezer being mind-killed by his hatred of religion, and thus choosing explanations based on how bad a light they cast religion in rather than on any measure of their plausibility.