Yes, Rationalism is a Cult

(I realize many people here are part of this cult, and it may be upsetting to hear that people consider it a cult. But I figure “No, Rationalism is Not a Cult” needs a response.)

My definition of ‘cult’ is close to

An ideology that employs anti-epistemology to convince you to support it.

The hard part of this definition comes with the word “anti-epistemology”. If a university math department teaches the ZFC set theory axioms, and asks its professors to use these axioms for formal proof verification rather than Type Theory (or another ideology), is it promoting anti-epistemology? To a constructivist, it seems to be promoting obviously bad beliefs, like the axiom of infinity, which leads to massive confusion around things like fractals, self-reference, and the universe at large[1].

The difficulty with declaring it anti-epistemic is that you cannot prove a system consistent from within the system. The best you can do is find an inconsistency. As far as we know, the ZFC ideology is internally consistent, so even if another ideology claims it promotes obviously bad beliefs, all we learn is the two ideologies are inconsistent with each other.

Most people find it difficult to hold inconsistent beliefs once they are aware of them, so most people would not be a ZFC set theorist and a constructivist at the same time. This is good, because it means ideologies that claim to have real-world consequences are being constantly tested when their members bump up against the real world. If a Young Earth Creationist learns about radioisotope dating, they may transition to a more modern version of Christianity. Of course, some of them just bite the bullet, but enough will reject the Young Earth that the ideology will not grow as fast as others (or will even shrink), and the landscape will soon be dominated by ideologies that mesh better with the real world[2].

As ideologies get bigger, it becomes more likely someone will discover an inconsistent belief, so the biggest ideologies have adapted to

  1. become more internally consistent, and

  2. make it less likely an inconsistency is brought to their believers’ awareness.

This second adaptation is what constitutes “anti-epistemology”, and makes an ideology a cult. If mathematicians discovered a contradiction in the ZFC axioms, it wouldn’t become “metaphorical” and a “still useful belief, even if not literally true”. It would get replaced, just like the set theory of the 1800s. Physicists are slightly more culty, and it has been said that the field advances one funeral at a time (Planck’s principle), but they are usually pretty good about spreading awareness when an inconsistency is found. They are not hiding the inconsistency between general relativity and quantum mechanics from their own members, let alone the public.

Mormonism is staunchly on the other side of the cultiness scale. It has about as many true-believing members as the physicists, but retains them mostly by keeping them unaware of inconsistencies. Do their scriptures say that polygamy is an everlasting covenant that everyone must follow to get to the top-tier heaven? Why yes, but their teachings gloss over this, maybe with a phrase about it being temporary or necessary for the times. The members who notice a little confusion are met with apologetics that somehow interpret the clause as its own negation. But of course, very few people get to that point, as they’re told that the internet is a very bad, anti-Mormon source of knowledge, and that the proper channels for resolving questions are prayer, the scriptures (just not that passage), and of course, church leaders. Most of the adaptations Mormonism makes—in how it teaches and what it teaches—are to make its members less likely to come across an inconsistency, and if they do, not realize it is an inconsistency.

Rationalism does this too, just not to the same degree. Here are a couple of examples of adaptations the ideology has made that have the effect of hiding inconsistencies:

  1. LessWrong has a voting system where more established users get more votes. This is good, because you can build a reputation system, but also bad because part of that reputation is how closely you align with the ideology. Some ideas, like selfish egoism, just disappear in the downvotes, while others, like effective accelerationism, are mostly addressed in apologetics. Most LessWrongers are not even aware that sum-utilitarianism is not well-defined, and while I could steelman AI slowdown, I hardly know the arguments for e/acc.

  2. Effective altruism is always introduced by drowning a child. Smarter and more honest inductees will report that, “I care about having nice things for myself, and then my friends and family, much more than a random child,” leave the room muttering, “cult,” and not show up to the next EA reading club. Less critical people will bite the bullet and say, “I guess I shouldn’t care about distance,” and less analytical people will mostly be emotionally distraught about the drowning children everywhere. There are also people who genuinely prefer others’ well-being over a marginal increase in their own—mostly wealthy or ascetic folks—and I think this is the target audience of EA evangelism. However, a lot of people do get caught by Singer’s thought experiment and don’t recognize the inconsistency with their previously held beliefs.

While these adaptations have the effect of hiding some inconsistencies, this seems to be mostly a side effect of otherwise useful adaptations. The LessWrong voting system is superior to others, even if it creates ideological cementation. Singer’s thought experiment is an easy introduction to effective altruism, even if it will convince some people to donate money in a manner inconsistent with their own preferences. Rationalism is a cult, but not intentionally[3].

Edit: As a couple of comments pointed out, lots of groups employ anti-epistemology to get you to support them. Why should I care more when Rationalism does this than when Safeway does? It’s the degree of impact. If Safeway doesn’t actually have the best deals, my friends or I may lose $20. If earn-to-give and 80,000 Hours aren’t actually consistent with my or my friends’ preferences, we may lose 20% of our wealth. Or, ideologically: Safeway’s claims are relegated to a small, peripheral thought you may have the next time you go grocery shopping, while Rationalism’s will dominate your decision-making process.


  1. Spacetime is a smooth manifold? How did you smooth it out? ↩︎

  2. From a solipsistic perspective, it’s a little tricky to define the “real world”, but let’s just go with the distribution of logically possible universes that your thoughts could arise from, weighted by the Kolmogorov complexity to reach your thoughts (e.g. the standard model + evolution takes fewer bits to describe a path to your thoughts than a Boltzmann brain). ↩︎

  3. The stimuli that created the culty adaptations weren’t, “we’re bleeding members, what holes do we need to plug?” but tamer things like, “how do we increase the quality of posts people see?” or “how do we introduce this unintuitive idea to people who aren’t philosophy majors?” It makes sense to assign intent based on the problem being solved, so Rationalism is not intentionally culty, while Mormonism certainly is. ↩︎