Every Cause Wants To Be A Cult
Cade Metz at The Register recently alleged that a secret mailing list of Wikipedia’s top administrators has become obsessed with banning all critics and possible critics of Wikipedia.1 Including banning a productive user when one administrator—solely because of the productivity—became convinced that the user was a spy sent by Wikipedia Review. And that the top people at Wikipedia closed ranks to defend their own.
Is there some deep moral flaw in seeking to systematize the world’s knowledge, of the sort that would lead pursuers of that Cause into madness? Perhaps only people with innately totalitarian tendencies would try to become the world’s authority on everything—
Correspondence bias alert! If the allegations about Wikipedia are true, they’re explained by ordinary human nature, not by extraordinary human nature.
The ingroup-outgroup dichotomy is part of ordinary human nature. So are happy death spirals and spirals of hate. A Noble Cause doesn’t need a deep hidden flaw for its adherents to form a cultish in-group. It is sufficient that the adherents be human. Everything else follows naturally, decay by default, like food spoiling in a refrigerator after the electricity goes off.
In the same sense that every thermal differential wants to equalize itself, and every computer program wants to become a collection of ad-hoc patches, every Cause wants to be a cult. It’s a high-entropy state into which the system trends, an attractor in human psychology. It may have nothing to do with whether the Cause is truly Noble. You might think that a Good Cause would rub off its goodness on every aspect of the people associated with it—that the Cause’s followers would also be less susceptible to status games, ingroup-outgroup bias, affective spirals, leader-gods. But believing one true idea won’t switch off the halo effect. A noble cause won’t make its adherents something other than human. There are plenty of bad ideas that can do plenty of damage—but that’s not necessarily what’s going on.
Every group of people with an unusual goal—good, bad, or silly—will trend toward the cult attractor unless they make a constant effort to resist it. You can keep your house cooler than the outdoors, but you have to run the air conditioner constantly, and as soon as you turn off the electricity—give up the fight against entropy—things will go back to “normal.”
On one notable occasion there was a group that went semicultish whose rallying cry was “Rationality! Reason! Objective reality!”2 Labeling the Great Idea “rationality” won’t protect you any more than putting up a sign over your house that says “Cold!” You still have to run the air conditioner—expend the required energy per unit time to reverse the natural slide into cultishness. Worshipping rationality won’t make you sane any more than worshipping gravity enables you to fly. You can’t talk to thermodynamics and you can’t pray to probability theory. You can use it, but not join it as an in-group.
Cultishness is quantitative, not qualitative. The question is not, “Cultish, yes or no?” but, “How much cultishness and where?” Even in Science, which is the archetypal Genuinely Truly Noble Cause, we can readily point to the current frontiers of the war against cult-entropy, where the current battle line creeps forward and back. Are journals more likely to accept articles with a well-known authorial byline, or from an unknown author from a well-known institution, compared to an unknown author from an unknown institution? How much belief is due to authority and how much is from the experiment? Which journals are using blinded reviewers, and how effective is blinded reviewing?
I cite this example, rather than the standard vague accusations of “scientists aren’t open to new ideas,” because it shows a battle line—a place where human psychology is being actively driven back, where accumulated cult-entropy is being pumped out. (Of course, this requires emitting some waste heat.)
This essay is not a catalog of techniques for actively pumping against cultishness. I’ve described some such techniques before, and I’ll discuss more later. Here I just want to point out that the worthiness of the Cause does not mean you can spend any less effort in resisting the cult attractor. And that if you can point to current battle lines, it does not mean you confess your Noble Cause unworthy. You might think that if the question were, “Cultish, yes or no?” that you were obliged to answer, “No,” or else betray your beloved Cause. But that is like thinking that you should divide engines into “perfectly efficient” and “inefficient,” instead of measuring waste.
Contrariwise, if you believe that it was the Inherent Impurity of those Foolish Other Causes that made them go wrong, if you laugh at the folly of “cult victims,” if you think that cults are led and populated by mutants, then you will not expend the necessary effort to pump against entropy—to resist being human.
1See “Secret Mailing List Rocks Wikipedia” (http://www.theregister.co.uk/2007/12/04/wikipedia_secret_mailing) and “Wikipedia Black Helicopters Circle Utah’s Traverse Mountain” (http://www.theregister.co.uk/2007/12/06/wikipedia_and_overstock).
2See “Guardians of the Truth” (http://lesswrong.com/lw/lz/guardians_of_the_truth) and “Guardians of Ayn Rand” (http://lesswrong.com/lw/m1/guardians_of_ayn_rand).
“Resist against being human” is an interesting choice of words. Surely, most people would not see that as a goal worth pursuing.
I am annoyed at many things that are part of being human. In no particular order, mortality, enjoying unhealthy foods, having to exercise to be fit, getting scared of things I know to be safe, becoming confrontational when someone is trying to help me with deeply held but wrong beliefs, poor intelligence and especially memory (other people say otherwise), pathetic mathematical abilities (takes longer than the blink of an eye to divide two 100 digit numbers), spending 1⁄3 of my time asleep, inability to communicate at more than about 0.005 kB/s, bleeding, requiring an environment with a narrow range of acceptable temperatures, atmospheres, and gravity, and I’m pretty sure I could fill up several pages with things I don’t like about being human.
What do you consider to be the “normal” level of intelligence/memory/communication bitrate? Why?
If you’re going to concern yourself with popularity contests, you might as well abandon this entire field of endeavor right now. What “most people see” is utterly irrelevant.
And yet it’s a true observation, and entirely relevant if you’re going to concern yourself with convincing other people to resist against being human.
I wonder if a Randian will pop up to deny any assertions of cultishness.
I liked your bit about science. I get tired of people saying “Science is a religion too” or some variant thereof, whether from Christians or global warming skeptics like Arnold Kling.
Science very much isn’t a religion. (At least, it’s not supposed to be. The whole initial point of the system was to get thinking divorced from religious attachment to old ideas so progress would be quicker.)
But there are very much people for whom it has become their religion. Just listen. Any time you hear somebody talking about “the science” as though the mere fact that scientists have said something makes it true, that’s religious thinking.
And it pops up all over the place. The climate change debate is a perfect example. Doesn’t matter which side you agree with normally, the mere fact that politicians and the public talk about “the consensus” and “the science” as though the universe gives a crap about what “the majority” of scientists think should worry you. Especially when you dig into it further and find that the first surveys of “scientists” done to establish it as the “consensus” view consisted primarily of researchers in other fields who you wouldn’t expect to know much more about the subject than the average guy on the street. But once “the consensus” is rolling, it’s darned hard to stop.
Is the “consensus” view correct? Hard to say. It definitely could be, but the way everyone has started shouting down all counterevidence (because “consensus”) makes it hard to tell.
It’s the same political and religious mistakes mankind has been making since the beginning, dressed up in the fancy, new suit of “science”.
Only that’s even worse, because now you have a religion which has been ripped loose from the last 2000 years worth of studying human nature that the world’s major religions had. And worse, hostile to anything and everything religious philosophers have ever learned. A new religion that not only throws the baby out with the bath water, but does so on purpose merely because it wasn’t theirs.
And every time someone insists that “science isn’t a religion” without making sure both sides of the conversation are talking about the same thing, they’re just feeding the beast and making it stronger. If we’re not careful, we’ll end up with a theocracy of “scientific management” with “experts” taking the place of priests, prophets, and gods all at once.
If Randians do pop up, the lady doth protest too much methinks.
This is a great post. Especially since it applies just as much to the cause of “overcoming bias.”
whose rallying cry was “Rationality! Reason! Objective reality!”
Not to disagree with your main point (I’ve seen cultishness even in mathematics, where we really do have objective reality), but aren’t those cults whose banner is Rationality in a better position than those whose banner isn’t? They may be just as cultish on the inside, but they have publicly accepted a standard that makes them vulnerable to criticism they cannot just dismiss. Wouldn’t that make them a bit more honest?
Same point: are we more honest at overcoming bias, because we have a type of discourse that leaves us vulnerable to arguments of bias in ways we can’t ignore—or do we just become more skilled at rationalising?
That’s a question that everybody here needs to ask themselves every time they post, if they’re to fight the good fight against cult-entropy.
TGGP: Not to defend global warming denialism, but is that the entirety of your evidence that Kling is a denier? Because when I read that, I strongly got the impression that he was not railing against people who believe that anthropogenic global warming is occurring, but rather against people preaching dogmatic versions of a nuanced science. I didn’t get the impression that he was saying that global warming scientists are preaching a religion, but that global warming activists are, and I think that’s completely reasonable. I mean, there’s a difference between calling out science and calling out activists: One thing to note is that Al Gore has been on the global warming beat since BEFORE there was a scientific consensus about it (at least as far as he tells it). I don’t want to go off on Al Gore too much, but that’s certainly the sign of a dogmatist (that is, believing something to be true before the world’s experts on the subject had come to a consensus about it). There were basically two dogmas on the issue, and if you picked randomly you’d have a 50% chance of being vindicated.
And I hate to have to reiterate this, but I’m afraid of being lumped in with global warming deniers because I am defending someone who is perceived to be one; to be clear, I do NOT find the denialist position compelling. I do however think that Kling makes a good point there (and a similar point to the one in this blog post, I might add) that it is important to convey how you know what you know. It might be reading too far into it, but I would say that that circles back to the point made in this post about cultishness: it’s easy to say that something “good” like trying to prevent climate disasters isn’t going to have those cultish aspects of attempting to suppress dissent and form in-group mentality, and it is important (if you are interested in overcoming your biases) to work against this by quantifying how big of a cultish presence you have in your “good” cause.
Just to point it out, even the term “denialist” was designed to be a loaded word that biases everyone who hears it against the position. Which doesn’t make them any more or less likely to be correct, but it does let you know that the whole debate has gone political and scoring points against the opposing side has become more important than finding the truth.
Which doesn’t actually add any evidence to either side being correct, because the universe doesn’t really care about what we think, but it does tell you to watch out, because the mainstream voices have already picked which side they want to be correct and are ruthlessly filtering the “evidence” to eliminate all dissent. Perhaps the dissenters really don’t have a point, but if they did they’d be shouted down long before they could make it.
I found the main points of the article interesting and fairly convincing, but you seem to over-correct for correspondence bias when you say “If the allegations about Wikipedia are true, they’re explained by ordinary human nature, not by extraordinary human nature.” Even if normal human behaviour leads to cultishness, why assume that individual psychological quirks didn’t have a relevant effect in a specific case?
It’s certainly possible to overcompensate for the fundamental attribution error.
This is what I think happens when people say things like “Stalin was just a product of his circumstances.” No, he was a manipulative, sadistic psychopath; his circumstances are what made him a world leader and mass murderer instead of a corrupt banker or serial killer.
But in this case, I do think that the admins of Wikipedia are humans of at least normal—if not in fact above-average—moral character, falling prey to their circumstances. Their behavior does not seem SO extreme, SO cruel, that it can’t be fit with what we know about normal human beings.
TGGP, Kling wasn’t calling science a religion, he was calling the anti-global-warming movement a religion, which strikes me as true regardless of whether the skeptics are right about the science (I think they’re not).
So can we learn to recognize the sound of a “cult cooler”, cooling down the cultishness, and distinguish it from a fake recording of such? Or at least invent a cultometer, so we can check our cultempature?
There is a http://en.wikipedia.org/wiki/Crackpot_index
A cultometer is surely not too far off:
http://www.tariqnelson.com/2007/02/signs-you-might-be-in-a-cult/
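As a toy illustration of what such a cultometer might look like (nothing from the linked checklist; the warning signs and weights below are placeholders I made up), it could be a crackpot-index-style weighted checklist, which at least fits the post’s point that cultishness is a matter of degree rather than a yes/no verdict:

```python
# A toy "cultometer": a crackpot-index-style weighted checklist. The warning signs
# and weights here are illustrative placeholders, not taken from either linked page.

CULT_CHECKLIST = {
    "leader_is_never_wrong": 3,
    "criticism_treated_as_betrayal": 3,
    "outsiders_assumed_hostile": 2,
    "leaving_carries_social_penalty": 2,
    "jargon_substitutes_for_argument": 1,
}

def cultometer(observed_signs):
    """Return (score, max_score) for a set of observed warning-sign names."""
    score = sum(weight for sign, weight in CULT_CHECKLIST.items() if sign in observed_signs)
    return score, sum(CULT_CHECKLIST.values())

print(cultometer({"criticism_treated_as_betrayal", "outsiders_assumed_hostile"}))  # (5, 11)
```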
Your second link is broken. In addition to the Internet archive I have posted a blog post inspired by some of my experiences with a cult, containing the article in its entirety for posterity.
I singled out Kling because I could remember several occasions where he used the phrase “the religion of global warming” or something like it and just linked to that post after some quick googling. Perhaps my memory is bad and he hasn’t used it that many times though.
In the same sense that every thermal differential wants to equalize itself, and every computer program wants to become a collection of ad-hoc patches, every Cause wants to be a cult. It’s a high-entropy state into which the system trends, an attractor in human psychology.
For this to be strictly true, there would have to be more cultish microstates, more possible cultish groups, than sane ones. Do you think that’s actually the case?
Aren’t those cults whose banner is Rationality in a better position than those who aren’t? They may be just as cultish on the inside, but they have publicly accepted a standard that makes then vulnerable to criticism they cannot just dismiss.
They can still dismiss it by redefining “rationality” to exclude the methods the attacker is using.
That thought occurred to me too, and then I decided that EY was using “entropy” as “the state to which everything naturally tends.” But after all, I think it’s possible to usefully extend the metaphor.
There are more possible cultish microstates than non-cultish ones, because there are fewer logically consistent explanations for a phenomenon than logically inconsistent ones. In each non-cultish group, rational argument and counter-argument should naturally push the group toward the one explanation that describes observed reality; cultish groups, by contrast, can fill up the rest of concept-space.
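For what it’s worth, the statistical-mechanics reading can be made explicit (this is my own gloss, not anything stated in the post or the parent comment): Boltzmann’s formula counts microstates, so taking “high-entropy state” literally is exactly the claim that cultish group-configurations vastly outnumber sane ones, and a group drifting at random through configuration space will spend most of its time in the larger region unless work is continuously expended to pump it back out.

$$S = k_B \ln \Omega, \qquad S_{\text{cultish}} > S_{\text{sane}} \iff \Omega_{\text{cultish}} > \Omega_{\text{sane}}.$$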
Nominull hit the nail on the head, Eliezer. What are the human qualities worth amplifying, and what are the human qualities worth suppressing?
For myself, “cultishness” is definitely a human group failure mode.
To others, maybe “cultishness” is a comfortable state of being, like relaxing in a warm bath. (Partake in the vivid imagery of a group of nude people in a drowsy state soaking in their collective body-temperature urea solution...)
I assert that the choice of what elements of humanity are worthy, and what are unworthy, is completely personal and subjective. I would be interested in seeing the argument for the differentiation being objective. Are there objective criteria for which elements of humanity are worthy and which are unworthy?
A different point: You really demonstrate the value of blogging and independently developing a stable of ideas, and then being able to reference those ideas with terminology backed up by a hyperlink. I am constantly rereading your posts as you link back to them, and it is interesting and profitable.
We really need to figure out how to create more cultishness. If you could build a cult around known science, which happily describes everything in human experience, and spread it, you’d do more good in the world than “rationality” or “overcoming bias” ever could.
No. Part of the definition of a cult is an unquestionable dogma, which runs counter to the core ideas of science. Building a cult around known science (even if you understand the principles well enough to avoid engaging in cargo cult science) is going to slow progress.
Consider replacing “core ideas of science” with “core ideas of society” and I’ll wager that’s closer to the commonly-used meaning of “cult”.
Dropping in mid thread, but I think you parsed that differently than intended; I read it as saying that the notion of unquestionable dogma runs counter to the core ideas of science, not that the dogma itself must run counter to anything in order to be a cult.
Ah. Yeah, I may have parsed that one incorrectly, now that you mention it. Thanks for pointing that out.
“So can we learn to recognize the sound of a “cult cooler”, cooling down the cultishness, and distinguish it from a fake recording of such? Or at least invent a cultometer, so we can check our cultempature?”
We could just ask our perfect Bayesian leaders. They know all and understand all.
Eliezer’s reminder that even rationalists are human, and so are subject to human failings such as turning a community into a cult, is welcome. But it’s a big mistake to dismiss explanations such as “Perhaps only people with innately totalitarian tendencies would try to become the world’s authority on everything.” There is a huge degree of heterogeneity across people in every relevant metric, including a tendency toward totalitarianism. I can’t imagine that anyone disputes this. And if the selection process for being in a certain position tends to advantage people with those tendencies, so that they are selected into them, that might well explain a large part of how people in those positions behave.
Or at least invent a cultometer, so we can check our cultempature?
It’s a bad sign if we develop identifiable cliques. Because of general attitudes it stands to reason agreements and disagreements won’t be randomly distributed, but ideally we shouldn’t “agree” or “disagree” with others because we agreed or disagreed with them in the past. It probably wouldn’t be too hard to develop some sort of voting software that measured cliquishness if there’s a demand for it.
Of course, the real disaster would be if people start saying things like “Eliezer is always right”. Nobody is always right.
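As a rough sketch of what such cliquishness-measuring voting software could compute (my own illustration; nothing like this exists on the site, and every function and field name below is hypothetical): given a log of who voted on whose comments, measure how predictable each voter’s up/down pattern is from authorship alone. A high score is only a prompt to look closer, not a verdict, since people legitimately agree more with authors they have found reliable.

```python
from collections import defaultdict

def cliquishness(votes):
    """votes: iterable of (voter, author, value) tuples with value in {+1, -1}.

    Returns a dict mapping each voter to a score in [0, 1]: the average
    absolute lean of that voter toward or against individual authors.
    0 means the voter's votes look independent of authorship; 1 means
    the voter always votes the same way on a given author's comments."""
    per_pair = defaultdict(list)
    for voter, author, value in votes:
        per_pair[(voter, author)].append(value)

    leans = defaultdict(list)
    for (voter, author), vals in per_pair.items():
        if len(vals) >= 5:  # skip voter/author pairs with too little data
            leans[voter].append(abs(sum(vals)) / len(vals))

    return {voter: sum(ls) / len(ls) for voter, ls in leans.items()}

# Toy data: voter "a" always upvotes author "b", votes randomly on author "c".
toy = [("a", "b", +1)] * 10 + [("a", "c", +1), ("a", "c", -1)] * 5
print(cliquishness(toy))  # {'a': 0.5} -- lean 1.0 toward b, 0.0 toward c, averaged
```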
“Of course, the real disaster would be if people start saying things like ‘Eliezer is always right’. Nobody is always right.”
George,
That understates the risk, since self-identified rationalists familiar with the literature would concoct much better rationalizations. For instance, someone might say that “Nick Bostrom is very intelligent, actively works to overcome biases, and seems to have been relatively successful at it. Since almost all top academics are not immersed in the heuristics and biases literature and committed to Overcoming Bias, in a sustained dispute between top academics and Nick Bostrom we should expect the latter to be right much more often than a random high quality academic dissenter,” but then treat this as license to accept Bostrom’s positions on an improbable number of independent disagreements.
And even if Bostrom is always right, repeating back what Bostrom says may not mean that you have acquired any of Bostrom’s beliefs. Works great for ingroup identification though.
I strongly endorse this post. I’ve actually watched it happen: Groups dedicated to secularism or the Singularity or even rationality itself can degrade into cliques, evolve into tribes, and then ultimately become as much cults as their greatest foes.
It’s interesting to stumble across old references to authors whose names you only recognize now, but didn’t at the time. Cade Metz, huh? I wonder what he’s been up to lately!
For others who also haven’t heard of Cade Metz: he seems to be a news reporter (for lack of a better word) writing mostly about AI. See https://www.nytimes.com/by/cade-metz.
I feel confused about the distinction between ingroup behavior and cult-like behavior. It makes sense to me that groups would, as a default, regress towards ingroup behavior without a force pulling them away from it. But it doesn’t seem accurate to say that they will naturally move towards cult-like behavior.
Maybe cult-like behavior is similar to ingroup behavior. We only label things as cults when the behavior gets extreme enough/far enough along the spectrum. And so, maybe it is accurate to say that groups naturally move towards cult-like behavior.
But even so, the equilibrium point surely isn’t anything close to an “actual” cult. Ingroup behavior can certainly be very powerful, but rarely powerful enough to cause you to give away all your property in anticipation of the saucers landing.
Relevant post given the situation with Leverage.
Seems relevant in the wake of the FTX scandal. I’ve seen people blaming effective altruism for the scandal, as if FTX’s fraudulent practices prove that the philosophy of giving to charities that demonstrably do the most good is flawed. Even if the entire EA movement is cult-like and misguided, that doesn’t mean that the principle it’s based on is wrong. I think the modern EA movement is misguided to some extent, but only because they have misjudged which causes are the most effective, and this shouldn’t stop anyone else from donating to causes that they believe are more effective.
Interestingly, I do think the EA movement/philosophy is fundamentally misguided, but for very different reasons than their critics think.
That’s not interesting to read unless you say what your reasons are and how they differ from other critics’. Perhaps not say it all in a comment, but at least a link to a post.