If you have ever wondered how it is possible that a flying saucer cult has more members than EA, now it’s time to learn something.
One sentiment from a friend of mine that I don’t completely agree with, but believe is worth keeping in mind, is that effective altruism (EA) is about helping others and isn’t meant to become a “country club for saints”. What does that have to do with Raelianism, or Scientology, or some other cult? Well, they tend to treat their members like saints, and their members aren’t effective. I mean, these organizations may be effective by one metric, in that they efficiently funnel capital and wealth (financial, social, material, sexual, etc.) to their leaders. I’m aware of Raelianism, but I don’t know much about it. From what I’ve read about Scientology, it’s able to get quite a lot done. However, it only gets away with that because it doesn’t follow rules, bullies everyone from its detractors to whole governments, and brainwashes people into becoming its menial slaves. The epistemic hygiene in these groups is abysmal.
I think there are many onlookers from LessWrong who hope effective altruism develops better epistemics than it has now, and who would be utterly aghast if EA sold that out, using whatever tools from the dark arts to make gains in the raw number of self-identified adherents who cannot think or act for themselves. As someone quite involved in EA, I can tell you that the idea that EA should grow as fast as possible, or that the priority should be to make anyone willing to become passionate about it feel as welcome as possible, isn’t worth pursuing if it comes at the expense of the movement’s quality culture, to the extent it has a quality culture of epistemic hygiene. So, sure, we could learn lessons from UFO cults, but they would be the wrong lessons. Having as many people in EA as possible isn’t the most important thing for EA to do.
Sounds like “effective” is being cast aside for “epistemically pure”. EPA? Epistemically Pure Altruism.
If we look at mistakes of other groups and learn not to repeat them, it’s not a “wrong lesson”.
Also, I think you too easily assume that they don’t do anything that belongs to the light side. The whole trick in creating a group like that is, I guess, mixing the dark-side stuff in with enough of the good stuff that it becomes hard to notice.
This may not be the best example, but if you can get better PR for EA by choosing more attractive people to do the public speaking, where’s the harm in that? That’s likely how human psychology works.