Of these three, only love bombing was prominently present, and I think it was actually a genuine feeling among the “low-rank” members (I’ve talked to quite a few). As for uniforms, they all had unassuming trinkets/medallions hanging from their necks. Food sharing was not present, and the lunch I mentioned was no more special than any other lunch I’ve had with other groups of people.
I think there were quite a few other interesting factors at play, and that’s why I suggested we could all learn something from a discussion about it. I don’t have anything like a complete analysis ready myself, but I detected significant amounts of:
casual, friendly touch—there’s nothing wrong with it and I would like to see more of it in other communities,
making good use of good-looking people and sexual attractiveness—subtle enough not to be blatant,
telling people to think for themselves (though I guess on the meta level, you already know the conclusion you are supposed to arrive at),
scripted/planned/predictable ways to evoke positive emotions,
honesty about emotions and subjective experience, and full acceptance of this by other members,
generally positive atmosphere of friendliness and helpfulness that carries over to the rest of the world, not just the in-group,
excellent stage/public-speaking/PR skills among the top-ranking members,
mixing in genuinely worthy/morally sound causes, like stopping wars (though from an LW point of view it looks more like cheering than taking action).
I guess it’s not easy to find a balance between manipulation and the “reverse manipulation” where people sabotage themselves to signal that they are not manipulating; between focusing on impressions instead of substance and not being aware of the impressions you make at all; between blindness towards biases and ignoring human nature. This is especially hard in a group where different people will have wildly different expectations of what is acceptable and what is cultish.
Sometimes it feels like a choice between losing rationality and losing momentum. Optimizing to never do anything stupid can make one never get anything done. (And there is of course the opposite risk, but that is much less likely among our kind, although we obsess about it much more.)
What you described here seems like a solid strategy for getting new members. But then the question is how to prevent diluting the original goals of the group. I mean, instead of people who care deeply about X, you succeed in recruiting many random strangers because they feel good in your group. So how do you make sure that the group as a whole (now having a majority of people who came for the good feelings, not for X) will continue to focus on X, instead of just making its members feel good?
I think the usual answer is strict hierarchy: the people at the top of the organization, who decided that X is the official goal, remain at the top; there is no democracy, so even if most people actually care about feeling good, they are told by the bosses to do X as a condition of staying in the group where they feel good. And only carefully vetted new members, who really contribute to X, are later added to the elite.
So, if rationalists or effective altruists were to embrace this strategy, they would need to have a hierarchy, instead of being just a disorganized mob. Instead of “rationalists in general” or “effective altruists in general”, there would have to be a specific organization, with defined membership and leadership, which would organize the events. Anyone could participate in the events, but that wouldn’t make them equal to the leaders.
For example, for rationalists, CFAR could play this role. You could have thousands of people who identify as “rationalists”, but they would have no influence on the official speeches by CFAR. But CFAR is an organization specialized in teaching rationality; so it would be better to have some other organization serve as an umbrella for the rationalist movement—to contain people who are mutually believed to be rationalists, even if they don’t participate in developing a curriculum.
Similarly for effective altruism. You need a network of people who share the values of the movement, who provide “credentials” to each other, and who only accept new people who have also credibly demonstrated that they share those values.
You are not wrong, of course, but on the scale between “blindness towards biases” and “ignoring human nature”, your views fall 80–90% towards “ignoring human nature”.
Just to give you a more complete image, here’s a thought:
People are not consequentialists, and they don’t know clearly what they want.
In fact, there is nothing “absolute” that tells us what we “should” want.
And the happier you make people, the happier they tend to be with the goals you give them to work on.
If you also teach people rationality, you will get more scrutiny of your goals, but you will never get 100% scrutiny. As a human, you are never allowed to fully know what your goals are.
Looking at the “human nature” side, people who “care deeply” about EA-style things are just people who happened to be in the right situation to “unpack” their motivations in a direction more self-consistent than average, not people who fundamentally had different motivations.
So my “human nature” side of this argument says: you can attract people who are in it “just for feeling good”, give them the opportunity to grow and unpack their inner motivations, and you’ll end up with people who “care deeply” about your cause.
Intentional Insights is actually working actively on a project to make the EA movement more welcoming; let me know if you are interested in collaborating on it.
Yup, agreed on the benefit of expressing strong positive emotions toward rank-and-file EA members. I wrote about this earlier; I don’t know if you saw this piece: http://lesswrong.com/lw/n7b/celebrating_all_who_are_in_effective_altruism/