I guess it’s not easy to find a balance between manipulation and the “reverse manipulation” where people sabotage themselves to signal that they are not manipulating; between focusing on impressions instead of substance and being completely unaware of the impressions you make; between blindness towards biases and ignoring human nature. Especially in a group where different people will have wildly different expectations about what is acceptable and what is cultish.
Sometimes it feels like a choice between losing rationality and losing momentum. Optimizing to never do anything stupid can make one never get anything done. (And there is of course the opposite risk, but that is much less likely among our kind, although we obsess about it much more.)
What you described here seems like a solid strategy for getting new members. But then the question is how to prevent diluting the original goals of the group. I mean, instead of people who care deeply about X, you succeed in recruiting many random strangers because they feel good in your group. So how do you make sure that the group as a whole (now with a majority of people who came for the good feelings, not for X) will continue to focus on X, instead of just making its members feel good?
I think the usual answer is a strict hierarchy: the people at the top of the organization, who decided that X is the official goal, remain at the top; there is no democracy, so even if most members actually care about feeling good, they are told by the bosses to do X as a condition of staying in the group where they feel good. And only carefully vetted new members, who really contribute to X, are later added to the elite.
So, if rationalists or effective altruists were to embrace this strategy, they would need a hierarchy, instead of being just a disorganized mob. Instead of “rationalists in general” or “effective altruists in general”, there would have to be a specific organization, with defined membership and leadership, that would organize the events. Anyone could participate in the events, but that wouldn’t make them equal to the leaders.
For example, for rationalists, CFAR could play this role. You could have thousands of people who identify as “rationalists”, but they would have no influence on the official statements made by CFAR. But CFAR is an organization specialized in teaching rationality; so it would be better to have some other organization serve as an umbrella for the rationalist movement: one that contains the people who are mutually believed to be rationalists, even if they don’t participate in developing a curriculum.
Similarly for effective altruism. You need a network of people who share the values of the movement, who provide “credentials” to each other, and who only accept new people who have credibly demonstrated that they share those values.
You are not wrong, of course, but on the scale between “blindness towards biases” and “ignoring human nature”, your views fall 80–90% towards “ignoring human nature”.
Just to give you a more complete picture, here’s a thought:
People are not consequentialists, and they don’t know clearly what they want.
In fact, there is nothing “absolute” that tells us what we “should” want.
And the happier you make people, the happier they tend to be with the goals you give them to work on.
If you also teach people rationality, you will get more scrutiny of your goals, but you will never get 100% scrutiny. As a human, you never get to fully know what your own goals are.
Looking at the “human nature” side, people who “care deeply” about EA-style things are simply people who happened to be in the right situation to “unpack” their motivations in a direction that is more self-consistent than average, not people whose underlying motivations were fundamentally different.
So my “human nature” side of this argument says: you can attract people who are in it “just for feeling good”, give them the opportunity to grow and unpack their inner motivations, and you’ll end up with people who “care deeply” about your cause.