I’m not EA (though I do agree with most of the motte—I care about other humans, and I try to be effective), and not part of the rationalist “community”, so take this as an outside view.
There’s a ton of “standard human social drama” in EA and in rationalist communities, and really anywhere where “work” and “regular life” overlap significantly. Some of this takes the form of noticing flaws in other people’s rationality (or, just as often, flaws in kindness/empathy being justified by rationality).
Especially when one doesn’t want to identify and address specific examples, I think there’s a very high risk of misidentifying the cause of a disagreement or of behavior one disapproves of. In this case, I don’t notice much of the flagellation or wishing—either I don’t hang out in the right places, or I bounce off those posts and don’t pay much mind. But things that might fit that pattern strike me as a failure of personal responsibility, not a failure of modeling wishes. Your term “self-flagellation” is interesting from that standpoint—the historic practice was penance for generalized sin, and a way to share suffering, not a direct correction for anything. It’s clearly social, not rational.
IMO, rationalism must first and foremost be individual. I am trying to be less wrong in my private beliefs and in my goal-directed behaviors. Group rationality is a category error—I don’t have access to group beliefs (if there is such a thing). I do have some influence over group behaviors and shared statements, but I recognize that they are ALWAYS a compromise, the negotiated result of individual beliefs and behaviors, and don’t necessarily match any individual in the group.
I’m surprised every time I see a rationalist assuming otherwise, and being disappointed that other members of the group don’t share all their beliefs and motivations.
From my perspective, “group rationality” means how much the group provides an environment conducive to becoming more rational (for those who have the propensity to do so).
It is probably easier to describe the opposite—a group where religion and woo are high status, skepticism is considered a form of stupidity, members are encouraged to think in slogans rather than inspect the details of their actual experience, etc.
A rational group would then be one where, as an individual rationalist, you can do the right thing without getting socially punished for it, and where you are gently called out when you do something stupid.
And it is a “more or less” thing, rather than “yes or no”. (However, the same is true of individual rationality.) I would not expect any group to share all my beliefs and motivations. But the difference between sharing 20% and 80% of the beliefs and motivations means a lot to me.