This critique seems to rely on a misreading of the post. The author isn’t saying the rationality community has exceptionally toxic social norms:
I’m not mentioning these communities because I think they’re extra toxic or anything, by the way. They’re probably less toxic than the average group, and a lot of their principles are great.
Rather, the point is that goals, even worthy goals, can result in toxic social dynamics that no one would endorse explicitly:
Sometimes—often—these forbidden thoughts/actions aren’t even contrary to the explicit values. They just don’t fit in with the implied group aesthetic, which is often a much stricter, more menacing guideline, all the more so because it’s a collective unwritten fiction.
There’s a bit of an aesthetic parallel to AI alignment. It would be surprising if the poorly understood process that produces social dynamics just so happened to be healthy for everyone involved in the case of the rationality project. Verbalizing some of the implicit beliefs gives people the ability to reflect on which ones they want to keep.
I would expect the author to agree that most (all?) communities contain toxic dynamics.