I don’t see why personal social goals and general movement goals should necessarily be mutually exclusive.
They are not necessarily, but they are in this case. I think Scott once mentioned that BA rationalists can’t grow beyond about 150. 150 is a magic number (Dunbar’s number, roughly the limit on stable personal relationships), and is suggestive of what the problem might be.
“Cuddle pile” is my slightly unkind shorthand for the kinds of social peculiarities rationalists, imo, should leave behind if they want the ideas to become more mainstream.
Metacomment: “it is not necessarily the case that X” is almost always true for interesting X.
I suspect most rationalists will turn out to care more about their cuddle piles than about their ideas becoming mainstream. There’s always been a rather unhealthy interaction between community goals and the community’s social quirks (we want to raise the sanity waterline → we are saner → our quirks should be evangelized), and we don’t really have a working way to sort out what actually comes with increased rationality and what’s just a founder effect.
I have been trying to serve as a bit of a “loyal opposition” re: separating rationality from social effects. But I am just one dude, and I am biased, too. Plus, I am an outsider, and around here my opinions don’t really carry much weight outside my area of expertise.
The community itself has to want it, on some level.
sorry, this was an unhelpful comment that is now gone :)
I agree. And that’s too bad.