This was the experience in Vancouver after CFAR workshops, and the atmosphere persisted for a long time. I wasn’t the only one conflating “[big event] atmosphere” with the “Berkeley Rationalist Community”. A lot of other people in Vancouver did the same, and it’s also how a lot of rationalists from elsewhere talk about the Berkeley Rationalist Community (which I’m going to call the Bayesian Area): it’s often depicted as super awesome.
The first thing that comes to mind is that a lot of rationalists from outside Berkeley only visit town for events like CFAR workshops, CFAR alumni reunions, EA Global, or Burning Man. So if a rationalist visits Berkeley a few times a year and always returns to their home base talking about their experiences right after these exciting events, the Berkeley community itself starts to seem constantly exciting. I’m guessing the reality is that the Berkeley community isn’t always buzzing with conferences and workshops, and that organizing all those events is actually very stressful.
There is definitely a halo around the Berkeley Rationalist Community for other reasons:
It’s often touted that ‘leveling up’ to the point where one can get hired at an x-risk reduction organization, or work on another important project like a startup in Berkeley, is an important and desirable thing for rationalists to do.
There’s often a perception that resources are only invested in projects based in the Bay Area, so trying to start projects with rationalists elsewhere and expecting to sustain them long-term is futile.
Moving to Berkeley is still inaccessible or impractical for a lot of rationalists scattered elsewhere, so (especially if their friends leave) it breeds a sense of alienation and of being left behind or stranded as one watches everyone else talk about how they *can* flock to Berkeley. Combined with the rest of the above, this can also unfortunately breed resentment.
Rationalists from outside Berkeley often report feeling that the benefits or incentives of moving to the Berkeley community are exaggerated relative to the trade-offs and costs of moving there.
It would not surprise me if this worldwide halo effect around the Berkeley rationalist community is just a case of confirmation bias writ large among rationalists everywhere. There can be a sense that the Bayesian Area is doing all this deliberately, when in fact almost no rationalists in Berkeley intended any of it. The accounts of what has happened to the NYC community are pretty startling, especially since, as one of the healthier communities, I thought it would persist. The most I can say is that there is wide variance in accounts of how much pressure a local rationalist community feels from Berkeley to send as many people as possible their way.
But then I also read things like this post by Alyssa, who is from the Berkeley rationalist community, and Zvi’s comment about Berkeley eating its own seed corn sounds plausible. Sarah C also wrote this post about how the Bayesian Area has changed over the years. The posts are quite different, but the theme of both is that the Bayesian Area in reality defies many rationalists’ expectations of what the community is or should be about.
Another thing is that much of the recruitment is driven by efforts which are decidedly more ‘effective altruist’ than ‘rationalist’. With the Open Philanthropy Project and the effective altruism movement enabling the growth of so many community projects based in the Bay Area, this both i) draws people from outside the Bay Area, and ii) draws attention to the sorts of projects EA incentivizes at the expense of other rationalist projects in Berkeley. As far as I can tell, much of the rationality community who don’t consider themselves effective altruists aren’t happy that EA eats up such a huge part of the community’s time, attention, and money. It’s not that they don’t like EA. The major complaint is that projects with the EA stamp of approval are treated as magically more deserving of support than other rationalist projects, regardless of arguments weighing the projects against each other.
A funny thing to me is that, from the other side, I’m aware of a lot of effective altruists long focused on global poverty alleviation and other causes who are unhappy with the disproportionate diversion of time, attention, money, and talent toward AI alignment, and even more so toward EA movement-building and other meta-level activities. Both rationalists and effective altruists also find that projects receive funding on the basis of fitting frameworks which are ultimately too narrow and limited to account for all the best projects (e.g., the Importance/Neglectedness/Tractability framework). So it appears the most prioritized projects in effective altruism are driving rapid changes that the grassroots elements of both the rationality and EA movements aren’t able to adapt to. A lot of effective altruists and rationalists from outside the Bay Area perceive it as a monolith eating their communities, and a lot of rationalists in Berkeley see the same happening to local friends whose attention didn’t used to be so singularly focused on EA.