Duncan Sabien here, worked at CFAR 2015 to 2018 (currently work at MIRI).
My understanding of the state of CFAR:
“Defunct” is a reasonable description, relative to the org that ran 5-15 workshops per year and popularized TAPs and Goal Factoring and Double Crux and so forth.
Currently, it is not-quite-accurate but closer to true than false to say “CFAR is currently just two people.” The org still has the venue, as far as I know, and it’s still occasionally being used by the EA/rationalist/longtermist communities. The org also still has some amount of funding, which it is using for various individual projects (e.g. I don’t know whether former CFAR employees like Dan Keys or Elizabeth Garrett are getting CFAR grants to run their own individual investigations and experiments, but I would not be surprised, and, to be clear, that would be a pretty appropriate and good use of the money, according to me).
There are some smaller, quieter, workshop-esque things happening from time to time, but they are more geared toward specific audiences or accomplishing narrow goals; they are not aimed at the generalized “develop the art of human rationality and deliver that art to high-impact people” goal that CFAR used to somewhat fill. As far as I can tell, there’s a decent chance that new ambitious projects might rise from the ashes, so to speak, but they’ll likely be AI-xrisk oriented.
I personally have been wishing for more clarity on all of this, for precisely the reason that I remain interested in people furthering human rationality and would like people to not be thinking that “CFAR is on it” when afaik it has not been on it since some time in 2019.
I’m part of a small group of people who might plausibly launch new projects in that direction, and I myself am currently running one-off workshops-for-hire (am typing this from Day 3 of a workshop in the Bay Area, as it happens) for groups that are capable of pulling together participants, venue, and ops and just need the content/leadership.
Additionally, it would be interesting to hear why the endeavour was abandoned in the end, to avoid going on wild-goose chases oneself (or, in the very boring case, to discover that they ran out of funding, though that appears unlikely to me).
I certainly cannot speak with any authority or finality for CFAR, having not been there since late 2018/early 2019. But my sense is more like “it was always more about the AI fight than about general rationality, and the org in its evolved state was not serving the AI fight goals particularly well, so it just kinda fizzled and each of its individual members struck out on their own new path.”