On a related but somewhat different issue: I feel there has been something of an under-investment in rationality community building overall. EA has CEA, but rationality doesn't have an equivalent (CFAR doesn't play the same community-building role). There isn't any organisation responsible for growing the community, organising conferences, and addressing challenges as they arise.
That said, I'm not sure there is necessarily agreement that there is a single mission. Some people are in rationality for AI, some for the insight porn, some for personal development, and some simply for social reasons. Even though EA's goal is massively broad, "doing the most good" seems to suffice to spur action in a way that rationality's goal hasn't.
Generally agreed that there's not enough funding or other investment in rationality community building.
I deliberately defined the Mission fairly broadly – I think there’s a sense in which anyone who’s committed to making a dent in the universe, who is also dedicated to thinking clearly about it, while subscribing to reasonable cooperation norms, is (or could be) on the same team.
(As noted elsethread, my current best guess is that the village should focus on truthseeking, and the mission is basically truthseeking + impact, at an abstraction one level higher than Effective Altruism. I.e. the mission includes EA and at least some other things, but I'm less confident I can clearly articulate what those should be.)
I like your definition of the mission—I haven't heard it described in that way, or in that degree of detail, before, and I tend to agree with it. I'm not sure how universally agreed upon it is, but I would certainly advocate for your vision of it.