I found this document kind of interesting, but it felt less like what I normally understand as a mission statement, and more like “Anna’s thoughts on CFAR’s identity”. I think there’s a place for the latter, but I’d be really interested in seeing (a concise version of) the former, too.
If I had to guess right now, I'd expect it to say something like:
We want to develop a community with high epistemic standards and good rationality tools, at least part of which is devoted to reducing existential risk from AI.
… but I kind of expect you to think I have the emphasis there wrong in some way.