Others have to decide on branding and identity, but I consider MIRI and the Future of Humanity Institute as having very different core missions from LessWrong, so adding those missions to a description of LW muddies the presentation, particularly for outreach material.
To the point, I think “we’re saving the world from Unfriendly AI” is not effective general outreach for LessWrong’s core mission, and beating around the bush with “existential threats” would elicit a “Huh? What?” from most readers. And is there really much going on about other existential threats on LW? Nuclear proliferation? Biological warfare? Asteroid collisions?
Preventing existential risk is part of what this site is about.
I don’t think that existential risk is really a part of the LessWrong Blog/Forum/Wiki site mission; it’s just one of the particular areas of interest of many here, like effective altruism or Reactionary politics.
CFAR makes a good institutional match to the Blog/Forum/Wiki of LessWrong, with the mission of developing, delivering, and testing training to the end of becoming less wrong, focusing on the same subject areas and influences as LessWrong itself.