Are there any organizations or research groups that are specifically working on improving the effectiveness of the alignment research community? E.g.
- Reviewing the literature on intellectual progress, metascience, and social epistemology and applying the resulting insights to this community
- Funding the development of experimental “epistemology software”, like Arbital or Mathopedia
The classic one is Lightcone Infrastructure, the team that runs LessWrong and the Alignment Forum.