I’ve never even heard of IAFF! What is that?
Edit: oops, cousin_it beat me to it.
The “Intelligent Agent Foundations Forum” at https://agentfoundations.org/.
It was a platform MIRI built for discussing their research, which required an invite to post or comment. There's lots of really interesting stuff there; I remember enjoying reading Jessica Taylor's many posts summarising the intuitions behind different research agendas.
It was a bit hard and confusing to use, and noticing that we could probably do better was one of the things that led us to build the AI Alignment Forum.
Since the new forum is a space for discussing all alignment research, and all of the old IAFF content is a subset of that, we (with MIRI's blessing) imported all the old content. At some point we'll also set up the old links to redirect to the AI Alignment Forum.
agentfoundations.org. Lots of good stuff there, but most of it gets very few responses. The recently launched alignmentforum.org is an attempt to do the same thing, but with crossposting to LW.