These monthly threads and Stampy sound like they’ll be great resources for learning about alignment research.
I’d like to know about as many resources as possible for supporting and guiding my own alignment research self-study process. (And by resources, I don’t just mean more stuff to read; I mean organizations or individuals you can talk to for guidance on how to move forward in your self-education.)
Could someone provide a link to a page that attempts to gather links to all such resources in one place?
I already saw the Stampy answer to “Where Can I Learn About AI Alignment?”. Is that pretty comprehensive, or are there many more resources?
Stampy has some of this, over at “What are some good resources on AI alignment?”
We’re working on a “How can I help?” tree of questions and answers, which will include more info on who to talk to, but for now I’ll suggest AI Safety Support and 80k.
Thanks, that helps!