Background material recommendations (more in depth): Please recommend your favorite AGI safety background reading / videos / lectures / etc. For this sub-thread more in-depth recommendations are allowed, including material that requires technical expertise of some sort. (Please specify what kind of background knowledge / expertise is required to understand the material you’re recommending.) This is also the place to recommend general resources people can look at if they want to start doing a deeper dive into AGI safety and related topics.
Stampy has the canonical answer to this: "I’d like to get deeper into the AI alignment literature. Where should I look?"
Feel free to improve the answer, as it’s on a wiki. It will be served via a custom interface once that’s ready (prototype here).
This website looks pretty cool! I didn’t know about this before.
Thanks! I’ve spent much of the last year and a half working on the wiki infrastructure, and we’re getting pretty close to being ready to launch to editors in a more serious way.
Obligatory link to the excellent AGI Safety Fundamentals curriculum.
AXRP: Excellent interviews with a variety of researchers. Daniel’s own substantial knowledge means the questions he asks are often excellent, and the technical depth is far better than anything else available in audio, given that autoreaders for papers or Alignment Forum posts struggle to handle actual maths.