Relevant info: I’ve volunteered at one CFAR workshop and hang out in the CFAR office periodically. My views here are my own models of how I think CFAR is thinking, not official CFAR positions.
For 1), you might be interested to know that I recently made a Double Crux UI mockup here. I’m hoping to start some discussion on what an actual interface might look like.
Related to the idea of a prep course, I’ll be making an LW post in the next few days about my attempt to create a new sequence on instrumental rationality, complementary to the sort of self-improvement CFAR does. That may be of interest to you.
Otherwise, I can say that at least at the workshop I was at, there was zero mention of AI safety from the staff. (You can read my review here.) It’s my impression that there’s a lot of cool stuff CFAR could be doing in tandem with their workshops, but they’re time-constrained. Hopefully this becomes less so with their new hires.
I do agree that having additional scaffolding would be very good, and that’s part of my motivation to start on a new sequence.
Happy to talk more on this as I also think this is an important thing to focus on.
> For 1), you might be interested to know that I recently made a Double Crux UI mockup here. I’m hoping to start some discussion on what an actual interface might look like.
Yep, you were one of the parties I was thinking of. Nice work! :D