Glad to see others taking an interest in this idea! This kind of work has a very low barrier to entry for software engineers who want to contribute to alignment but would rather apply their engineering skills than become full-on researchers. It also opens the door to engineering work that is useful for independent researchers, not just the orgs.
I honestly feel like some software devs who want to start contributing to AI Safety should keep their high-paying jobs and donate a bit of their time and programming expertise to help independent researchers, rather than going into alignment full-time.
I think we can probably come up with engineering projects that are interesting and low-barrier-to-entry for software engineers.
I also think providing “programming coaching” to some independent researchers could be quite useful, whether that’s helping them code up projects more efficiently or preparing them for research engineer roles at alignment orgs.
I talk a bit more about this here:

With respect to your engineering skills: I’m going to start working on tools explicitly designed for alignment researchers (https://www.lesswrong.com/posts/a2io2mcxTWS4mxodF/results-from-a-survey-on-tool-use-and-workflows-in-alignment), and having designers and programmers (web devs) involved would probably be highly beneficial. Unfortunately, I only have funding for myself for the time being, but it would be great to have some people who want to contribute. I’d consider doing AI Safety mentorship as a work trade.

and here (post about gathering data for alignment):

Heads up: we are starting to work on stuff like this in a Discord server (DM for link), and I’ll be working on it full-time from February to the end of April (if not longer). We’ve talked about data collection a bit over the past year, but have yet to take the time to do anything serious (besides the alignment text dataset). To make this work, we’ll have to make it insanely easy for the people generating the data; it’s just not going to happen by default. Some people might take the time to set this up for themselves, but very few do.
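
To make the “insanely easy” bar concrete: I’m imagining something like a one-line decorator a researcher can slap on their existing code so their runs get logged as a side effect of work they were doing anyway. This is purely a hypothetical sketch of mine (the names, the JSONL format, nothing here is something we’ve actually built):

```python
import functools
import json
import time
from pathlib import Path

# Hypothetical example: everything below is illustrative, not an existing tool.
LOG_PATH = Path("research_log.jsonl")

def log_run(fn):
    """Append each call's inputs and outputs to a local JSONL file."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        record = {
            "timestamp": time.time(),
            "function": fn.__name__,
            "args": [repr(a) for a in args],
            "kwargs": {k: repr(v) for k, v in kwargs.items()},
            "result": repr(result),
        }
        with LOG_PATH.open("a") as f:
            f.write(json.dumps(record) + "\n")
        return result
    return wrapper

@log_run
def query_model(prompt: str) -> str:
    # Stand-in for whatever the researcher is actually running.
    return f"model output for: {prompt!r}"

query_model("Does this chain of thought look deceptive?")
```

The point isn’t this particular design; it’s that if contributing data takes any more effort than this, most researchers won’t do it.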