I was talking to a friend recently who is an experienced software developer looking to get into AI safety. Both of us have been reading LessWrong for a long time, but were unclear on various things. For example, where can you go to see a list of all job and funding opportunities? Would jobs be ok with someone with a software engineering background learning AI-related things on the job? Would grants be ok with that? What remote opportunities are available? What if there is a specific type of work you are interested in? What does the pay look like?
These are just a few of the things we were unclear on. And I expect that if you interviewed other people in similar boats, there would be different things that they are unclear on, and that this results in lots of people not entering the field of AI safety who otherwise would. So then, perhaps having some sort of comprehensive career guide would be a high-leverage action that would result in lots more people entering the field.
Or, perhaps there are good resources available, and I am just unaware of them. Anyone have any tips? I found 80,000 Hours' career review of AI safety technical research and johnswentworth's post How To Get Into Independent Research On Alignment/Agency, but neither seems comprehensive enough.
Edit: As an alternative, we could also have some sort of page with a list of people in the field of AI safety who are willing to chat on the phone with those who are looking to enter the field and answer questions. Now that I think about it, I suspect this would be both a) more effective at “converting” new “leads”, and b) something that those in the field of AI safety would be more willing to do.
Why do I believe (a)? Writing a career guide that is comprehensive enough that all of your questions get addressed is hard. And there's something about speaking with a real person. Why do I believe (b)? Chatting with people is fun, especially when you are able to help them. It is also low-commitment and doesn't take very long. On the other hand, writing and (especially) maintaining a guide is a lot of work.
So then, here is a Google Doc: https://docs.google.com/document/d/1XlvUpShIuO_kSFu8IFfhX7lKpr10EypDKXGN-CaS7FM/edit?usp=sharing.
If you’re in the field of AI safety, it would be awesome if you added your contact info.
If you know someone in the field of AI safety, it would be awesome if you brought this to their attention.
I just threw this together haphazardly. If someone is willing to take over the project and/or make the doc a little nicer, do something better in Notion, or create a real website for this, that would be awesome. I’d pursue this myself if there was enough interest (I’m a programmer and would build a real website de
If you are a LessWrong moderator, it'd be cool if you considered linking to this prominently. I feel like that might be a necessary condition for this succeeding. Otherwise it feels like the sort of thing that would rely on word of mouth for people to know that it exists, and it probably wouldn't spread widely enough to survive.
If you are someone looking to get into the field of AI safety research, it would be great if you could share your thoughts and experiences, positive or negative, so we can update our beliefs about what the pain points really are.
Are you aware of AI Safety Support? You can book a call with Frances Lorenz or JJ.
I am not, it looks awesome, thanks for sharing! I will pass it along to my friend.
Thanks for posting this!
Yeah, my experience has been that there's a lot of posts talking about how AI safety companies want engineers, but it seems like it's all wanting engineers who live in Berkeley, San Francisco, or NYC, or wanting people to retrain as researchers coming at problems from a specific direction. The "How to get into independent research" post is more useful, but assumes you're financially independent and/or have an extreme tolerance for risk (step 1: quit your job and self-finance education for a few months). I'm currently in the process of saving up enough money to be able to do this, but it seems like I must not be the only one stuck here.