Hey all, nice forum you’ve got going here!
I’ve recently drunk the Kool-Aid of the 80,000 Hours initiative and would like to make a difference by dedicating my career to human-aligned AI. I thought I’d comment in this thread to ask how I (or anybody else in my position) can get started.
A quick search of the forum history shows that a Lurkshop was held in December, which I obviously just missed. But that’s the exact kind of thing I’m looking for.
Any information on how one can enter the field of risk mitigation around AI would be much appreciated. And if this kind of message isn’t welcome on this forum, please let me know.
Welcome to LessWrong!
There are multiple guides to getting started in alignment. In no particular order:
Alignment Research Field Guide
How To Get Into Independent Research On Alignment/Agency
A Guide to MIRI’s Research
Alignment Fundamentals curriculum (101)
Alignment 201 Curriculum (successor to 101)
I think the last two options are best as comprehensive curricula for a general aspiring alignment researcher.
I applied to take the 101 course myself.
Amazing resources, thank you for sharing!
I didn’t want to dive too deeply into the specifics of where I’m at, so that the resulting advice would be generally applicable. But I’ve been out in the working world for more than a few years already, so jumping back into pure research would be a big challenge on a number of levels.
Would you say that the state of the field at the moment is almost entirely research? Or are there opportunities to work in more applied positions in outreach/communications, policy strategy, etc?
No.
There are, I just don’t know them off the top of my head.