Yes, that’s my point. I’m not aware of a path to meaningful contribution to the field that doesn’t involve either doing research or doing support work for a research group. Neither is accessible to me without risking the aforementioned effects.
Yeah, that seems right. Work in alignment at the moment is largely research, so most of the options come down to doing or supporting research.
I would just note that there is a relatively huge amount of funding in the space at the moment: OpenPhil and FTX are both open to injecting large amounts of money and largely don't have enough places to put it. It's not that it's easy to get funded (I wouldn't say it's easy at all), but it really does seem like the basic conditions in the space are such that one would expect to find a lot of opportunities to be funded to do good work.
This reader is a software engineer with over a decade of experience. I'm paid handsomely and live in a remote rural area. I am married with three kids. The idea that my specialized experience building SaaS products in Scala would somehow port over to AI research seems ludicrous. I am certain I'm cognitively capable enough to contribute to AI research, but I'd be leaving a career where I'm compensated based on my experience for one where I'm starting over.
Surely OpenPhil and FTX would not match my current salary in order to start my career over, all while allowing me to stay where I live (instead of uprooting my kids from friends and school)? It seems unlikely I'd have such a significant leg up over a recent college graduate with a decent GPA that it would warrant matching my software engineering salary.
Right: you probably could contribute to AI alignment, but your skills mostly wouldn't port over, and you'd very likely earn less than you do in your current job.
I'll say one thing. I too dislike the AI doomtide/doomerism, despite thinking it's a real problem. You can take breaks from LW, or hide AI-tagged posts from your frontpage, if it's getting to you.