I don’t think it’s useful for AI alignment people to apply “the enemy of my enemy is my friend” logic to AI luddites (i.e. people worried about Privacy/Racism/Artists/Misinformation/Jobs/Whatever).
Alignment research is a luxury good for labs, which means it would be the first thing axed (hyperbolically speaking) if you imposed generic hurdles/costs on their revenue, or if you made them spend on mitigating P/R/A/M/J/W problems.
This “crowding-out” effect is already happening to a very large extent: there are vastly more researchers and capital being devoted to P/R/A/M/J/W problems, which could have been allocated to actual alignment research! If you are forming a “coalition” with these people, you are getting a very shitty deal—they’ve been much more effective at getting their priorities funded than you have been!
If you want them to care about notkilleveryoneism, you have to specifically make it expensive for them to kill everyone, not just “oppose” them in an untargeted way — e.g. foom liability.
Sounds plausible but do you have any numeric evidence for this?