I actually think that the question of DSA (decisive strategic advantage) may be taking up too much oxygen in this debate. This debate is usually had with soft Singularitarians, and it's a bit rich for someone to claim that AGI is going to be awesome and then deny that it will be very powerful. Arguing about DSA is a way of avoiding the really uncomfortable questions, tbh. (The third position in this debate, the "AGI never/not soon" types, would not be easily convinced of near-future DSA, but they also don't really matter for policy outcomes.)
There's another aspect of the debate that I think we need to pivot to, sooner rather than later: the question of why AI would be misaligned and incorrigible. To be quite honest, I still don't alieve that alignment is all that difficult or unnatural, and the arguments I've seen haven't persuaded me. I bring this up because I expect it's even harder to be convinced that alignment is hard when your paycheck depends on it being easy. So if we want to change the minds of Sam Altman and Demis Hassabis, I expect the hardness of alignment will be a more difficult sell than DSA, which Sam and Demis probably already want to believe for income/ego reasons.