There are a lot of detailed arguments for doom by misaligned AGI.
Coming to grips with them, and with the counterarguments in actual proposals for aligning AGI and managing the political and economic fallout, is a herculean task. I feel it's taken me about two years of spending the majority of my work time on exactly that to get my head mostly around most of the relevant arguments. Having done that, my p(doom) is still roughly 50%, with wide uncertainty for unknown unknowns still to be revealed or identified.
So if someone isn't going to do that, I think the above summary is pretty accurate. Alignment, and managing the resulting shifts in the world, is not easy, but it's not impossible. Sometimes humans do amazing things; sometimes they do amazingly stupid things. So again, roughly 50% from this much rougher method.