For a really good example of what I would consider a ‘dumb’ way for AGI misalignment to be problematic, I recommend “Accelerando” by Charles Stross. It’s available in text/HTML form for free from his website. Even now, after 20 years, it’s still very full of ideas.
(FYI, sections of the book are set about ten years apart, but the last time I read it, the dates seemed off by a factor of two or so. E.g., 2010 in the book corresponds loosely to 2020 in real life, 2020 in the book corresponds loosely to 2040, etc.)
In that book, the badness largely comes from increasingly competent / sentient corporate management and legal software.