I’m not convinced of your second point. Powerful MoR wizards aren’t nearly as constrained as people in our world, but the only character in the story who’s explicitly trying to act as an optimizer over a generalized utility function rather than to achieve some set of concrete goals (that is, the only one acting the way Eliezer’s concept of an AI would) is Harry himself. There are a number of ways that plot thread could be resolved, but destroying the world as we know it isn’t the one I’d bet on, although I’d expect the threat of such destruction to come into play at some point down the road.
We’ve also seen Eliezer’s take on the Dark Lord-as-optimizer concept play itself out once before, in “The Sword of Good”, and that—avoiding spoilers—didn’t seem to resolve in a way consistent with your premise.