I think that a 95% success rate in not destroying the human world would also be fantastic, though I note that there are plenty of potential totalitarian hellscapes that some people would apparently rate as even worse than extinction.
Note that I’m not saying that they would deliberately destroy the world for shits and giggles, just that if the rest of the human world were any impediment to anything they valued more, its destruction would simply be a side effect of what had to be done.
I also don’t have any illusion that a superintelligent agent will be infallible. The laws of the universe are not kind, and great power brings the opportunity to cause great disasters. I fully expect that any super-civilizational entity, at any level of intelligence, could very well destroy the human world by mistake.