Your fourth virtue of human review, that it's feasible while humans outnumber and outpower the AI, mentions that once there are trillions of human-level+ AIs around, it gets hard to oversee them. This seems true not only for humans but also for the superintelligence herself. As has been pointed out, the Singleton would have to be amazingly powerful if it can control (as it can, by definition) trillions of AIs. Wouldn't it be better to just keep the numbers low?