Why do some people become so enamored with the singleton scenario that they can’t settle for anything less? What’s wrong with humans using “smart enough” machines to solve world hunger and such, working out any ethical issues along the way, instead of delegating the whole task to one big AI?
It’s potentially dangerous, given the uncertainty about what exactly “smart enough” machines would be. If it’s not dangerous, go for it.
Settling for something less than a singleton won’t solve the problem of human-indifferent intelligence explosion.
If you think you need the singleton to protect you from some danger, what can be more dangerous than a singleton?
Another singleton, which is part of the danger in question.