Although, as I think I mentioned elsewhere, a scenario like this would only actually work if we had an AGI that could reliably judge who needed which resources, and when, in order to further the overall human endeavor.
Wouldn’t the AGI also need the ability to compel obedience to its diktats? Or do you imagine that everyone will do whatever it tells them to do because it must be the best thing to do?