Is this a “failed utopia” because human relationships are too sacred to break up, or is it a “failed utopia” because the AI knows what it should really have done but hasn’t been programmed to do it?
I think it’s a failed utopia because it involves the AI modifying the humans’ desires wholesale—the fact that it does so by proxy doesn’t change that it’s doing that.
(This may not be the only reason it’s a failed utopia.)
I don’t see how those are mutually exclusive.