Rather than needing us, it might be that self-improvement is costly, and since humans can understand complex commands and can be infected with memes rather easily, an AI may simply start a religion or some such to turn us all into its willing slaves.
Edit: Why is this being downvoted? I am not saying this would be a good outcome! People who want to “coexist” with superintelligences ought to learn that this could be, and probably would be, worse than outright destruction.
Edit: Well said, Carl. I thought of it in response to another comment and couldn’t get it out of my head.
It doesn’t answer Kaj’s question.
It presupposes a weird and unexplained situation in which AGIs are so efficient that they can subjugate humanity, and yet so constrained that they can’t convert energy into work more efficiently by building robots than by spending it on the care and feeding of human workers.
The initial idea was that humans are essentially self-sustaining, and that the AI would take over the natural environment with humans in it, just as humans took over the natural environment that existed without them.
1, 2: suppose it is going into space to eat Jupiter, which offers a higher density of matter and less speed-of-light lag. It needs humans until it is established at Jupiter, after which it doesn’t care.
The goal a self-improving system has may be something along the lines of ‘get smarter’, and the various psychopathic entities commonly discussed here don’t look like something that would work well as a distributed system with big lags.