It presupposes a weird and unexplained situation in which AGIs are efficient enough to subjugate humanity, and yet so costly that they can't convert energy into work more efficiently by building robots than by feeding and caring for human workers.
The initial idea was that humans are essentially self-sustaining, and the AI would take over the natural environment that includes humans much as humans took over the natural environment that didn't include them.
1, 2: Suppose it is going into space to eat Jupiter, which has higher density and allows for less speed-of-light lag. It needs humans until it is established at Jupiter, after which it doesn't care.
The goal a self-improving system has may be something along the lines of "get smarter", and the various psychopathic entities commonly discussed here don't look like something that would work well as a distributed system with big lags.
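To make "big lags" concrete, here is a minimal back-of-the-envelope sketch in Python; the distance figures are rounded values I'm supplying for illustration (Earth-Jupiter separation varies with orbital positions), not anything from the original comment:

```python
# Back-of-the-envelope one-way light lag over some distances.
C = 299_792_458  # speed of light in m/s

def one_way_lag_seconds(distance_m: float) -> float:
    """One-way signal delay over a straight-line distance, in seconds."""
    return distance_m / C

# Rounded illustrative distances (assumed for this sketch).
cases = {
    "across a Jupiter-radius structure (~7.0e7 m)": 6.9911e7,
    "Earth to Jupiter, closest (~5.9e11 m)": 5.88e11,
    "Earth to Jupiter, farthest (~9.7e11 m)": 9.68e11,
}

for label, d in cases.items():
    lag = one_way_lag_seconds(d)
    print(f"{label}: {lag:,.1f} s ({lag / 60:.1f} min)")
```

On these assumed figures, a system confined to a Jupiter-radius volume has sub-second internal lag, while one spread across Earth-Jupiter distances waits roughly 33 to 54 minutes each way; that asymmetry is presumably what the comment is gesturing at.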
It doesn’t answer Kaj’s question.