It’s essentially for the same reason that Hollywood thinks aliens will necessarily be hostile. :-)
For the sake of argument, let’s treat AGI as a newly arrived intelligent species. It thinks differently from us and has different values. Historically, whenever there has been a large power differential between a native species and a new arrival, it has ended poorly for the native species. Examples include the genocide of Native Americans (same species, but with less advanced technology), and the wholesale obliteration of the overwhelming majority of non-human life on this planet.
That being said, there is room for a symbiotic relationship. AGI will initially depend on factories and electricity produced by human labor, and thus will necessarily be dependent on humans at first. How long this period will last is unclear, but it could settle into a stable equilibrium. After all, humans are moderately clever, self-reproducing computer repair drones: easily controlled by money, comfortable with hierarchy, and well adapted to Earth’s biosphere. They could be useful to keep around.
There is also room for an extensive ecology of many different superhuman narrow AIs, each of which can beat humans within a particular domain, but which generalize poorly outside of that domain. I think this hope is shrinking with time, though (see, e.g., Gato), and it is not necessarily a stable equilibrium.
The thing that seems clearly untenable is an equilibrium in which a much less intelligent species manages to subdue and control a much more intelligent species.