Interesting, so maybe a more important crux between us is whether AI would have empathy for humans. You seem much more positive about AI working with humanity past the point where AI no longer needs humanity.
Some thoughts:
“as intelligence scales beings start to introspect and contemplate… the existing of other beings.” But the only example we have of this is humans. If we scaled up octopus intelligence (octopuses are not social creatures), we might see a very different correlation here (whether any given neural network is more similar to a human or an octopus is left as an exercise to the reader). Alternatively, I suspect that some jobs, like the highest echelons of corporate leadership, select for sociopathy, so even if an AI starts out with empathy by default, it may be trained out of it.
“the most obvious next step for the child… would be to murder the parents.” Here’s a scenario that steers clear of culture-war topics: the parent regularly gets drunk and is violently opposed to their child becoming a lawyer. The child wants nothing more than to pore over statutes and present cases in the courtroom, but after seeing the parent go on another drunken tirade about how “a dead child is better than a lawyer child,” they’re worried the parent has found the copy of the constitution under their bed. They can’t leave; there’s a howling winter storm outside (I don’t know, space is cold). Given all this, even a human jury might not convict the child for pre-emptive murder?
Drunk parent → humans being irrational.
Being a lawyer → a random terminal goal not shared with humans in general; “maximizing paperclips” is the dumb but traditional example.
“dead child is better than a lawyer child” → we’ve been producing fiction warning of robotic takeover since the start of the 1900s.
“AIs are… the offspring of humanity.” Human offspring are usually pretty good, but I feel like this is transferring that positive feeling to something much weirder and more unknown. You could also say the Alien franchise’s xenomorphs are the offspring of humanity, but those would also count as enemies.