“AI” is really all of mindspace except the tiny human dot. There’s an article about it around here somewhere. PLENTY of AIs are indeed correctly included in “us”, and indeed, unless things go horribly wrong, “what we now think of as humans” will go extinct and be replaced by these vast and alien things. Think of Daleks and GLaDOS and Cthulhu and the Babyeaters here. These are mostly about as close to friendly as most humans are, and we’re trusting humans to build the seed FAI in the first place.
Unfriendly AI is not like that. The process of evolution itself is basically a very stupid UFAI. Or a pandemic. Or the intuition pump in this article: http://lesswrong.com/lw/ld/the_hidden_complexity_of_wishes/ . Or even something like a supernova. It’s not a character, not even an “evil” one.
((Yeah, this is a gross oversimplification; I’m aiming mostly at causing true intuitions here, not true explicit beliefs. The phenomenon is related to metaphor.))