What? I don’t follow. Are you saying it would be a much better world if an unfriendly AI replaced humanity? I don’t think it’s luddite-ish to say I’d rather not die so something else can take my place.
I’d agree to an “unfriendly” AI (whatever that means… it shouldn’t reason emotionally, it should just be sufficiently intelligent) replacing humanity, since we are the problem we’re trying to solve. We feel pain, we suffer, we are stupid, susceptible to countless diseases, we aren’t very happy or fulfilled, etc. Eventually we’ll all need to be either corrected or replaced. An old computer can only take so many software updates before it becomes incompatible with newer operating systems, and that is our eventual fate. In my view, it is not logical to be against our own demise.