“The set of human minds might possibly be outside of the set of possible Strong AI minds”
Uh, you know what the ‘A’ in ‘Strong AI’ stands for, don’t you?
You may choose to ignore the etymology of the term and include humans in the set of Strong AIs, but that’s not the generally used definition, and I’m sure that the original poster, the poster I responded to, and pretty much everybody else on this thread were referring to non-human intelligences.
Therefore, my point stands: if you were to exactly replicate all of the features of a human, you would have created a human, not a non-human intelligence.
If I replicate the brain algorithm of a human, but I do it in some other form (e.g. as a computer program, instead of using carbon-based molecules), is that an “AI”?
If I make something very, very similar, but not identical, to the brain algorithm of a human, but I do it in some other form (e.g. as a computer program, instead of using carbon-based molecules), is that an “AI”?
It’s a terminology discussion at this point, I think.
In my original reply my intent was “provided that there are no souls/inputs from outside the universe required to make a functioning human, then we are able to create an AI by building something functionally equivalent to a human, and therefore strong AI is possible”.
“If I replicate the brain algorithm of a human, but I do it in some other form (e.g. as a computer program, instead of using carbon-based molecules), is that an “AI”?”
Possibly, that’s a borderline case.
“If I make something very, very similar, but not identical, to the brain algorithm of a human, but I do it in some other form (e.g. as a computer program, instead of using carbon-based molecules), is that an “AI”?”
“In my original reply my intent was “provided that there are no souls/inputs from outside the universe required to make a functioning human, then we are able to create an AI by building something functionally equivalent to a human, and therefore strong AI is possible”.”
Even if humans are essentially computable in a theoretical sense, it doesn’t follow that it is physically possible, under practical constraints, to build something functionally equivalent on a different type of hardware. Think of running Google on a mechanical computer like Babbage’s Analytical Engine.
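The gap between "computable in principle" and "buildable in practice" can be made concrete with a rough back-of-envelope sketch. Both throughput figures below are loose, commonly cited order-of-magnitude assumptions, not measurements:

```python
# Back-of-envelope: why substrate matters practically, even if minds are computable.
# Assumptions (rough, commonly cited estimates, not measurements):
#   - Babbage's Analytical Engine: about one arithmetic operation every ~3 seconds
#   - A human brain: on the order of 1e16 synaptic events per second

engine_ops_per_sec = 1 / 3    # assumed Analytical Engine throughput
brain_ops_per_sec = 1e16      # assumed brain "operation" rate

# Engine-seconds required to emulate one second of brain activity
slowdown = brain_ops_per_sec / engine_ops_per_sec

seconds_per_year = 3600 * 24 * 365
years_per_brain_second = slowdown / seconds_per_year

print(f"Slowdown factor: {slowdown:.1e}")
print(f"Engine-years per simulated brain-second: {years_per_brain_second:.1e}")
```

On these assumptions, simulating a single second of brain activity would take the Engine on the order of a billion years, which is the sense in which an algorithm can be computable in theory yet physically out of reach on the wrong hardware.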