You’re citing the Turing Test as a criterion and accusing us of anthropomorphizing AI?
I think an AI might become intelligent enough to destroy the human species and still be unable to pass the Turing Test, the same way we don’t need to mimic whales or apes to be able to kill them.
It’s not us who are anthropomorphizing AI; it’s you who are anthropomorphizing “intelligence that rivals and eventually surpasses the human intelligence both in magnitude and scope”.