I think something else is going on. The responses to this question about the feasibility of strong AI mostly stated that it was possible, though selection bias is probably largely at play, as knowledgeable people would be more likely to answer than the ignorant would be.
Surely AI is a concept that’s more and more present in Western culture, but only as fiction, as far as I can tell. No man in the street takes it seriously, as in “it’s really starting to happen”. Possibly the media are paving the way for a change in that, as the surge of AI-related movies seems to suggest, but I would bet it’s still an idea very far from their realm of possibilities. Also, even once the reality of an AI were established, it would still be a jump to believe in the possibility of an intelligence superior to a human’s, a leap that for me is tiny but that for many I suspect would not be so small (self-importance and all that).
One answer could be that people don’t really think that a superintelligence is possible. It doesn’t even enter into their model of the world.
Like this? https://youtube.com/watch?v=xKk4Cq56d1Y
But other than self-importance, why don’t people take it seriously? Is it otherwise just due to the absurdity and availability heuristics?