I think the current record suggests that Neanderthals were more individually intelligent than the anatomically modern humans who displaced them, though anatomically modern human society was obviously more capable than Neanderthal society (through a combination of superior communication ability, trade, and higher population density). [I think it’s quite likely that Neanderthals were less intelligent than humans alive today.]
We also have some evidence that we’ve selectively destroyed the intelligence of other animals as part of domestication: wolves perform better than dogs on causal reasoning tasks, of exactly the sort that would make a dog more obnoxious as a pet. (Many people want a ‘smart’ dog and then discover how difficult it is to get it to take its medicine, or to keep it out of the treat drawer, and so on.) So even if most animals around today are ‘dumb’, it may be because ‘we wanted it that way.’
When it comes to AI, though, I think the right analogy is not “individual human vs. individual AI” but “human civilization (with computers)” vs. “AI civilization”. It seems pretty clear to me that most of the things to worry about with AGI come from it gaining the ability to do ‘cultural accumulation’, at least with regard to technological knowledge, rather than from “it has a much higher working memory!” or other kinds of individual intelligence advantage. Through this lens, the superiority of artificial intelligence is mostly cultural superiority. (For example, this description of AI takeoff hinges not on superior individual ability, but on enhanced ‘cultural learning’.)
When it comes to AI, though, I think the right analogy is not “individual human vs. individual AI” but “human civilization (with computers)” vs. “AI civilization”. It seems pretty clear to me that most of the things to worry about with AGI come from it gaining the ability to do ‘cultural accumulation’, at least with regard to technological knowledge, rather than from “it has a much higher working memory!” or other kinds of individual intelligence advantage. Through this lens, the superiority of artificial intelligence is mostly cultural superiority.
I agree with this, but I’m mostly interested in this question because the abilities of AI culture depend quite heavily on the abilities of individual AI systems.
I don’t see why this is the case for AI systems in a different way than it is for humans.
The fact that an AI can easily clone specific instances of itself makes it much faster for it to spread culture. We humans can’t simply make 100,000 copies of Elon Musk, but if Elon were an AI, that would be easy.