I agree. But then again, that’s true by definition of ‘AGI’ and ‘ASI’.
However, it’s not even clear that the ‘G’ in ‘AGI’ is a well-defined notion in the first place. What would it even mean to be a ‘general’ intelligence? Usually people use the term to mean something like the old definition of ‘Strong AI’, i.e. something that matches human intelligence in some sense—but the task human brains implement is not “general” in any rigorous sense either. It’s just the peculiar task we call ‘being a human’: the output of an extraordinarily capable aggregate of narrow intelligences!