Superintelligence
To me, superintelligence implies qualitatively much smarter than the best humans. I don’t think this is needed for AI to be transformative. Fast and cheap-to-run AIs which are as qualitatively smart as humans would likely be transformative.
Agreed—I thought you wanted that term as a replacement for “AGI” in the sense the OP said it is being used in relation to x-risk.
In terms of “fast and cheap and comparable to the average human”—well, then for a number of roles and niches we’re already there.
Sticking with the intent behind your term, maybe “generally transformative AI” is a more accurate replacement for the colloquial “AGI”?
Oh, by “as qualitatively smart as humans” I meant “as qualitatively smart as the best human experts”.
I also maybe disagree with “for a number of roles and niches we’re already there.” Or at least the % of economic activity covered by this still seems low to me.
On “as qualitatively smart as the best human experts”: I think that is more comparable to saying “as smart as humanity.” No individual human is as smart as humanity in general.