You’re careful here to talk about transformative AI rather than AGI, and I think that’s right. GPT-N does seem like it stands to have transformative effects without necessarily being AGI, and that is quite worrisome. I think many of us expected to find ourselves in a world where AGI was primarily what we had to worry about, and instead we’re in a world where “lesser” AI is on track to be powerful enough to dramatically change society. Or at least, so it seems from where we stand, extrapolating the trends.
Why do you think “lesser” AI being transformative is more worrying than AGI? This scenario seems similar to past technological progress.
I didn’t say GPT-N is more worrying than AGI. I’m saying I’m surprised that in the near term we have to worry about GPT-N in a way that I (and, I think, many others) expected to apply only to things we would all agree were AGI.
I see, thanks for clarifying!