I don’t think of it as “AGI” or “human-level” being an especially bad term—most category nouns are bad terms (like “heap”), in the sense that they’re inherently fuzzy gestures at the structure of the world. It’s just that in the context of 2024, we’re now inside the fuzz.
A mile away from your house, “towards your house” is a useful direction. Inside your front hallway, “towards your house” is a uselessly fuzzy direction—and a bad term. More precision is needed because you’re closer.
I also like “transformative AI.”
This is an excellent short mental handle for this concept. I’ll definitely be using it.