Makes sense. I think we don’t disagree dramatically then.
I also think TAI is a less important category for me than x-risk inducing AI.
Also makes sense—just checking, does x-risk-inducing AI roughly match the concept of “AI-induced potential point of no return” or is it importantly different? It’s certainly less of a mouthful so if it means roughly the same thing maybe I’ll switch terms. :)
Um, sorta, modulo a type error… risk is risk. It doesn’t mean the thing has happened (we need to start using some phrase like “x-event” or something for that, I think).
I’ve started using the phrase “existential catastrophe” in my thinking about this; “x-catastrophe” doesn’t really have much of a ring to it though, so maybe we need something else that abbreviates better?