Perhaps it’s mainly a matter of perceptions, where “AI risk” typically brings to mind a particular doomsday scenario instead of a spread of possibilities that includes posthuman value drift. That isn’t helped by the fact that around here we talk much more about UFAI going FOOM than about the other scenarios. Given this, do you think we should perhaps favor phrases like “Singularity-related risks and opportunities” where appropriate?
I have the opposite perception, that “Singularity” is worse than “artificial intelligence.” If you want to avoid talking about FOOM, “Singularity” has more of that connotation than “AI” does, in my perception.
I’m also not sure exactly what you mean by the “single scenario” getting privileged, or where you would draw the lines. In the Yudkowsky-Hanson debate and elsewhere, Eliezer talked about many separate posthuman AIs coordinating to divvy up the universe without giving humanity or humane values a share, about monocultures of seemingly separate AIs with shared values derived from a common ancestor, and so forth. Scenarios in which whole brain emulations come first and then invent AIs that race ahead of the WBEs were also discussed.
> I have the opposite perception, that “Singularity” is worse than “artificial intelligence.”
I see… I’m not sure what to suggest then. Anyone else have ideas?
> I’m also not sure exactly what you mean by the “single scenario” getting privileged, or where you would draw the lines.
I think the scenario that “AI risk” tends to bring to mind is a de novo or brain-inspired AGI (excluding uploads) rapidly destroying human civilization. Here are a couple of recent posts along these lines that use the phrase “AI risk”:
utilitymonster’s What is the best compact formalization of the argument for AI risk from fast takeoff?
XiXiDu’s A Primer On Risks From AI
ETA: See also lukeprog’s Facing the Singularity, which talks about this AI risk and none of the others you consider to be “AI risk”.
“Posthumanity” or “posthuman intelligence” or something of the sort might be an accurate summary of the class of events you have in mind, but it sounds a lot less respectable than “AI”. (Though maybe not less respectable than “Singularity”?)
How about “Threats and Opportunities Associated With Profound Sociotechnological Change”, maybe shortened to “future-tech threats and opportunities” in informal use?