So the following, for example, don’t count as “existential risk caused by AGI”, right?
many AIs:
- an economy run by advanced AIs amplifying negative externalities, such as pollution, leading to our demise
- an em world with minds evolving to the point of no longer being valuable ("a Disneyland without children")
- a war waged by transcending uploads
narrow AI:
- a narrow AI killing all humans (e.g., by designing grey goo or a virus)
- a narrow AI eroding trust in society until it breaks apart
an AGI as an intermediate cause, but not the ultimate cause:
- a simulation shutdown because our AI lacked a decision theory that supports acausal cooperation
- an AI convincing a human to destroy the world