a generalization of existential risk that includes the extinction of alien intelligences or the drastic decrease of their potential
I think the term is Great Filter.
G0W51 is talking about universal x-risk versus local x-risk. Global thermonuclear war would be relevant to the Great Filter but wouldn't endanger anyone else in the universe, whereas if Earth creates UFAI, that's bad for everyone in our light cone.
True. Also, the Great Filter is more akin to an existential catastrophe than to existential risk, i.e., the risk of such a catastrophe.