Is there a term for a generalization of existential risk that includes the extinction of alien intelligences or the drastic decrease of their potential? Existential risk, that is, the extinction of Earth-originating intelligent life or the drastic decrease of its potential, does not sound nearly as harmful if there are alien civilizations that become sufficiently advanced in place of Earth-originating life. However, an existential risk sounds far more harmful if it compromises all intelligent life in the universe, or if there is no other intelligent life in the universe to begin with. Perhaps this would make physics experiments more concerning than other existential risks, as even if their chance of causing the extinction of Earth-originating life is much smaller than that of other existential risks, their chance of eliminating all life in the universe may be higher.
I really like this distinction. The closest I’ve seen is discussion of existential risk from a non-anthropocentric perspective. I suppose the neologism would be panexistential risk.
Panexistential risk is a good, intuitive name.
I think the term is Great Filter.
G0W51 is talking about universal x-risk versus local x-risk. Global thermonuclear war would be relevant for the Great Filter, but doesn't endanger anyone else in the universe, whereas if Earth creates UFAI, that's bad for everyone in our light cone.
True. Also, the Great Filter is more akin to an existential catastrophe than to existential risk, that is, the risk of an existential catastrophe.