G0W51 is talking about universal x-risk versus local x-risk. Global thermonuclear war would be relevant to the Great Filter, but it doesn’t endanger anyone else in the universe, whereas if Earth creates UFAI, that’s bad for everyone in our light cone.
True. Also, the Great Filter is more akin to an existential catastrophe itself than to existential risk, i.e., the risk of such a catastrophe.