My personal opinion: climate change (and more directly, conflict caused or exacerbated by it) is the single biggest risk to human-like intelligence flourishing in the galaxy, and very likely a large component of the Great Filter.
I don’t think the idea of the Great Filter fits very well here. The Great Filter would have to be something so universal that it eliminates ~100% of all civilizations. Climate change seems conditional on so many factors specific to Earth, e.g. carbon-based life, greenhouse gas effects, an interdependent civilization, etc., that it doesn’t work well as a factor that eliminates nearly all civilizations at a specific level of development.
My suspicion is that it generalizes well beyond the specific mechanisms of greenhouse gases or temperature ranges. The gap between “able to manipulate a civilization’s environment at scale” and “able to modulate resource use so as not to destroy that civilization”, combined with “over-optimization for a given environment rendering a nascent civilization extremely vulnerable to changes in that environment”, could easily be a universal problem.
It’s the fragility that worries me most—I believe that if we could remain calm and coordinate the application of mitigations, we could make it through most of the projected changes. But I don’t believe that we CAN remain calm—I suspect (and fear) that humans will react violently to any significant future changes, and our civilization will turn out to be much, much easier to destroy than to maintain.
Regardless of whether it’s universal, that’s the x-risk I see to our brand of human-like intelligent experience. Not climate change directly, but war and destruction over how to slow it down, and over who gets the remaining nice bits as it gets worse.
An angle that’s interesting (though only tangentially connected with climate change) is how civilizations deal with waste heat.