And that increases the odds we’ll survive until a singularity.
How does it substantially impact that probability?
One of the ways we can kill ourselves is global warming. Replacing coal power with solar power will reduce one of the causes of global warming—namely, the greenhouse gases emitted from coal plants.
How likely is global warming to be an existential threat? It seems unlikely. Global warming may well contribute to existential risk in a marginal fashion, if it forces fewer resources to be spent on existential-risk issues or makes war more likely, but that is a much more roundabout route, and by that logic many other technologies would fall into the same category.
It depends what you mean by an existential threat.
I think there’s a reasonable chance that global warming, combined with other factors (biosphere degradation, resource depletion, unsustainable farming, lack of fresh water, increasing war over increasingly limited resources, etc.), may cause our current civilization to collapse.
If our civilization collapses, what are the odds that we’ll recover and eventually get back to where we are now? I don’t know, but if our civilization collapses and we’re left without modern tools, in a world in the middle of an ongoing mass extinction that we started, things start to look really dodgy. In any case, we don’t know what percentage of intelligent species go from being merely intelligent to developing advanced technology; it could be that we passed The Great Filter only in the past 200 years or so (say, with the start of the industrial revolution), in which case losing that advance and passing back through the filter in the other direction would dramatically lower our chances of becoming a space-faring civilization.
Of course, if we reach a sufficiently high level of technology before the other problems I talked about kick in, then they’re all solvable.
That doesn’t seem even remotely likely; as I understand it, the Earth has been much hotter than now many times without turning into Venus.
It doesn’t have to turn the Earth into Venus. Unusually rapid climate shifts can badly destabilize geopolitics, exceed the design tolerances of infrastructure, and consume an ever-growing share of the economy (whether through increased loss and waste or through mitigation efforts), and thereby effectively amplify many other forms of global X-risk. Or they can contribute directly to X-risk through numerous small factors, none of which would by itself overwhelm the system, but which collectively undermine it.