Climate change is exactly the kind of problem that a functional civilization should be able to solve on its own, without AGI as a crutch.
Until a few years ago, we were doing a bunch of geoengineering by accident, and the technology required to stop emitting a bunch of greenhouse gases in the first place (nuclear power) has been mature for decades.
I guess you could have an AGI help with lobbying or public persuasion / education. But that seems like a very “everything looks like a nail” approach to problem solving, before you even have the supposed tool (AGI) to actually use.
We use fossil fuels for a lot more than energy, and there’s more to climate change than fossil fuel emissions. Energy use accounts for roughly 75% of emissions, and about 25% of oil goes to manufacturing rather than fuel. My impression is we are well past the levels of fossil fuel usage that would be consistent with reasonable warming. Furthermore, a lot of green energy will be a hard sell to developing nations.
Maybe replacing as much fossil fuel use with nuclear as is politically feasible reduces emissions, but does it reduce them enough? Current models [1] assume we deploy carbon capture technology at scale somewhere down the line, so things are looking dire.
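To make the arithmetic behind this concrete: taking the ~75% energy share from above at face value, here is a rough sketch of how much annual emissions would remain if nuclear displaced various fractions of energy-related emissions. The 50 GtCO2e total is an illustrative round number, not a sourced figure.

```python
# Back-of-envelope sketch of the point above: even fully decarbonizing
# energy leaves the non-energy ~25% of emissions untouched.
TOTAL_GTCO2E = 50.0   # illustrative annual global emissions, GtCO2e (assumption)
ENERGY_SHARE = 0.75   # share of emissions from energy use, per the comment

for displaced in (0.25, 0.50, 0.75, 1.00):
    # Emissions remaining if `displaced` fraction of energy emissions is
    # replaced by nuclear; non-energy emissions are unaffected.
    remaining = TOTAL_GTCO2E * (1 - ENERGY_SHARE * displaced)
    print(f"nuclear displaces {displaced:.0%} of energy emissions "
          f"-> {remaining:.1f} GtCO2e/yr remain")
```

Even in the limiting case where nuclear displaces all energy emissions, a quarter of the total remains, which is roughly why the models lean on carbon capture for the rest.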
It’s clear we have this idea that we will partially engineer our way out of this problem in time, and history does seem to support that. However, recent history had the advantage of constant population growth, with an emphasis on new ideas and entrepreneurship. Look at what happened to a country like Japan when its age pyramid shifted: the country got stuck with outdated technology as society restructured itself to take care of the elderly.
So I think any assumption of continued exponential technological progress is trend chasing. A lot of our growth curves almost require mass automation or AGI to work; without that, you probably get stagnation. Economists projected this back in 2015 [2], and not much seems to have changed since [3].
I think it’s fine to hold the opinion that the risk of AGI going wrong is higher than the risks from stagnation and other existential threats, but an unnecessarily rose-tinted view of progress isn’t accurate either: it can lead you to overestimate AGI risk relative to those other risks.
[1] https://newrepublic.com/article/165996/carbon-removal-cdr-ipcc-climate-change
[2] https://www.mckinsey.com/featured-insights/employment-and-growth/can-long-term-global-growth-be-saved
[3] https://www.conference-board.org/topics/global-economic-outlook