Why do you seem to imply that burning fossil fuels would help the odds of the long-term human project at all?
I don’t imply that. For clarification:
I would waste any number of resources if that was what was best for the long-term prospects of Humanity. In practice, that means that I’m willing to sacrifice really really large amounts of resources that we won’t be able to use until after we develop AGI or similar, in exchange for very very small increases to our probability of developing aligned AGI or similar.
Because I think we won’t be able to use significant portions of most of the types of resources available on Earth before we develop AGI or similar, I’m willing to completely ignore conservation of those resources. I still care about the side effects of the process of gathering and using those resources, but...
The oil example isn’t meant to reflect any affinity on my part for fossil fuels.
My point is that “super long term conservation of resources” isn’t a concern. If there are near term, non-“conservation of resources” reasons why doing something is bad, I’m open to those concerns- we don’t need to worry about ensuring that humans 100 years from now have access to fuel sources.
For the record, I think nuclear and solar seem to clearly be better energy sources than fossil fuels for most applications. Especially nuclear.
I’m also not playing defense for climate change activists- I don’t care how many species die out, unless those species are useful to us in the short term (the next 50 years, 100 years max?). If you want to make sure future humanity has access to Tropical Tree Frog #952, and you’re concerned about them going extinct, go grab some genetic samples and preserve them. If the species makes many humans very happy, provides us valuable resources, etc., fine.
At the current rate of fishing, all fish species could be practically extinct by 2050
I’m open to the notion that regulating our fish intake is the responsible move- it seems like a pretty easy sell. It keeps our fishing equipment, boats, and fishermen useful. I’m taking this action because it’s better for humanity, not because it’s better for the fish or better for the Earth.
The Strategy is not to excessively use resources and destroy the environment just because we can; it’s to actively and directly use our resources to accomplish our goals, which I doubt strongly aligns with preserving the environment.
Let’s list a few ways in which our conservation efforts are bad:
Long term (100+ years) storage of nuclear waste.
Protecting species which aren’t really useful to Humanity.
Planning with the idea that we will be living in the current technological paradigm, i.e. without artificial general intelligence, indefinitely (or for more than 100 years).
And in which they’re valid:
Being careful with our harvesting of easily depletable species which we’ll be better off having alive for the next 100 years.
Being careful with our effect on global temperatures and water levels, in order to avoid the costs of relocating large numbers of humans.
Being careful with our management of important freshwater reserves, at least until we develop sufficiently economical desalinization plants.
I personally don’t want to see my odds of survival diminish because I’ll have to deal with riots, food shortages, totalitarian fascist governments or… who knows?
The greatest risks to your survival (unless you’re a very exceptional person) are, by far, natural causes and misaligned artificial general intelligence. You shouldn’t significantly concern yourself with weird risk factors such as riots or food shortages unless you’ve already found that you can’t do anything about natural causes and misaligned artificial general intelligence. Spoiler: it seems you can do something about those two.
Every economic estimate I’ve seen said that the costs would be a lot less than the economic damage from climate change alone; many estimates agree that it would actually improve the economy, and nobody is saying “let’s toss industry and technology out of the window, back to the caves everyone!”.
Many people are saying things I consider dangerously close to “Let’s toss industry and technology out of the window!”. Dagon suggested that our current resource expenditure was reckless and that we should substantially scale it back. I consider this a seriously questionable perspective on the problem.
I’m not arguing against preserving the environment if it would boost the economy for at least the next 100 years, keeping in mind opportunity cost. I want to improve humanity’s generalized power to pursue its goals- I’m not attached to any particular short guiding principle for doing this, such as “Protect the Earth!” or “More oil!”. I don’t have Mad Oil Baron Syndrome.
Understood, I apologise for misunderstanding your position on fossil fuels. I feel I made a specific attempt on my side to read it that way, even though the example used didn’t necessarily imply it was something you endorse, and that this came from a negative gut reaction I had while reading what you wrote.
We seem to agree on the general principles that humanity’s technological level will not stay the same for the next hundred years, and that some of the changes we are producing in the environment should be avoided in order to improve mankind’s future condition.
I do feel that allowing the actions of humanity to destroy every part of the environment that hasn’t been proven useful is an extremely reckless form of optimism, though.
It’s certainly part of the attitude that got us to the point where managing our effect on global temperatures and avoiding the loss of most of our water resources has become a pretty difficult global challenge.
From what I’ve read on industrial regulations so far, in most nations pollutants functionally have to be proven harmful before forbidding their release into the environment can even be considered, and I’m 100% sure that’s at least the current approach in the country where most users of this site live.
All in all, our species is nowhere near the point of being immune to the feedback our environment can throw at us. Through our actions, one third of current animal and plant species are going extinct.
That is one huge Chesterton’s Fence we’re tearing down. We simply don’t know in how many ways such a change to the system we’re living in can go wrong for us.
I’d agree that the greatest “currently existing risks to my survival” are natural causes. By this category I mean “risks that are actively killing people who are living in conditions similar to my own right now”.
However, if we talk about the main “future risks to my survival”, as in “risks that currently kill a low number of people similar to me, but that could kill a lot more in future years in which I’ll be alive”, then I feel that even if AI mismanagement takes first place, climate change takes second, and that it considerably increases the chances of the first.
While riots and food shortages are indeed examples I chose by pure gut-level evaluation of “scariness”, and are too specific for me to put money on if I had to bet on the cause of my death, I don’t feel at all optimistic about the way our society will react to climate change.
Migration flows and violent conflicts far smaller than what we’ll certainly see were enough to bring the European Union dangerously close to falling apart into a bunch of nationalistic states. Change our environment enough, and wide-scale wars and a new wave of totalitarian governments stop being an unlikely prospect, since in times of fear and unrest people are more likely to regard the principles behind them as positive. All these factors seem to reinforce each other as far as I know.
Even assuming the situation won’t get as bad as total warfare and rampant totalitarianism, I would bet on a significant degeneration of the political scenario, moving away from international cooperation and toward nationalism and purely short-term interests, and I don’t really see any reason to think that a bunch of such states, fighting for resources, facing wide-scale crises, scared of what each other will do, and having lost most of their ability to cooperate, would be less likely to botch AI horribly and kill us all.
About the suggestion to lower our resource consumption since it’s currently too high: it’s hard to dispute that we are burning through a ridiculous amount of resources that produce practically no improvement in our chances of survival, or even marginally improve our quality of life. We could easily keep the same level of comfort and life expectancy while consuming a lot fewer resources.
Our economic system simply doesn’t have enough incentives for efficiency; shrinking our resource consumption without sacrificing quality of life or life expectancy is perfectly doable, and it’s imperative if we want to improve our chances of long-term survival.
Lastly, given the current trend of society, statements close to “keeping mankind’s consumption of resources and its impact on the environment in check is not a priority” are a lot more dangerous than statements close to “let’s toss industry out of the window and go back to the caves”. Clearly, going too far in either direction would hurt, but going too far in the first direction is a lot more likely at the present moment, while I don’t see any real chance of the second kind of statement moving society toward a pre-technological or pre-industrial state.
The de-growth movement (which, if I remember correctly, is based on the finding that economic growth, after a certain threshold, offers basically no improvement to quality of life, and that the first world has long passed that threshold, so we should focus on things other than economic growth) also doesn’t strike me as a threat to my quality of life or my long-term survival comparable to underestimating the impact of environmental damage, or of over-consumption of resources, before the point when mankind hits a positive singularity.
I also don’t see any real chance of this site moving toward an anti-technological or anti-science trend. Those trends do seem dangerous and likely in the general populace, but for the risks I’ve stated above I think they should be opposed by informing people about the benefits of technology and science, rather than the benefits of the industrial system.
Indeed, there is an active “degrowth” movement. cf. Giorgos Kallis: https://greattransition.org/publication/the-degrowth-alternative