Ah, OK, I didn’t read carefully enough: you specified that somehow “solving” climate change would reduce Pr(extinction due to nuclear winter) by 9%. I agree that in that case you’re right. But now that I understand what scenario you’re proposing, it seems like a really weird one, because I can’t imagine what sort of real-world “solution” to climate change would have that property. Maybe the discovery of some kind of weather magic that let us adjust weather and climate arbitrarily would do it, but the actual things we might do to help with climate change are all more specific and limited than that; scarcely anything that reduces the danger from global warming would help much with nuclear winter.
So I’m not sure how this (to my mind super-improbable) hypothetical scenario, where work on climate change would somehow address nuclear winter along with global warming, tells us anything about the actual world we live in, where surely that wouldn’t be the case.
Am I still missing something important?
I think the story of how mitigating climate change reduces the risk of first-order effects from nuclear war is not that it helps us survive nuclear winter, but that climate change leads to things like refugee crises, which in turn lead to worse international relations and a higher chance of nuclear weapons being used; hence mitigating c/c lowers the chance of nuclear winter occurring.
The 1%/9% numbers were meant to illustrate the principle rather than be realistic, but if you told me something like: there’s a 0.5% contribution to x-risk from c/c via first-order effects, and a total 5% contribution via increased risk from AI, bio-terrorism, and nuclear winter (all of which are plausibly exacerbated by political instability), that wouldn’t sound obviously unreasonable to me.
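To make that accounting concrete, here’s a minimal sketch of the single-bucket arithmetic. The 0.5% and 5% figures are the hypothetical ones from above; the split of the 5% across channels, and the assumption that small contributions add roughly linearly, are illustrative simplifications rather than claims from the discussion.

```python
# Toy single-bucket accounting for climate change's contribution to x-risk.
# Assumes the channels are small and roughly independent, so their
# contributions add approximately linearly.

first_order = 0.005  # 0.5%: contribution from c/c's direct (first-order) effects

# Higher-order channels: c/c -> political instability -> increased risk in
# other categories. The split of the 5% total is invented for illustration.
higher_order = {
    "AI": 0.02,
    "bio-terrorism": 0.015,
    "nuclear winter": 0.015,
}

bucket_total = first_order + sum(higher_order.values())
print(f"first-order only:    {first_order:.1%}")   # 0.5%
print(f"single-bucket total: {bucket_total:.1%}")  # 5.5%
```

On these numbers, a prioritization decision that looked only at the 0.5% first-order figure would understate c/c’s total contribution by an order of magnitude, which is the point of putting everything in one bucket.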
The concrete claims I’m defending are that
insofar as they exist, n-th order contributions to x-risk matter roughly as much as first-order contributions; and
it’s not obvious that they don’t exist or that they’re negligible.
I think those are all you need to see that the single-category framing is the correct one.
OK, so it turns out I misunderstood your example in two different ways. In addition to the error discussed above, I made the rookie mistake of assuming that when you described nuclear war leading to nuclear winter (which surely is a variety of anthropogenic climate change), the latter was the “climate change” you meant. Oh well.
So, I do agree that if climate change contributes to existential risk indirectly in that sort of way (while still being the same kind of climate change whose direct effects we worry about), then yes, that should go in the same accounting bucket as the direct effects. Yay, agreement.
(And I think we also agree that cases where other things such as nuclear war produce other kinds of climate change should not go in the same accounting bucket, even though in some sense they involve climate change.)
Yes on both.
This conversation is sort of interesting on a meta level. Turns out there were two ways my example was confusing, and neither of them occurred to me when I wrote it. Apologies for that.
I’m not sure if there’s a lesson here. Maybe something like ‘the difficulty of communicating something isn’t strictly tied to how simple the point seems to you’ (that was kind of the issue: I thought what I was saying was simple, hence easy to understand, hence there was no need to think much about which examples to use). Or maybe just: always think for some minimum amount of time, since one tends to underestimate the difficulty of communication in general. In retrospect, it sure seems stupid to use nuclear winter as an example of a second-order effect of climate change, when the connection between winter and climate is totally coincidental to the point I was making.
It’s somewhat consoling that we at least managed to resolve one misunderstanding per back-and-forth message pair.