Ok. That seems like a reasonable argument. So how much of a reduction is warranted may be up in the air then. There’s also a serious denotative v. connotative issue here, since one needs to carefully distinguish the actual statement “Climate scientists are likely overconfident just as almost everyone is” from all the statements made doubting climate science, anthropogenic global warming, etc. If you are only talking about a drop from .99 to .95 (or even from, say, .99 to .9) that isn’t going to impact policy considerations much.
If you are only talking about a drop from .99 to .95 (or even from say .99 to .9) that isn’t going to impact policy considerations much.
I think it matters when it comes to geoengineering policy making. If the policy community thinks that climate scientists are really good at predicting climate, I think there’s a good chance that they will sooner or later go for geoengineering.
If we want to stay alive in the next century, it would be good if policy makers could distinguish between events with 0.9, 0.99, and 0.999 certainty.
Even a 0.001 chance that a given asteroid will extinguish humanity is too high. It’s valuable to keep in mind that small chances happen from time to time and that you have to do scenario planning that integrates them.
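To make the 0.9/0.99/0.999 point concrete, here is a minimal sketch in Python (the annual-risk figures are made up for illustration, not taken from the discussion): if a catastrophe has a small independent chance of happening each year, the per-year confidence level translates into very different odds of getting through a century unscathed.

```python
# Minimal sketch: how per-year confidence (0.9 / 0.99 / 0.999 of "nothing happens")
# compounds over a century. Annual-risk numbers are hypothetical.

def prob_at_least_one(p_per_year: float, years: int) -> float:
    """Chance that an event with annual probability p_per_year occurs
    at least once over `years` years, assuming independent years."""
    return 1 - (1 - p_per_year) ** years

for annual_risk in (0.1, 0.01, 0.001):  # i.e. 0.9, 0.99, 0.999 per-year confidence of safety
    p = prob_at_least_one(annual_risk, 100)
    print(f"annual risk {annual_risk}: P(at least once in 100 years) = {p:.3f}")
```

Running this gives roughly 0.095 for an annual risk of 0.001, about 0.63 for 0.01, and about 0.99997 for 0.1, which is why lumping 0.99 and 0.999 together as “basically certain” hides a lot when the stakes are existential.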
Sure, but right now, almost no one is talking about geoengineering as a serious solution. The policy focus right now is much more on reducing carbon dioxide production. So in the context where these discussions are occurring, these differences won’t matter much. Right now, I’d focus much more on getting policy makers to be able to reliably distinguish something like 0.9 from something like 0.1. On this particular issue, even that is apparently difficult. Getting them to appreciate an extremely rough estimate is a much higher policy priority.
Sure, but right now, almost no one is talking about geoengineering as a serious solution.
That’s a very poor perspective when you care about existential risk. Memes do have effects 10 or 20 years down the road.
It’s bad to say things that are clearly false, like that the evidence for climate change is comparable to that for evolution. Evolution being true is something with much better evidence than p = 0.999.
The point of LessWrong isn’t to focus on ideas with short-term considerations. It’s rather to focus on finding methods to think rationally about issues. It’s about letting a lot of people in their twenties learn those methods. Then, when those smart people are in positions of authority in their thirties or forties, you get a payoff.
If scientists lie to the world to get policy makers to make good short-term policy decisions, that’s expensive over the long term. Scientists shouldn’t orient themselves toward short-term decision making but should focus on staying with the truth.