I think you haven’t really responded to Dagon’s key point here:
“what else do you have to make decisions with?”
You express concern that Caplan underestimates the importance of climate change. But suppose I think the risk of the Large Hadron Collider collapsing the false vacuum is a much bigger deal, and that any resources currently going to reduce or mitigate climate change should instead go to preventing false vacuum collapse. Both concerns have lots of unknown unknowns. On what grounds would you convince me, or a decision-maker controlling large amounts of money, to focus on climate change instead? Presumably you think the likelihood of catastrophic climate change is higher; on what basis?
Probabilistic models may get weaker as we move toward deeper uncertainty, but they’re what we’ve got, and we have to choose how to direct resources somehow. Even under level 3 uncertainty, we don’t always have the luxury of seeing a course of action that would be better in all scenarios (e.g., I think we clearly don’t in my example: if we’re in the climate-change-is-higher-risk scenario, we should put most resources toward climate; if we’re in the vacuum-collapse-is-higher-risk scenario, we should put them toward the vacuum instead).
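A minimal sketch of that last point, with entirely made-up numbers: if each allocation is the better choice in one scenario and the worse choice in the other, no dominance argument can settle things, and you’re forced back to some probability judgment over which scenario you’re in.

```python
# Hypothetical catastrophe probabilities (made up for illustration).
# Outer keys: which scenario is actually true; inner keys: allocation choice.
scenarios = {
    "climate-is-higher-risk": {"fund_climate": 0.01, "fund_vacuum": 0.10},
    "vacuum-is-higher-risk":  {"fund_climate": 0.10, "fund_vacuum": 0.01},
}

for allocation in ("fund_climate", "fund_vacuum"):
    # An allocation dominates if it is at least as good in every scenario.
    dominates = all(
        risks[allocation] <= min(risks.values())
        for risks in scenarios.values()
    )
    print(allocation, "dominates:", dominates)  # both print False
```

Since neither option dominates, choosing between them requires weighting the scenarios, which is exactly the probabilistic judgment the deep-uncertainty framing was supposed to let us avoid.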