What does seem obvious is that from a utilitarian perspective society is hugely underinvesting in risk-fighting and everything else with permanent effects.
That’s not obvious to me, and even if it were, I don’t take a utilitarian perspective.
If you think there is underinvestment in risk fighting, you have to come up with arguments that don’t rely on a utilitarian perspective, since most people don’t take that perspective when making decisions. Or you can try to find ways of increasing investment that don’t rely on persuading large numbers of people.
The claim that utilitarianism implies one should do things with permanent effects comes from the future being much bigger than the present, while the probability of affecting it is smaller but not nearly proportionally smaller.
I agree with your second paragraph.
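The arithmetic behind that expected-value claim can be sketched with illustrative numbers. These figures are entirely made up for the sake of the argument, not estimates anyone in this exchange endorses:

```python
# Hypothetical expected-value comparison between a present-focused action
# and a far-future-directed one. All numbers below are illustrative only.

present_value = 1.0   # value at stake in a present-focused action
future_value = 1e6    # the future taken as "much bigger than the present"

p_present = 0.5       # chance the present-focused action succeeds
p_future = 1e-4       # chance of affecting the far future: smaller,
                      # but not proportionally smaller (1e-4 vs 1e-6)

ev_present = p_present * present_value  # 0.5
ev_future = p_future * future_value     # 100.0

# So long as the probability shrinks more slowly than the stakes grow,
# the future-directed action dominates in expectation.
print(ev_future > ev_present)  # True under these assumed numbers
```

The conclusion is sensitive to the assumed probability: if `p_future` shrank in proportion to the stakes (to 1e-6 here), the two expected values would be comparable and the argument would not go through.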
Even granting that, it’s not obvious to me that society is underinvesting in risk fighting. Many of the suggestions for countering global warming, for example, imply reduced economic growth. It is not obvious to me that the risks of catastrophic global warming outweigh the expected losses from reduced growth from a utilitarian perspective. Any investment in risk fighting carries an opportunity cost in a foregone investment in some other area. The right choice from a utilitarian perspective depends on judgements of expected risk vs. the expected benefits of alternative courses of action. I think the best choices are far from obvious.
Wholly agree on global warming; the best reference I know of on extreme predictions is this. I’m thinking more of future technologies (the self-replicating and/or intelligent kind), but also of building up the general intellectual background and institutions to deal rationally with unknown unknowns.