The claim that utilitarianism implies one should do things with permanent effects comes from the future being much bigger than the present, while the probability of affecting it is smaller, but not nearly proportionally smaller.
I agree with your second paragraph.
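A rough way to see the arithmetic behind the expected-value claim in the first paragraph above: if the future's value is vastly larger while the chance of influencing it shrinks by much less, a future-directed action can still dominate in expectation. The numbers below are purely illustrative assumptions, not figures anyone in the thread gives.

```python
# Minimal sketch of the expected-value argument above, with made-up numbers.
# All figures are illustrative assumptions, not estimates from the discussion.

present_value = 1.0          # value at stake in the present (normalised)
future_value = 1.0e6         # the future is assumed vastly larger
p_affect_present = 0.5       # chance a present-directed action succeeds
p_affect_future = 0.5e-3     # chance of affecting the far future: much
                             # smaller, but not a million times smaller

ev_present = p_affect_present * present_value
ev_future = p_affect_future * future_value

print(f"EV of present-directed action: {ev_present:.3g}")
print(f"EV of future-directed action:  {ev_future:.3g}")
print(f"Ratio (future / present):      {ev_future / ev_present:.3g}")
# Because the drop in probability (~1000x) is far smaller than the gain in
# scale (~1,000,000x), the future-directed action dominates in expectation.
```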
Even granting that, it’s not obvious to me that society is underinvesting in risk fighting. Many of the suggestions for countering global warming, for example, imply reduced economic growth, and it is not obvious to me that the risks of catastrophic global warming outweigh the expected losses from reduced growth from a utilitarian perspective. Any investment in risk fighting carries an opportunity cost: a foregone investment in some other area. The right choice from a utilitarian perspective depends on judgements of expected risk versus the expected benefits of alternative courses of action, and I think the best choices are far from obvious.
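To make that trade-off concrete, here is a back-of-the-envelope sketch of the comparison being described: the expected loss from a catastrophe against the cumulative output lost to slower growth. Every probability, damage fraction, and growth rate below is a placeholder assumption; changing them flips the conclusion, which is exactly why the choice is not obvious.

```python
# A hedged sketch of the opportunity-cost comparison above. The probabilities,
# damages, and growth rates are placeholders chosen only to show the structure
# of the calculation, not estimates anyone in the thread endorses.

horizon_years = 100
baseline_output = 1.0            # world output today (normalised)
baseline_growth = 0.03           # assumed annual growth without mitigation
mitigated_growth = 0.028         # assumed annual growth with mitigation spending

p_catastrophe = 0.05             # assumed probability of catastrophic warming
catastrophe_damage_frac = 0.5    # assumed fraction of output lost if it occurs

def cumulative_output(growth_rate: float, years: int) -> float:
    """Total output summed over the horizon under constant growth."""
    return sum(baseline_output * (1 + growth_rate) ** t for t in range(years))

no_mitigation = cumulative_output(baseline_growth, horizon_years)
with_mitigation = cumulative_output(mitigated_growth, horizon_years)

# Expected loss if we don't mitigate: catastrophe risk applied to the larger path.
expected_catastrophe_loss = p_catastrophe * catastrophe_damage_frac * no_mitigation
# Cost of mitigating: the growth foregone over the whole horizon.
growth_cost = no_mitigation - with_mitigation

print(f"Expected loss from catastrophe risk: {expected_catastrophe_loss:.1f}")
print(f"Cumulative cost of slower growth:    {growth_cost:.1f}")
# Which side is larger depends entirely on the assumed inputs, which is the
# point: the utilitarian verdict hinges on contested empirical judgements.
```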
Wholly agree on global warming; the best reference I know of on extreme predictions is this. I’m thinking more of future technologies (the self-replicating and/or intelligent kind), but also of building up the general intellectual background and institutions to deal rationally with unknown unknowns.