Before and during the event, there was a high probability P of humanity going extinct. That is equivalent, at the time, to the loss of a fraction P of all future expected value. Expected value is always about the future; that’s why it’s not actual value.
(Also, I think on some many-worlds theories utility was actually lost due to humanity surviving in less measure.)
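The arithmetic behind this claim can be sketched in a few lines. This is a minimal illustration with made-up numbers; V and P below are hypothetical placeholders, not figures from the thread:

```python
# Illustrative sketch (assumed numbers): if the future is worth V in
# expectation and the event carries extinction probability P, then the
# expected value accounting for the risk is (1 - P) * V, so a fraction
# P of all future expected value is lost at the time of the event.
V = 1.0   # normalized expected value of the whole future (assumed)
P = 0.25  # hypothetical probability of extinction during the event

ev_with_risk = (1 - P) * V   # expected value given the risk
loss = V - ev_with_risk      # expected value lost

print(loss)      # absolute loss
print(loss / V)  # fraction lost; equals P
```

The point is that the loss is an expectation computed from before the event, which is why surviving the event does not retroactively undo it.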
Can you name any past existential (or nearly so) catastrophes?
Toba. Approx. 1000 breeding pairs of humans survived.
Did the event “cause the loss of most expected value”? Looking around, I’m not so sure.
It’s a good example of extinction risk, but doesn’t seem to fit the (iii) definition well.
> Before and during the event, there was a high probability P of humanity going extinct. That is equivalent, at the time, to the loss of a fraction P of all future expected value. Expected value is always about the future; that’s why it’s not actual value.
>
> (Also, I think on some many-worlds theories utility was actually lost due to humanity surviving in less measure.)
Looking from before the event, true. Fair point.