This seems like exactly the set-up Bostrom has in mind when he talks about existential risks. We have only a small chance of colonising the galaxy and beyond, but that possibility carries a large share of our expected value. An event which prevents it would be a catastrophe.
Of course, many of the catastrophes that are discussed (e.g. most life being wiped out by a comet striking the Earth) also drastically reduce the value realised in the short term. But the category is normally taken to include getting stuck on a trajectory that prevents further progress, even if that future involves good lives for billions of people.