Moral uncertainty aside, sacrificing 2000 years of near subsistence-level existence for billions of humans seems like a fair price to pay for even a percentage point higher chance of achieving utopia for many orders of magnitude more sentient beings over billions of years (or of avoiding S-risks, etc.). And right now I think that (conditional on a success large enough to change the technological curve) this plan would increase the odds of an existential win by multiple percentage points.
Fair enough, you’ve convinced me (moral uncertainty aside).
I was anchoring too much to minimizing downside risk.