I’ve heard arguments for and against “it might turn out to be too hard the second time around”. On balance I think it’s more likely than not that we would eventually succeed in rebuilding a technological society, but that’s the strongest claim I could make, i.e. it’s very plausible that we would never do so.
If enough of our existing thinking survives, the extra thinking time that rebuilding civilization would buy us might shift things a little in our favour with respect to AI++, MNT, etc. I don’t know which side does better on this tradeoff. However, I seriously doubt that trying to bring about the collapse of civilization is the most efficient way to mitigate existential risk.
Also, and I hate to be this selfish about it but there it is, if civilization ends I definitely die either way, and I’d kind of prefer not to.