People may be biased towards thinking that the narrow slice of time they live in is the most important period in history, but statistically this is unlikely.
If people think that something will cause the apocalypse or bring about a utopian society, historically speaking they are likely to be wrong.
Part of the problem with these two is that whether an apocalypse happens often depends on whether people took the risk of it happening seriously. We absolutely could have had a nuclear holocaust in the ’70s and ’80s; one of the reasons we didn’t is that people took the threat seriously and took steps to avert it.
And, of course, whether a time slice was the most important in history will depend, in retrospect, on whether the apocalypse actually happened. The ’70s would have seemed a lot more momentous if we had launched all of our nuclear warheads at each other.
For my part, my bet would be on something like:
O. Early applications of AI/AGI drastically increase human civilization’s sanity and coordination ability, enabling humanity to solve alignment, slow the further descent into AGI, etc. (Not in principle mutually exclusive with the other answers.)
But more specifically:
P. Red teams evaluating early AGIs demonstrate the risks of misalignment in a very vivid way: they show, in simulation, dozens of ways in which the AGI would try to destroy humanity. This has an effect on world leaders similar to that of observing nuclear tests: it scares everyone into recognizing the risk, and everyone stops improving AGI’s capabilities until they’ve figured out how to keep it from killing everyone.
What, exactly, is this comment intended to say?
Sorry—that was my first post on this forum, and I couldn’t figure out the editor. I didn’t actually click “submit”, but accidentally hit a key combo that it interpreted as “submit”.
I’ve edited it now with what I was trying to get at in the first place.