We triggered some other kind of apocalypse (nuclear war, bioweapons, something like that), severe enough to roll back progress but not to wipe out humanity. Thanks to the delay and the abrupt change in circumstances, people managed to come up with something better than what we have now. An "AI arms race" requires significant infrastructure to be economically viable, and the classic post-apocalypse scenario doesn't exactly involve training neural networks on supercomputers.
Maybe people had more time (and zero regulations) for genetic experiments and eugenics (which are simpler than building supercomputers, even in a post-apocalyptic world), or they realized the destructiveness of Moloch and learned to coordinate (hahaha), or something else entirely.