I can imagine a plausible scenario in which WW3 is a great thing, because both sides brick each other’s datacenters and bomb each other’s semiconductor fabs. Also, all the tech talent will be spent trying to hack the other side and will not be spent training bigger and bigger language models.
That only gives you a brief delay on a timeline which could, depending on the horizons you adopt, be billions of years long. If you really wanted to reduce s-risk in an absolute sense, you’d have to try to sterilize the planet, not set back semiconductor manufacturing by a decade. This, I think, is a project which should give one pause.
I imagine that WW3 would be an incredibly strong pressure, akin to WW2, forcing governments to finally sit up and take notice of AI.
And then spend several trillion dollars running Manhattan Project Two: Manhattan Harder, racing each other to be the first to get AI.
And then we die even faster, and instead of being converted into paperclips, we’re converted into tiny American/Chinese flags.
Missed opportunity to call it Manhattan Project Two: The Bronx.