My opinion is that the ship has already sailed; AI timelines are so short, and the path to AGI so 'no agency required', that even a significant decrease in agency worldwide would not really buy us much time at all, if any.
The mechanistic theory behind “fight fire with fire” is all the usual stories for how we can avoid AGI doom by e.g. doing alignment research, governance/coordination, etc.