E.g. AI regulation (like most technology regulation) is only effective if you get the whole world on board, and without global coordination there's the potential for arms races.
"Only develop an FAI" also presumes a hard takeoff, and it's far from established that we'll have one.