Your initial suggestion, “launch nukes at every semiconductor fab”, is not workable.
In what way is it not workable? Perhaps we have different intuitions about how difficult it is to build a cutting-edge semiconductor facility? Alternatively, you may disagree with me that AI progress is largely hardware-bound, and thus that cutting off the supply of new compute would also prevent the rise of superhuman AI?
Do you also think that "the US president launches every nuclear weapon at his command, causing nuclear winter" would fail to prevent the rise of superhuman AGI?