Stagnation → reduced existential risk is not a well-established fact. Our ability to cope with technologies whose arrival stagnation would delay can decline as well as improve; stagnation will not affect all technologies equally and could skew the distribution for the worse; and other risks accumulate in the meantime. One should also be very cautious about framing progress as a problem, given the high chance of a false positive, rather than framing it as an investigative/research question.
might support a “Manhattan project” that dumps a trillion dollars into a scientific goal, increasing the risk of UFAI.
Why think that this increases the risk of UFAI, relative to the expected distribution of development in industry, academia, or nonprofits absent such a project?