This solves nothing that could not be better solved by freezing development of hardware, which would also slow down evolutionary setups.
This also allows more time for safer approaches, such as genetic engineering and other biological advances, to catch up and keep us from Killing Everyone.
If your argument is that a race of genetically engineered super-humans is less likely to cause human extinction than GPT-5, the Neanderthals would like to have a word with you.
Manifold currently estimates that there is a 4% chance GPT-5 will destroy the world. What percent chance do you estimate there is that a genetically engineered race of super-humans will cause human extinction?
But notably, we have not killed all biological life, and we are substantially Neanderthal. Compared with death by AI, it's a far better prospect.