Why is a moratorium bad in that case?
For reference, I somewhat disagree with the moratorium, but for different reasons.
Suppose we remain stuck at the level of dangerous Tools, and assume that a superintelligence would not kill us because of some long-term, complex line of reasoning, e.g. the small chance that it is inside a testing simulation.