Perhaps a nuclear war today would reduce the probability of human extinction within this century. It appears that AGI is close, and there has been no substantial progress in AI Safety. A nuclear war would, I believe, cause a major slowdown in AI progress, increasing the probability of ending up with an aligned AI.
I enjoyed reading this silver-lining comment. :)
Honest opinion