This still feels more analogous to Chernobyl? “The other team is going to get cheap nuclear energy first if we don’t, and we prefer a nuclear accident to losing, so we might as well push ahead.”
You might argue that it obviously doesn't matter very much who gets nuclear energy first, so this wouldn't apply. I'd respond that the benefit-to-cost ratio here seems similar to the one for AI, where the benefit is "we build a singleton" and the cost is "misaligned AGI causes extinction". Surely it's significantly better for the other team to win and build a singleton than for you to build a misaligned AGI?
(Separately, I think I would argue that the “we build a singleton” case is unlikely, but that’s not a crucial part of this argument.)