For me it’s because:
Nukes seem like an obvious x-risk
People mostly seem to agree that we haven’t done a good job coordinating around them
They seem a lot easier to coordinate around
Also, not a reason, but:
AI seems likely to be weaponized, and warfare (whether conventional or not) seems like one of the areas where we should be most worried about “unbridled competition” creating a race-to-the-bottom on safety.
TBC, I think climate change is probably an even better analogy.
And I also like to talk about international regulation, in general, like with tax havens.
Agree that climate change is a better analogy.
Disagree that nukes seem easier to coordinate around—there are factors that suggest this (e.g. easier to track who is and isn’t making nukes), but there are factors against as well (the incentives to “beat the other team” don’t seem nearly as strong).
You mean it’s stronger for nukes than for AI? I think I disagree, but it’s a bit nuanced. It seems to me (as someone very ignorant about nukes) like with current nuclear tech you hit diminishing returns pretty fast, but I don’t expect that to be the case for AI.
Also, I’m curious if weaponization of AI is a crux for us.
I’m uncertain about weaponization of AI (and did say “if we ignore military applications” in the OP).
Oops, missed that, sry.