OpenAI already has this in its charter:
We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project. We will work out specifics in case-by-case agreements, but a typical triggering condition might be “a better-than-even chance of success in the next two years.”
This post helped me notice I have incoherent beliefs:
“If MAGMA self-destructs, the other labs would look at it with confusion/pity and keep going. That’s not a plan”
“MAGMA should self-destruct now even if it’s not leading!”
I think I’ve been avoiding thinking about this.
So what do I actually expect?
If OpenAI (currently in the lead) were to say, "our AI did something extremely dangerous, this isn't something we know how to contain, we are shutting down, we are calling on other labs NOT to train above [amount of compute], we are not discussing the algorithm publicly for fear the open source community will do this dangerous thing, and we need the government ASAP", do I expect that to help?
Maybe?
Probably nation states would steal the models + algorithm + slack as quickly as they could, and probably a huge open source movement would protest, but it still seems possible (15%?) that the major important actors would listen, especially if the announcement were accompanied by demos or the like.
What if Anthropic or xAI or DeepSeek (not currently in the lead) were to shut down now?
...I think they would be ignored.
Does that imply I should help advance the capabilities of the lab most likely to act as you suggest?
Does this imply I should become a major player myself, if I can? If so, should I write on my website that I’m open to a coordinated pause?
Should I give up on being a CooperateBot, given the other players have made it so overwhelmingly clear they are happy to defect?
This is painful to think about, and I'm not sure what the right thing to do here is.
Open to ideas from anyone.
Anyway, great post. Thanks!